You’re Not Getting What You Paid for.

Cameron Lutz
Published in The Startup
19 min read · Feb 27, 2021

How data and deep learning are transforming products and services.

Source: Refik Anadol

In God we trust, all others bring data. -William Edwards Deming

Big Data promises tremendous transformations in education, healthcare, climatology, insurance, transportation, and the list goes on. Anxiety that emerging artificial intelligence will replace your role in society is a sentiment shared even among leading AI experts. According to a 2019 Brookings Institution report, about 25 percent of US jobs (36 million) will face high exposure to automation within the decade. But there is reason to be optimistic in this strange new world. We are drowning in data, but thirsting for wisdom.

Examples are drawn from:

(1) The Ends Game: How Smart Companies Stop Selling Products and Start Delivering Value by Marco Bertini, Marketing Professor at ESADE Business School in Barcelona, and Oded Koenigsberg, Deputy Dean at London Business School.

(2) Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again by world-renowned cardiologist and Executive Vice President of Scripps Research, Eric Topol.

Section I | What you’ve come to expect and how we got here.

The first industrial revolution saw the widespread mechanization of the production of goods. Henry Ford exemplified it best: the automobile, once a niche product for the wealthy, became one of ubiquity, with a price "so low that no man will be unable to own one."

With the means of production squared away, companies focused on distribution. A go-to-market strategy once meant physically walking to a market where buyers might be willing to buy what you made. Founded in 1893, Sears initially sold one category of product: watches. Despite not opening a brick-and-mortar presence until 1925, Sears boasted around the turn of the century: "We are able by reason of our enormous output of goods to make contracts with representative manufacturers and importers for such large quantities of merchandise that we can secure the lowest possible prices." Scale was how Sears taught Americans how to shop.

Decades of steadfast production and distribution efficiency led organizations to another focus: customer attention. In an effort to nurture demand in a crowded marketplace, mass advertising was born. This evolution coincided with technological innovations like the phonograph, film, and radio that saw the standardization of information. Advertising is a piece of the budgetary pie that companies prefer not to waste; the early direct mail companies knew that guesswork was fatally expensive. Peter Drucker once offered the example: "True marketing starts out the way Sears starts out — with the customer, his demographics, his realities, his needs, his values. [Sears] does not ask, 'What do we want to sell?' It asks, 'What does the customer want to buy?' It says, 'These are the satisfactions the customer looks for, values, and needs.'"

Then came the innocuous invention of the price tag, replacing the standard practice of haggling. This presented the challenge that spawned modern pricing theory: what percentage of the customer population is willing to buy this product at this particular price?
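This pricing question can be made concrete with a toy calculation. Assuming a hypothetical list of each customer's maximum willingness to pay (the function and data below are illustrative, not drawn from either book), the revenue-maximizing posted price trades higher margins against the share of customers retained:

```python
def best_price(willingness_to_pay):
    """Return the posted price that maximizes revenue, given each
    customer's maximum willingness to pay (a toy demand model)."""
    candidates = sorted(set(willingness_to_pay))
    # Revenue at price p = p * (number of customers whose WTP >= p)
    return max(candidates, key=lambda p: p * sum(w >= p for w in willingness_to_pay))

# Three customers willing to pay $10, $20, and $30: pricing at $20
# keeps two buyers and earns $40, beating both $10 ($30) and $30 ($30).
print(best_price([10, 20, 30]))  # 20
```

The price tag forces one number onto a heterogeneous crowd, which is exactly the tension modern pricing theory tries to manage.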

Just as the price of a product can be a poor proxy for its value, a similar concern arises in the realm of medicine. Despite spending more on healthcare per capita than any other country in the world, our lackluster outcomes demand explanation.

Shallow Medicine

Patients exist in a world of insufficient data, insufficient time, insufficient context, and insufficient presence. Or, as Eric Topol puts it, a world of shallow medicine.

A review of three very large studies concluded that there are about 12 million misdiagnoses in America each year. These are attributable to a number of factors: failing to order the correct test, misinterpreting the test that was performed, generating a poor differential diagnosis, and/or missing an abnormal finding. Misdiagnosis is bad enough, but worse when it leads to mistreatment: up to one-third of medical operations are performed unnecessarily.

This should come as no surprise. The average length of a clinic visit in the United States is seven minutes; twelve minutes for new patients. What's more, the emergence of electronic health records has limited eye contact between patient and doctor. Attending to the keyboard, instead of the patient, is cited as the main cause of burnout and depression. Worse, these health records often contain incomplete or inaccurate information, and moving them between health systems is often challenging due to proprietary software and file formats meant to discourage churn.

The United States, when compared to the rest of the OECD, performs "distinctly worse" in longevity, infant and childhood mortality, and maternal mortality, which is alarmingly high, especially among Black women. All while we lead the world in health expenditure per capita. A prime example of American Exceptionalism.

A 2015 National Academy of Sciences report likewise found more than 12 million serious diagnostic errors in the United States alone. Even the most diligent doctors cannot be familiar with every illness and drug, every article in every medical journal. Doctors are sometimes sick, hungry, or tired, all of which affects their judgment and yields errors in their diagnoses.

Understanding the biases that our healthcare providers are susceptible to can inform our service-level interactions and push for better outcomes; three are worth discussing. Making a diagnosis based on the fraction of possible realities that are mentally "available" to your doctor at any given moment is called availability bias. Another common bias among physicians is overconfidence. Overconfidence in the various heuristics and expert opinions that clinicians use to deal with uncertainty in their diagnoses is sometimes referred to as eminence-based medicine. The last type of bias is one we are all familiar with: confirmation bias, "seek and you shall find," the tendency to embrace information that confirms your beliefs and eschew that which contradicts them. Computer augmentation is something we should consider a good thing, a filter for our subjective human errors.

At the end of the day, computers are built by humans, and software does not escape the inheritance of our biases and prejudices. AI can become a modern vehicle for systematic oppression: algorithms are not only trained on historical datasets (as if the past were something to replicate), they then use these inequality-ridden datasets to make predictions about the future: who is likely to default on a loan, who is creditworthy, who is qualified for a mortgage. Institutions that presently discriminate against racial minorities could adopt these technologies and unknowingly exacerbate the stratification of our society. Yet this also presents a real opportunity to right our wrongs on a scale never before seen in modern history.

Bias in medical research is no exception. Minorities are frequently underrepresented in studies and sometimes omitted entirely. This is a real problem in genomics research for two reasons: (1) people of European ancestry compose most or all of the subjects in large cohort studies, and (2) much of the genomics of disease and health is ancestry-specific, so the findings are of limited value to everyone else. But if we get this right, we stand to greatly improve outcomes for which our children will be grateful.

Section II | How data can help you get your money’s worth.

People don’t want to buy a quarter-inch drill, they want a quarter-inch hole.

There is plenty of waste in the economy, and healthcare is no exception. That which gets measured gets managed. There are three types of waste in commerce: access waste, consumption waste, and performance waste. Here, we dive into the causes of waste and how we can leverage information pipelines for a leaner economy.

Access waste is equivalent to saying, “Customers can’t get it” because of some physical or financial constraint. Financial constraints can occur when an individual cannot afford a single large expense or many smaller ones, or a firm lacks the necessary capital for a piece of equipment. Physical access waste can occur when reaching a specific solution is effortful, time-consuming, or just plain inconvenient.

Stockouts are a form of physical access waste we are all familiar with; we run out of a particular product just when we need it most, thanks to inadequate planning. Gillette seemed to enjoy its dominant market share under the deceptively simple model for its razor-and-blades business: "add features and raise prices." This robust revenue stream was thanks to a feature of old commerce: shelf space was the only way to gain exposure to the customer. A 2013 Financial Times report found that since 1990, the price of Gillette blade cartridges, adjusted for inflation, rose 236 percent, about 5.5 percent annually, over twenty-three years. Disgruntled consumers were relieved to find Dollar Shave Club offering an effortless subscription model that solved both forms of access waste: a few dollars a month to have replacement razor blades shipped regularly to their door. Consumers found that avoiding stockouts and price gouging was, in fact, "the best a man could get." This direct-to-consumer model generated valuable insights like frequency of consumption and perceived quality of shave, so much so that Unilever acquired Dollar Shave Club for $1 billion for its "unique consumer and data insights."
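As a quick arithmetic check (mine, not from the FT report), the two quoted figures are consistent with compound growth: 5.5 percent compounded over twenty-three years yields a cumulative rise in the neighborhood of the cited 236 percent.

```python
# Compound-growth sanity check: ~5.5% annual price increases, 1990-2013.
annual_rate = 0.055
years = 23
cumulative_rise = (1 + annual_rate) ** years - 1
print(f"{cumulative_rise:.0%}")  # ~243%, in line with the cited ~236% rise
```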

Another form of physical access waste is the unwanted accumulation of idle assets, such a common problem that self-storage is now a $38 billion industry. Cocktail dresses and evening gowns "sit idle" even more often than cars do. Despite serving a valuable purpose, many are expensive and rarely worn. Rent the Runway's value proposition lets women select clothes online and opt for a one-time rental or a $69/month membership to receive up to four designer pieces at once, with free insurance, dry cleaning, and shipping.

Financial access waste occurs when the revenue model prevents customers from achieving the variety of consumption they desire. Recorded music is an industry that encountered this firsthand. Building a physical music library prior to the digital age could become prohibitively expensive, especially when you were required to purchase an entire album for the one song you would actually listen to. Listening to any song you want, wherever and whenever you want, was once impossible no matter how deep your pockets. The emergence of Napster, iTunes, and Apple Music reflects the market adjusting to evolving technologies in an effort to reduce physical and financial access waste.

Financial constraints can lead to forgoing a purchase, trading down on quality, or finding a workaround when customers worry a particular offering would be a misallocation of their resources. Automakers, for example, could improve access to their products by lowering the purchase price or offering promotions. This would work in the short term but prove unsustainable. General Motors briefly offered the Book by Cadillac subscription: for $1,500 per month, members could step in and out of any of ten models up to eighteen times a year without additional charges. Temporary "ownership" allows users to swap cars as their personal needs change.

The common denominator in these examples of reducing access waste is trading traditional ownership of a product for access on a periodic basis, otherwise known as Everything as a Service (XaaS). Organizations can now lower the barrier to entry into a market by turning almost any good into a "service" with enhanced convenience.

Even if we solve access waste, there is still consumption waste. Consumption waste is equivalent to saying, "Customers don't or can't use it." This can happen when a consumer is required to buy a larger quantity than they actually need, or when a single asset goes underutilized. There are three main methods of combating consumption waste: unbundling, metering, and sharing.

Unbundling: Have you ever purchased a premium cable package and only ever cycled through the same five channels? According to a 2019 Consumer Reports analysis, the median cable bundle in the US cost $173 per month. This type of consumption waste occurs when consumers are forced to buy a "quantity" that does not correspond to their actual needs. The mass migration from television to streaming was the result of consumers aligning their spending with the programming that best suits their entertainment needs. When a section of industry transitions from selling physical goods to digital ones, like the music or newspaper industry, constraints change, and so do the economics. As Marc Andreessen put it, "Bundles emerge as a consequence of the current technology."

Metering: High technology is not the only industry shifting toward efficiency; tried-and-true tire giant Michelin made an offering under its commercial fleet management unit, Michelin Solutions. They bring the tires; you pay by the mile driven. Transitioning to a service provider unlocks insights into consumption behaviors that cannot be found in a single exchange of money for rubber. A report from the World Economic Forum says that Michelin's shift from "selling tires as a product to a service" has helped the company "achieve higher customer satisfaction, increased loyalty and raised EBITDA margins."

Sharing: Collaborative consumption, aka sharing, can be applied to any market where a significant number of customers own things they aren't fully using and an equally significant number need that exact product or service but cannot afford to purchase it. Airbnb and Uber have become shorthand for the sharing economy, exemplifying how idle assets can be made productive and how value can be created by generating network effects between asset owners and asset users.

Lastly, there is performance waste: when the product or service doesn’t deliver the value that is expected from it. These pay-for-performance models are like the traditional ownership model, except with a money-back guarantee.

Teatreneu, a popular comedy theatre in Barcelona, Spain, sought ways to stay afloat in 2013 when the tax rate on tickets skyrocketed from 8 percent to 21 percent. Comedy shows are ticket-sellers, not necessarily laugh-sellers. Realizing this, the theatre installed facial-recognition hardware in front of each seat to find out what exactly was so funny. Each laugh was priced at 30 euro cents, and Teatreneu set the maximum charge at 24 euros, equivalent to 80 laughs. Those who had no fun at all paid nothing, while the tear-wellers got their money's worth. Albeit a rough proxy for entertainment (penny-pinchers have been seen suppressing laughter), it aligns the goals of both the venue and the showgoers by tethering the financial feasibility of a theatre to the quality of the shows it premieres.
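The theatre's billing rule is simple enough to sketch directly. The per-laugh price and cap are as described above; the function name and the use of integer cents are my own:

```python
def pay_per_laugh_cents(laughs: int) -> int:
    """Charge 30 euro cents per detected laugh, capped at 24 euros.
    Amounts are kept in integer cents to avoid floating-point rounding."""
    PRICE_PER_LAUGH = 30  # euro cents
    CAP = 2400            # 24 euros, i.e. the 80th laugh is free
    return min(laughs * PRICE_PER_LAUGH, CAP)

print(pay_per_laugh_cents(0))    # no fun, no charge
print(pay_per_laugh_cents(50))   # 1500 cents, i.e. 15 euros
print(pay_per_laugh_cents(120))  # capped at 2400 cents
```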

These value-based schemas are not exclusive to tech-enabled businesses. The Australian company Orica is the largest supplier of commercial explosives and blasting systems to mining operations in the world. In an outcome-based approach, Orica earns revenue based on the quality of "broken rock" it delivers. The size of the broken rock is what mining companies are really paying for: smaller rocks are easier and cheaper to dispose of, and 80 percent of total mine processing costs are tied to poorly controlled blast outcomes.

These inefficiencies are pervasive throughout commerce, but breakthrough technology is now enabling solutions by way of impact data. Impact data sheds light on when and how customers consume products and services, and how well these offerings actually perform. In other words, impact data helps align incentives.

Impact data also enables transparency, allowing organizations unfettered access to changes in behavioral patterns so that they may improve exchanges by adopting the revenue model that is most efficient.

This efficiency comes with a compromise in privacy, likely the most important priority for consumers this decade, and it presents an obligation previously non-existent in the customer relationship. Customers can demand accountability so that their data is only used in their individual interests; companies can, in turn, demand that customers use the product or service in a way that achieves the best outcome. Knowing they can make in-depth comparisons more easily than ever before, customers will naturally gravitate toward sellers that adopt a revenue model best aligned with the value they derive.

Commerce is no longer about many customers' needs, wants, and actions; it is about one customer's needs, wants, and actions, determined using the aggregation of impact data. Context is absolutely critical. Conditions like the weather, time of day, location, or even a customer's state of mind can have a positive or negative impact on access, consumption, and performance.

The market has methods of crushing underperformers, those which cling to the traditional ownership model of yesteryear. Healthcare has no such mechanism, but as you'll soon see, data is being leveraged to improve the ways we take care of each other.

Computers can offer a second opinion. Crowdsourcing medical expertise, as Medscape Consult does for its growing online community of 37,000 physicians, can mitigate the effects of availability and confirmation bias. Beyond crowdsourcing, what if a computer were the crowd? The ability to interpret and contribute to first-rate medical diagnosis is the goal for many current AI researchers.

Natural language processing (NLP) is a computer's attempt to "understand" what is written by humans, a semantic understanding rather than a literal translation. This tool was the centerpiece of IBM Watson's ambitious 2013 campaign to improve medical diagnoses, with an advertisement featuring a doctor claiming, "I can read 5,000 new medical studies a day and still see patients." IBM spent millions of dollars working with leading medical centers to devour patient data, medical images, patient histories, biomedical literature, and billing records. But teaching a machine to read a medical record turned out to be much harder than anyone thought, thanks to unstructured data, acronyms, shorthand phrases, differing writing styles, and human errors. After billions of dollars spent on record access and company acquisitions, the project delivered little value and was never used. This does not mean we do not need machine-assisted diagnosis; it means we have a ways to go. Marketers beware. In machine vision, progress is more salient.

The average radiologist in the United States makes as much as the President: $400,000/year. This compensation comes with a caveat of accountability: 31 percent of American radiologists have experienced a malpractice claim related to a misdiagnosis, while elected representatives face no such legal repercussions for misrepresenting the needs of their constituents, except recall.

The algorithmic review of medical images, a new field of research referred to as radiomics, mitigates rote pattern matching in an industry that demands logical analysis. The problem, again, comes down to scale: an efficient radiologist can evaluate around 20,000 films annually, compared to the billions a computer can process.

Convolutional neural networks (CNNs) are machine learning algorithms that take an input image and, by continually adjusting model parameters, learn to discern the important features and aspects that identify objects. This computational method works similarly to how our own brains process visual stimuli.
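The core operation inside a CNN layer can be sketched in a few lines: slide a small kernel across the image and record how strongly each window matches it. In a trained network the kernel values are learned; here, purely to illustrate the mechanism, we hand-pick a vertical-edge detector and apply it to a tiny synthetic "image":

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation: the sliding-window operation a
    CNN layer applies (with learned kernels) to pick out local features."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel responds where intensity changes left to right,
# e.g. at a boundary between dark and bright tissue in a scan.
edge_kernel = np.array([[1, 0, -1],
                        [1, 0, -1],
                        [1, 0, -1]])
image = np.array([[0, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1]])
# Response is 0 over the flat region and -3 wherever the edge
# falls inside the 3x3 window.
print(conv2d(image, edge_kernel))
```

Stacking many such learned filters, interleaved with nonlinearities and pooling, is what lets a network progress from edges to textures to whole structures like lung nodules.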

Harvard and Massachusetts General Hospital applied machine learning to the mammography images of over a thousand patients and, when coupled with biopsy results indicating high cancer risk, found that more than 30 percent of breast surgeries could be avoided.

The University of Tokyo developed a CNN for classifying liver masses in CT scans; trained on 460 patients, it achieved 84 percent diagnostic accuracy. Computer scientists at Seoul National University developed a deep learning algorithm using 43,000 chest X-rays to search for cancerous lung nodules.

A significant reduction in costs for image-processing and comparable performance to expert radiologists may suggest that AI is not just coming for blue collar jobs, but also for the most coveted positions in medicine.

AI also has profound implications for improving mental health outcomes, especially given that many people prefer to share their deepest secrets with machines rather than with other humans. Using an array of biometric data, "digital phenotyping" is a method of digitizing state of mind.

Digital phenotyping of mental state.

While imperfect, this unlocks the possibility of early detection of depression and/or other serious mental illnesses and getting people the help they need.

Deep learning is enabling an "automation of science." The most relevant domains of application are genomics and drug discovery. Despite the mapping of the entire human genome in 2003, all 6 billion letters of A, C, G, and T, some 98.5 percent of that genetic data does not code for proteins, so deriving actionable insight about gene expression remains elusive. One early initiative, DeepSEA, published by Jian Zhou and Olga Troyanskaya at Princeton University in 2015, was an effort to understand how DNA sequences interact with chromatin. Chromatin is made of large molecules that help pack DNA for storage and unravel it for transcription into RNA, which then gets translated into proteins; think of it as an envelope that contains genetic information. This breakthrough in deep learning helps us understand the regulatory role of a given DNA sequence.
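A sequence model like DeepSEA does not read letters directly; the DNA string is first one-hot encoded into a numeric matrix that convolutional layers can scan like a one-dimensional "image." A minimal sketch of that encoding step (the helper name is my own):

```python
import numpy as np

def one_hot_dna(seq):
    """Encode a DNA string as a 4 x L matrix (rows: A, C, G, T),
    the standard input format for convolutional sequence models."""
    rows = {"A": 0, "C": 1, "G": 2, "T": 3}
    encoded = np.zeros((4, len(seq)))
    for position, base in enumerate(seq.upper()):
        encoded[rows[base], position] = 1.0
    return encoded

print(one_hot_dna("ACGT"))  # each column lights up exactly one row
```

From there, learned filters slide along the length axis, picking up short sequence motifs the way an image CNN picks up edges.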

Drug discovery is yet another application in which computers can digest datasets of chemical compounds and decide what to synthesize based on their potential for favorable interactions with a particular disease. Algorithmic screening of molecules based on their known structures and single-step organic chemical reactions has enabled researchers and startups alike to narrow a plethora of potential compounds down to a few feasible candidates. This can vastly reduce the enormous expense incurred in bringing a novel drug therapy to market.

“The drug we all take multiple times a day that needs to be personalized most is food.” -Lisa Pettigrew

The idea that foods can be recommended at the scale of government-issued food pyramids is physiologically implausible, an implausibility that can be traced in part to corruption in the food industry. The Sugar Association has been commissioning research for decades to suggest that sugar is not connected with obesity, echoing the claim that a calorie is a calorie, regardless of the source. The heterogeneity of our metabolisms, microbiomes, and environments led researchers at the Weizmann Institute of Science in Israel to conclude that each individual reacts differently to the same amount of the same foods. The emerging field of nutrigenomics is far from using our DNA to produce personalized diets, but it offers an earnest first step: predicting glycemic responses. In a study of eight hundred non-diabetic individuals, blood glucose levels were carefully monitored with an under-the-skin sensor after standardized meals (Segal et al., 2015). Unique information was gathered on meal times, food and beverage content, physical activity, height and weight, gut microbiome, and blood tests. The findings concluded that the gut microbiome, not carb or fiber intake, was the primary driver of glycemic response.

Gut microbiomes, roughly 40 trillion cells comprising thousands of different species, play a significant role in our understanding of food intake. While glycemic spikes are not an end-all metric, knowing that you have a larger-than-average blood glucose spike after eating could indicate a high risk for diabetes. High glucose has been directly linked to the permeability of the gut lining, increasing the risk of infection and cancer.

Valuable applications of these data-intense technologies are all well and good, but what kind of consequences are to result from these transformations?

Section III | Implications

The whole idea of an "Ends Game" assumes companies know to what end they are headed. Outcomes that align interests, and are thus reliable foundations upon which to build a revenue model, must satisfy four preconditions. They must be: (1) meaningful to the consumer, (2) measurable using parameters well understood by both firms and consumers, (3) robust, in the sense that the metrics are a faithful representation of the underlying interest of the business, and (4) reliable, so as to prevent consumers or third parties from faking performance (think: a step-counter strapped to a metronome).

When a company is so proud of the "quality" products it brings to market that operations focus on maximizing the revenue of its offerings rather than maximizing their value to customers (think big pharma and the opioid epidemic), it has fallen into the quality paradox. A company can become so keenly focused on improving the measurement of an underlying construct of interest that, eventually, the measurement replaces the construct altogether; this is when "surrogation" occurs. It happens when an organization known for innovation begins to worship its products and starts to suffer from tunnel vision. Not to mention, innovating on a revenue model is expensive; each dollar spent on R&D is a dollar not spent servicing debts or repaying shareholders. The traditional ownership model is "safe" and conveniently offers no guarantee of performance.

As previously mentioned, impact data is what allows lean commerce to take place. Companies use three classes of data that descend the depths of the customer journey: (1) data on consumer needs and wants, (2) the steps taken by the consumer to arrive at a particular solution, and (3) deeply personal insight into behaviors, patterns, and tendencies consumers would prefer to remain hidden. Transparent privacy policies are necessary for efficient and trustworthy transactions to take place. If data is properly utilized, and if both the consumer and the firm are honest in their approach, tomorrow's products can benefit all those who seek solutions, regardless of their means.

Big data in medicine also reveals healthcare inefficiencies. What can we look forward to as medical machines become more and more capable?

A consequence of AI augmentation is the gift of time, the lack of which contributes to burnout and medical errors, which lead to more burnout. A 2018 paper from the National Bureau of Economic Research found that patients recently discharged from acute care have reduced readmission rates for every extra minute spent with a care provider. The Time Bank project at Stanford University's medical school rewards doctors with vouchers for time-saving services, like meal delivery or house cleaning, when they engage in underappreciated work like mentoring, serving on committees, or filling in for colleagues. Natural language processing can remove the computer from the exam room, a measure found to reduce physician burnout from 53 to 13 percent while improving patient outcomes; it is a mission shared by Fabio Schmidberger, founder of voize, a virtual medical documentation company.

As our computers become increasingly better at narrow task performance, it would be appropriate for us to consider what differentiates us from machines: empathy. A systematic review of the effect of a doctor’s ability to empathize found a positive relationship with improvements in clinical outcomes, patient satisfaction, adherence to recommendations and prescriptions, and the reduction of anxiety and stress. To bear witness to the suffering of another, to heal and not just diagnose, is preferential to endless time spent on administrative tasks.

As Stanford University professor, Abraham Verghese, puts it, “Being present is essential to the well-being of both patients and caregivers, and it is fundamental to establishing trust in all human interactions.” Instead of listening, doctors interrupt. It takes, on average, eighteen seconds after the start of an encounter for them to interject. Cutting to the chase ruins a perfectly good opportunity to earn the trust of those they’re meant to care for. Assuming doctors can allow room for patients to reveal their ailments in more detail, another tool is vital for presence: the medical gaze. Careful and detailed observation can restore human-to-human connection.

Another ritual of establishing intimacy is the physical examination. Disrobing for an examination is a sacred threshold of trust that cements a patient-doctor relationship. Beyond this, the ability to deliver bad news and initiate the healing process is something only an empathic and reassuring doctor can do; never to be delegated to an algorithm.

A profession that once demanded supreme memorization skills, now outsources knowledge to computers and restores the ability to develop a deep relationship with patients; to witness and alleviate their suffering. Medical schools should consider adding deep empathy to their curriculum alongside teaching for technical proficiency.

Presently, the severe disconnect between firms and customers, and between patients and doctors, is but a matter of tools for leveraging data. Whether in products or services, we've taken a journey along the bleeding edge of the Big Data Revolution to discourage the Luddite pessimism associated with artificial intelligence. I hope you can imagine a world in which we harness intuitions about our artifacts and institutions in unrelenting service of the outcomes we deserve. I hope you move forward with the courage to build the future you wish to see in the world, for the tools have never been so powerful.
