Last week, Facebook made waves again when its free VPN app, Onavo, was pulled from the iOS App Store for violating Apple’s data-gathering policies. Data collected by the Onavo app allegedly lets Facebook monitor how people use their mobile devices, even when they’re not using the main Facebook app. The irony here is comically stark: a VPN app that is supposed to block harmful websites and redirect users’ web traffic for added privacy turns out to be spyware, watching every web link they click and every app they open, all the while offering nothing of value in return for collecting that data. Rather, the data was collected solely to give Facebook insights into mobile behaviors that could give the company a competitive advantage over its rivals.
The fact that such a dubious app got greenlit in the first place and remained available on the App Store for years is illustrative of a troubling trend in our increasingly digital, algorithm-driven world — the tendency to treat consumers as mere data entry points to be collected, analyzed, and fed back into the marketing machine. It is a symptom of an algorithm-oriented way of thinking that is quickly spreading throughout all fields of natural and social sciences and percolating into every aspect of our everyday life. And it will have an enormous impact on culture and consumer behavior, for which brands will need to be prepared.
A.R.E.A.M: Algorithms Rule Everything Around Me
At its core, an algorithm is a process or set of rules to be followed, usually by a computer, in calculations or other problem-solving operations. It could be as simple as a food recipe that you can follow step by step, although nowadays algorithms are more likely to be as complex as the code behind a machine learning program.
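To make the definition concrete, here is a minimal sketch of an algorithm in code: an explicit, repeatable sequence of steps that a human could just as easily follow by hand, like a recipe.

```python
def find_largest(numbers):
    """Find the largest value in a list by following four explicit steps."""
    largest = numbers[0]      # Step 1: start with the first item
    for n in numbers[1:]:     # Step 2: examine each remaining item
        if n > largest:       # Step 3: if it beats the current best...
            largest = n       #         ...remember it instead
    return largest            # Step 4: report the result

print(find_largest([3, 41, 12, 9, 74, 15]))  # 74
```

Every algorithm, from this toy to a deep neural network, is ultimately such a procedure; the difference is only in scale and in whether a human can still follow the steps.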
Algorithms are great at calculation, data processing, and automated reasoning, which makes them a supremely valuable tool in today’s data-driven world. Everything we do, from eating to sleeping, can now be tracked digitally and generate data, and algorithms are the tools that organize this unstructured data and whip it into shape, preferably into discernible patterns from which actionable insights can be drawn. Without the algorithms, data is just data, and human brains are comparatively ill-equipped to deal with large amounts of it.
Without the algorithms, data is just data.
Thus, the work of processing data is increasingly delegated to algorithms, and with that, we relinquish a crucial part of our decision-making process to digital systems. Today, algorithms rule over domains as varied as fashion, publishing, entertainment, finance, insurance, travel, and online dating. They already influence which Netflix show we watch, decide what news we read on social media, diagnose cancer in under two hours, and can even predict when we are going to die. Before long, algorithms will take over other crucial domains such as transportation, urban planning, healthcare, and education, all of which will have a profound impact on our overall quality of life, for better and worse. There is even a religion that treats A.I. as its God and advocates for algorithms to literally rule the world.
Even the world of science and academia is no exception to the ever-expanding regime of algorithms. In his international best-seller Homo Deus, historian and author Dr. Yuval Noah Harari points out that dataism — an emerging ideology that “declares that the universe consists of data flows, and the value of any phenomenon or entity is determined by its contribution to data processing” — has already become the dominant thinking in the global scientific establishment. Under the influence of dataism, all social sciences are now chasing the algorithmic pattern supposedly hidden within our socio-economic activities, just as all natural sciences are trying to decode the organic algorithms of nature. Everything is but data flows, and algorithms rule our world by organizing and making sense of data for us.
Everything is but data flows, and algorithms rule our world by organizing and making sense of data for us.
The Dangerous Allure of Dataism
Dataism is especially appealing because it is so all-encompassing. With dataism and algorithmic thinking, knowledge across subjects becomes truly interdisciplinary under the conceptual metaphor of “everything as algorithms,” which means lessons from one domain could theoretically be applied to another, thus accelerating scientific and technological advances for the betterment of our world.
Given how efficient and data-driven they are, we increasingly trust algorithms to deliver optimized results, which further enhances the allure of dataism. Such is the promise of saving money on air travel by tracking and predicting plane ticket prices with algorithms designed by Google or Hopper; so is the joy of discovering new music via Spotify’s algorithmically curated playlists, or the engineered serendipity of stumbling upon a product you didn’t know you wanted via targeted ads. Algorithms and machine learning can even save lives when applied to disease diagnosis and biometric tracking, as evidenced by several cases of the Apple Watch alerting users to anomalies in their vital signs.
In a way, the takeover of algorithms can be seen as a natural progression of the quantified-self movement that has been infiltrating our culture for over a decade, as more and more wearable devices and digital services become available to log every little thing we do and turn it into data points to be fed to algorithms, in exchange for better self-knowledge and, perhaps, an easier path toward self-actualization.
The takeover of algorithms is a natural progression from the quantified self movement.
This allure of dataism and algorithmic decisions forms the foundation of the now-clichéd Silicon Valley motto of “making the world a better place.” Algorithm-driven systems typically carry an alluringly utopian promise of delivering objective, optimized results free of human folly and bias. When everything is based on data (and numbers don’t lie, as the proverb goes), everything should come out fair and square. As algorithms take over every domain of our everyday life, non-conscious but highly intelligent algorithms may soon know us better than we know ourselves, luring us into an algorithmic trap that presents the most common-denominator, homogenized experience as the best option for everyone.
More often than not, however, algorithms have failed to deliver on those lofty promises. Multiple revelations of racial and gender discrimination by algorithms in recent months have shone a light on the deeply problematic bias ingrained in the design of some algorithms. Algorithmic bias exists because algorithms are designed by humans, in particular by a specific, often homogeneous demographic of software developers. They reflect the implicit values of the people involved in coding, collecting, selecting, or using the data that trains the algorithm. When data is selected and coded with unchecked bias, the algorithms become biased too. Fortunately, Facebook has formed an AI ethics team to develop tools for preventing bias in its algorithms and data, as has Google, so hopefully that will help us get ahead of the problem.
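The mechanism is easy to see in miniature. Below is an illustrative sketch (with entirely made-up data) of how a naive model that simply learns from historical decisions will faithfully reproduce whatever bias those decisions contained:

```python
# Hypothetical training data: (group, was_approved) pairs reflecting
# past human decisions that systematically favored group "A".
history = [("A", True)] * 80 + [("A", False)] * 20 \
        + [("B", True)] * 40 + [("B", False)] * 60

def learn_approval_rates(records):
    """'Train' by computing the observed approval rate for each group."""
    totals, approved = {}, {}
    for group, ok in records:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

model = learn_approval_rates(history)
print(model)  # {'A': 0.8, 'B': 0.4} -- the historical bias, learned verbatim
```

Real machine learning models are vastly more complex, but the principle is the same: garbage in, garbage out, and bias in, bias out.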
When data is selected and coded with unchecked bias, algorithms become biased too.
The Impact of Algorithmic Decisions
The rapid spread of algorithmic decision-making across domains has profound real-world consequences for our culture and consumer behavior. For example, the Chinese government has started to roll out a “social credit score” system that uses algorithms to track and rate the “trustworthiness” of the country’s 1.4 billion citizens, based on collected data about their online and offline behaviors. If your score dips below a certain threshold, you will be banned from traveling, from getting loans or jobs, or even from staying in hotels, all in the name of preventing crime and maintaining social order. So far, the system has blocked people from taking 11 million flights and 4 million train trips.
While that may sound downright dystopian, the even more troubling fact is that algorithms often work in ways no one fully understands. Since algorithms are usually programmed only to provide an answer based on the data they’ve been fed, we can clearly see the results of their number crunching, but most of the time we have no idea how they arrived at them. The use of algorithms in financial trading, for example, is also called black-box trading for a reason. When you stack various incredibly complex algorithms atop one another, the automated trading system as a whole develops a digital brain of its own, conducting calculations at a speed and complexity that far exceed human comprehension. Similarly, few people know exactly how the social credit score in China is calculated. How the algorithms evaluate and weight various online and offline behaviors to formulate the score is kept opaque, although political and ideological agendas are presumably major factors in its design.
Those characteristics of unknowability and, sometimes, intentional opacity also point to a simple yet crucial fact about our increasingly algorithmic world: whoever designs and owns the algorithms controls how data is interpreted and presented, often in self-serving ways. 23andMe, a genetic testing company that has sold at-home DNA testing kits to over 5 million people, recently announced it is shutting down its API and will no longer allow third-party app developers to access the anonymized raw genomic data it collects. Instead, developers will only be able to use reports generated by the company for their apps and services. While this change is officially implemented in the name of protecting customers’ data privacy, it also concentrates the power of data interpretation and evaluation in the algorithms that 23andMe designs.
Whoever designs and owns the algorithms controls how data is interpreted and presented.
In reaction to that unknowability, humans often start to behave in rather unpredictable ways, which leads to unintended consequences. Take the world of romance novel publishing, for example. The algorithms Amazon uses to regulate its Kindle Unlimited ranking system have completely transformed how romance novels are written and marketed today. Long books rank higher in search results, so romance novels have become bloated with appendixes and other bundled materials. A certain keyword or theme catches on and becomes “trending,” and suddenly everyone is coming out with a title built around it. Inappropriate, violent material may get inexplicably promoted by opaque algorithms with little human interference, and unethical marketing and algorithmic tricks run rampant, similar to how some content creators game YouTube’s algorithms to show inappropriate content to kids. Meanwhile, authors alternately collude with and sabotage one another to game Amazon’s algorithm, which changes precariously and without warning as Amazon sees fit.
Of course, Amazon’s algorithms have had an even bigger impact on how we shop online. Every logged-in user of Amazon.com sees a customized homepage filled with products handpicked by Amazon’s algorithms based on what you bought, what you clicked on, and what other people who bought what you bought also purchased. Algorithms already reign supreme in product discovery, but now they are starting to further erode shoppers’ free will by automating purchases. Last week, online wholesaler Boxed announced a new autonomous shopping concept called Concierge, which relies on machine learning-based predictive analysis to preemptively fulfill a re-order of an item when the customer is expected to be running low. No customer engagement is necessary unless you put a stop to the automated process. Similar auto-replenishment initiatives in development include Target Fetch and Amazon’s Dash Replenishment Service.
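The “people who bought what you bought also purchased” logic can be sketched with simple item co-occurrence counting. This is a toy illustration with invented orders, not Amazon’s actual system, which is far more sophisticated:

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories: each set is one customer's order.
orders = [
    {"coffee", "filters", "mug"},
    {"coffee", "filters"},
    {"coffee", "mug"},
    {"tea", "mug"},
]

# Count how often each pair of items appears in the same order.
co_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, k=2):
    """Return the items most often bought alongside `item`, best first."""
    scores = {b: n for (a, b), n in co_counts.items() if a == item}
    return [b for b, _ in sorted(scores.items(), key=lambda kv: -kv[1])][:k]

print(recommend("coffee"))  # filters and mug, the frequent co-purchases
```

Scaled up to hundreds of millions of orders and layered with click data and personalization, this basic idea is what fills that customized homepage.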
Ultimately, the most profound impact of the spread of dataism and algorithmic decision-making is also the most obvious one: it is starting to deprive us of our own agency, of the chance to make our own choices and forge our own narratives. Take the simple act of online search, for example. At the moment, Google serves up pages of hyperlinks ranked by its algorithms for relevance to our input. We then glance through the links to decide which webpage we want to click on and learn more from. But with the rapid adoption of smart speakers with voice assistants, more and more searches are conducted via voice. And as with natural conversation, voice search typically returns only one answer at a time, drastically limiting the search results to the top ones deemed worthy by algorithms. The more we trust algorithms and their interpretation of the data collected on us, the less likely we are to question the decisions they automate on our behalf.
Algorithmic decision-making is starting to deprive us of our own agency.
Survival Tips for Brand Marketers
Some worry that when algorithms get to call the shots for consumers, brands will no longer matter, since brand marketing is geared toward consumers, not algorithms. And if brands don’t prepare aptly for the impact of algorithms, the platform owners will surely reap the benefits as designers and owners of those algorithms. In an increasingly algorithmic world, how can a brand continue to influence consumer decisions and maintain its customer relationships?
First of all, it is important to acknowledge that algorithms are a double-edged sword. Yes, brands have a lot to gain when they embrace algorithms and take full advantage of their data-crunching efficiency, and they should actively integrate algorithms into their data analytics and decision-making processes, to a certain extent. If your brand ever gets to design an algorithm for its own use, that algorithm should be designed with a consumer-first value in mind. When dealing with algorithms, it is important to treat your customers as real people, not just entries in your database at the mercy of potentially biased algorithms. Algorithms should be applied to enhance the customer experience, which is always the key to maintaining your customer relationships.
Algorithms should be designed with a consumer-first value in mind.
Take ecommerce, for example. As Ariel Ezrachi and Maurice Stucke point out in their book Virtual Competition, the application of algorithms in online shopping has offered consumers and vendors many benefits and changed the nature of market competition. Vendors can now employ algorithms to monitor competitors’ prices 24/7 and adjust their own prices accordingly. But they can also track and profile consumers to get them to make a purchase at the highest price they are willing to pay, as calculated by algorithms. The first case promotes price transparency and benefits shoppers, while the second clearly maximizes profits with no regard for consumers’ best interests.
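The contrast between those two repricing strategies can be sketched in a few lines. Both rules and all the numbers below are hypothetical simplifications, not any vendor’s actual pricing logic:

```python
def match_competitors(competitor_prices, floor):
    """Transparency-friendly rule: undercut the cheapest rival by a cent,
    but never price below cost."""
    return max(min(competitor_prices) - 0.01, floor)

def personalized_price(base_price, willingness_to_pay):
    """Profiling rule: charge each shopper close to their estimated
    maximum, never below the base price."""
    return max(base_price, 0.95 * willingness_to_pay)

# The first rule drives prices down toward the market's best offer...
print(match_competitors([19.99, 18.50, 21.00], floor=15.00))
# ...while the second quietly drives them up toward each shopper's limit.
print(personalized_price(18.49, willingness_to_pay=30.00))
```

Same data-crunching machinery, opposite effects on the consumer, which is exactly why the values behind an algorithm matter as much as the algorithm itself.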
Circling back to the case of Facebook’s VPN app, it is clear that the underlying issue with collecting mobile data stems from a lack of core values on Facebook’s part. Time and again, Facebook’s choices have revealed that it considers connecting people a de facto good in nearly all cases. For far too long, Facebook has harbored a blind optimism about this grand mission of digitizing our social relationships, divorced from the harsh realities of human society. It is the kind of blind-spotting that happens when algorithmic thinking takes over and everything gets distilled into mere data.
Second, smart brand marketers know how to respect the algorithms and when to challenge their results. Treat algorithms as a useful tool for processing data and producing valuable insights, but beware of letting them determine your brand narrative. While there is good reason to trust the results of algorithms most of the time, it is important to remember that they are far from infallible. Fine-tune the algorithms you use to keep up with shifting consumer behaviors and factor in new market forces. After all, algorithms are only as good as the data we feed them. Respect the algorithms, but don’t blindly follow them.
Respect the algorithms, but don’t blindly follow them.
For example, Hollywood has long used algorithms to predict the box office performance of a prospective release based on data from test screenings and the historical performance of similar movies. This summer, the marketing team at the Warner Bros. studio pulled off a higher-than-expected (as in, higher than the algorithms projected) opening weekend for its shark flick The Meg. The team wisely rejected the initial $20 million opening-weekend projection, which positioned the movie as a generic summer action flick, and instead managed to sell it as a rare horror-comedy to the tune of a $45 million opening weekend.
Lastly, it is crucial to bring a human element back into your decision-making. Algorithms can do a lot of things, but they can’t set your brand narrative for you. Thankfully, the ability to forge a cohesive, meaningful narrative out of chaos is still a distinct part of human creativity that no algorithm today can successfully imitate. Algorithms may be able to write summaries of news articles, but one look at an algorithm-generated script reveals their shortcomings.
Algorithms can do a lot of things, but they can’t set your brand narrative for you.
While the owners of algorithms get to control how data is interpreted and presented, it is up to human marketers to decide how those data-driven insights fit into the brand narrative, which is the driving force behind all successful branding. Brands should therefore maintain a human touch amid the takeover of algorithmic decision-making. That branding will be the crucial point of differentiation that connects with consumers and drives them to ask for your brand by name, bypassing the algorithmic recommendations and automated decisions altogether.
Algorithmic Marketing vs. The Art of Storytelling
Five years ago, everyone in the marketing world was abuzz about the transformative potential of big data, which promised to deliver sharper insights into consumer behavior and enable data-driven decision-making. A lot of companies jumped on the bandwagon and started chasing first-party and third-party data from retail outlets, social media, and ad networks to paint a portrait of consumers’ online and offline activities. Big data analytics made it possible for brands to discern overall trends in their sales and campaign ROI down to each regional market, and to compare markets against each other for further insights. It made marketing a numbers game.
Fast forward to today, and hardly anyone in the industry uses the term “big data” anymore. Like every other buzzword, it has gone out of style. Look closer, however, and you will see that its spirit lives on in the talk of artificial intelligence, the hot topic du jour. When marketers talk about applying A.I. today, most of the time they are talking about using machine learning-enabled tools to analyze big data and gain insights. After all, raw, unstructured data means nothing on its own, and its exponentially growing quantity in the digital age puts it forever beyond the processing capacity of human brains. That’s where the algorithms come in, rebranded as analytics tools powered by machine learning and rudimentary A.I.
Even as many aspects of marketing are transformed by algorithms and hard numbers, brand marketing remains primarily an art form, one that leverages creative storytelling and myth-making to create affinity and demand. Successful storytelling is all about appreciating the nuances of human behavior, and those nuances are usually the first thing algorithms filter out. That’s why algorithm-driven branding lacks distinct identities and narratives.
We all love a good story, but we each interpret the same story subjectively, and therefore differently. Algorithms, by contrast, tend to present their results as the only viable choice, the only true narrative based on data. Give that agency of choice and interpretation back to your customers; that is how brands can ultimately win consumers back from the grip of algorithms.