Tech’s Inevitable Journey Towards Human-Centered Design

Hessie Jones
beacontrustnetwork
12 min read · Sep 7, 2021

Technology is moving at a speed business struggles to match, rendering established norms obsolete in the process. The imminence of AI is creating urgency around policy to mitigate existing and future harms, and yet business is slow to adapt. When the biggest companies that collect, aggregate, and contextualize data fail to remedy or change the way they operate, more vulnerabilities are unleashed. This is my journey from the customer-centric mindset I lived with for most of my adult career toward data privacy and a more human-centered one.

These days, I am a writer and strategist. I’m also a privacy and tech ethicist.

I graduated from university with a business degree. I loved the world of marketing and advertising. That was my calling. For more than 20 years, my job as a marketer was to find the right customers for the products and services a company makes. I’ve sold everything from migraine medication, tacos, pizza, and credit cards to media properties, platforms, and behavioral AI technology. Each journey to find the right customer began with data.

EVERYTHING BEGINS WITH DATA

The goal of business is to sell stuff. The more you buy, the more the company tries to sell you. The more you refer other people, the more the company rewards you. This perpetual cycle continues to find more people like you. So over time, data has become an increasingly valuable asset. Today, it’s everywhere. The more people put themselves online — their conversations, their behaviors, instrumenting their bodies, their cars, their homes — the more people like me love it!

As of August 2020:

3.96 billion people use social media, roughly half (51%) of the global population. On average, global internet users spend 144 minutes on social media sites every day. People interact (swipe, type, click, tap) with their phones a whopping 2,617 times each day, which equates to about 150 mobile-device sessions.

For marketers, it starts out as a guessing game — we developed a brand and “created” the market we thought would consume it. Through TV ratings and impressions, and by correlating those with purchases, we had an idea of where these products were being bought. More importantly, we were able to discern household demographics, income, location, etc. What was our goal? We wanted to find out more about our audience so we could somehow “get into their heads” and understand what makes them tick. Ultimately, we wanted to find the ideal customer who would become our most profitable one. We looked to established practices like the “buying funnel” to understand the consumer’s path to purchase.

Customer-centricity was about figuring out customers’ needs, wants, and propensities, with the ultimate goal of predicting what they would do.

DIGITAL MADE THINGS SO MUCH EASIER

When digital came along, we started to map behavior onto this buying funnel because we had so much more information. We had the sites people would go to. We had their location information — where they lived, where they worked, when they checked in at the local pub, at a party, or at a retail store. At an aggregate level, we also had an idea of how people searched for the things we were trying to sell. I remember leading a bank’s transformation to bring marketing and credit card acquisition online, and fighting with the central internet team for months over embedding an innocent DoubleClick 1x1 pixel so we could measure our campaigns. This was definitely a new frontier.

Social media gave us a treasure trove of data. People were spilling their lives online — rampant threads of conversations about anything and everything. There was enormous trust within tightly knit forums that allowed people to let down their guard and be transparent about their views and opinions. This vulnerability and honesty made people like me even hungrier to correlate this notion of “intent” with motivation and, ultimately, with purchase activity.

As part of an ad platform, we were measured and compensated on our performance across three main drivers: reach, revenue, and engagement. Engagement was the most important metric. How long could we keep people on our site? What variables could we identify to make that more likely? Why was this important? Because the longer people spent on our site, the more impressions they generated, and hence the more ad inventory and the more opportunities for revenue.

THE HOLY GRAIL

I thought that if we had this whole picture of the customer — from their social profile to their online behaviors, including their business transactions — then we would stop guessing.

What if we could combine transactional information with behavioral data? In 2011, I started my own Big Data consultancy, and this is what I was selling: the ability for companies to gain deeper context into people’s lives, their motivations, their emotions, and who and what influenced them — all for profit. What we marketers had at our disposal through digital tools and automation was enormous!

I worked for a start-up that gave business insights from social media, including influence, demographics, location, sentiment, clicks, profiles, views, topics, and content (i.e., video, blogs, etc.), to help businesses know more about who their customers were as people. More specifically, their behavior.

Fast forward less than five years, and this demand had created a massive data brokerage market.

Today, companies like Acxiom, one of the largest database marketing companies, have monitored the records of hundreds of millions of Americans, obtained through 1.1 billion browser cookies and 200 million mobile profiles — an average of 1,500 pieces of data per consumer. In May 2014, Acxiom paid $310 million in cash for a company called LiveRamp, which enables marketers to use consumers’ offline purchases to better target online ads to those consumers.

To understand the scope of access to personal information, “offline data includes information from real estate, motor vehicle records, information from warrant cards, homeownership, property values, marital status, annual income, education levels, travel records, ages of children in the home, itemized store purchases from loyalty discount cards.” Pharmacies can sell your purchase data to a data broker, who will then resell this to a health insurance company.

In his book, They Know Everything About You, Robert Scheer reveals that the Department of Motor Vehicles — in most US states — “can and does legally sell driver data, [including] name, address, car model, VIN, license plates. The Transportation Security Administration, for example, purchases data from data brokers to prescreen travelers.” One consultant from the Identity Project, a non-profit initiative that opposes identity-based domestic security programs, said the best way to look at it is as a “pre-crime assessment every time you fly.”

This demand to know more — from a marketing standpoint — was innocuous at first but over time, has become insidious.

The Social Dilemma on Netflix confirmed that that same metric — engagement — has not changed to this day. It has been a way for platforms to manipulate people’s behavior. These companies use our data to build models; the more data, the more accurate the model. Shoshana Zuboff, the author of The Age of Surveillance Capitalism, calls this a “marketplace that trades in human futures” — in other words, a prediction market for future human behavior. It’s what B.F. Skinner, the famous behaviorist who saw human action as shaped by the consequences of previous actions, posited in his theory of reinforcement. Herein lies the key to modifying our behavior by offering positive reinforcement — the nudges, the likes, the comments, the notifications.

A famous quote from the documentary, The Social Dilemma:

We have successfully leveraged the attention economy and created an addiction economy… Social media has single handedly created a mechanism to enable our thirst for connections to keep us addicted and manipulate our actions in the process.

Brian Solis, who writes about the future economy, said this:

We now live three lives online — one that disappears, one that is secret, and one that sculpts our legacy.

After 20 years of data marketing experience, I now consider myself an anti-marketer. I realized that we had crossed a very dangerous line. Do businesses need all that data in order to make sound decisions? Absolutely not. The issue is that technology has set today’s standard for how information is collected, used, and managed, while policy and regulation lag behind, slow to keep up with technology harms as they arise.

THE BLATANT REALIZATION THAT NOTHING YOU DO ONLINE IS PRIVATE

In 2013, Edward Snowden exposed PRISM and the inner workings of government surveillance. Information began to surface: were Google, Facebook, and big tech giving the federal government unlimited access to our data? The events of 9/11 had created the demand for more information and resources to stop such attacks from ever happening again. I wrote a post entitled The NSA, Privacy and the Blatant Realization: Nothing You Do Online is Private and started a conversation on Facebook: “‘I have nothing to hide’ is the wrong way to think about surveillance.” My friend Julie Pippert said then, “I don’t know who in the world has nothing to hide. I have nothing ‘illegal’ but plenty to hide, and that’s why I’m not ok with this.”

Modern technology platforms — Google, Microsoft, Amazon, Facebook, IBM, and Apple (the G-MAFIA), along with Baidu, Alibaba, and Tencent (the BAT, their Chinese counterparts) — are more powerful than most people realize. If they all succeed and every online user in the world adopts their technologies, societies will change forever. Collectively, they are already initiating a global paradigm shift unlike any we’ve seen before.

On September 2, 2020, the NSA surveillance program was ruled unlawful, seven years after it was exposed by Edward Snowden. The ruling “made clear that the NSA’s bulk collection of Americans’ phone records, our emails, and our communications violated the Constitution.”

When Snowden revealed the inner workings of PRISM, the general public was concerned and up in arms at the advertising industry. People had no idea of the extent to which their information was being used. One executive from Macy’s said this:

There’s a funny consumer thing… They’re worried about our use of data, but they’re pissed if I don’t deliver relevance. … How am I supposed to deliver relevance and magically deliver what they want if I don’t look at the data?

The industry standard of data gathering is pervasive. You are still being tracked. Publishing sites — your favorite news sites — use these very mechanisms. The first time you visit a website, it sets first-party cookies that allow the site to identify you and track your activity, permitting it to determine which articles are most popular with readers. For marketers, this is not new. This is how we justified it: “we want to make your site experience better.” I spoke to Cate Huston, a director of engineering at DuckDuckGo, who confirmed that this is an “antiquated justification.”
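The mechanics are simple. A minimal sketch of how a publisher’s first-party cookie works, in Python (all names here are hypothetical, using only the standard library): the first visit gets a random visitor ID via a Set-Cookie header, later requests replay it, and the site tallies which articles each identified reader views.

```python
# Hypothetical sketch of first-party cookie tracking on a publishing site.
import uuid
from collections import Counter

article_views = Counter()  # popularity tally, keyed by article slug

def handle_request(article, cookie_header=""):
    """Return (visitor_id, set_cookie_or_None) and record the view."""
    # Parse the browser's "Cookie:" header into a dict.
    cookies = dict(
        part.strip().split("=", 1)
        for part in cookie_header.split(";") if "=" in part
    )
    visitor_id = cookies.get("visitor_id")
    set_cookie = None
    if visitor_id is None:
        # First visit: mint an ID the browser will replay on every return.
        visitor_id = uuid.uuid4().hex
        set_cookie = f"visitor_id={visitor_id}"
    article_views[article] += 1
    return visitor_id, set_cookie

# First visit: no cookie yet, so the site issues one.
vid, cookie = handle_request("ai-policy")
# Return visit: the browser sends the cookie back; same visitor ID.
vid2, _ = handle_request("data-brokers", cookie_header=f"visitor_id={vid}")
```

One random ID, replayed on every request, is enough to turn anonymous page views into a persistent per-reader history.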

What we’re witnessing today is excessive data sharing, without our knowledge and without our consent.

When you conduct a site search, information about that query is sent to Google (which operates the search box at the top of the screen). When you play a video on the site, it sets cookies on your browser that send your search and viewing history to YouTube/Google, which stores it indefinitely and combines it with your other data to complete a picture of you.

What I didn’t realize back in 2000, when I was fighting with our central bank team about embedding an innocent 1x1 DoubleClick pixel (now owned by Google) into our banking systems, was that it did not only allow me to attribute the ad from where the customer clicked; it was the first stepping stone toward user profiling.
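The attribution side of a 1x1 pixel can be sketched in a few lines. This is a hedged illustration, not DoubleClick’s actual API: the pixel URL, parameter names, and domain below are invented. The ad server embeds campaign and creative IDs in the image URL, and the server’s request log becomes the attribution data.

```python
# Hypothetical sketch of 1x1 pixel attribution (URL and params invented).
from urllib.parse import urlparse, parse_qs

attribution_log = []

def log_pixel_hit(pixel_url):
    """Record which campaign and creative led the visitor here."""
    query = parse_qs(urlparse(pixel_url).query)
    hit = {
        "campaign": query.get("cmp", ["unknown"])[0],
        "creative": query.get("crv", ["unknown"])[0],
    }
    attribution_log.append(hit)
    return hit

# The confirmation page would embed something like:
#   <img src="https://ads.example.com/px.gif?cmp=cc-fall&crv=banner-3"
#        width="1" height="1">
hit = log_pixel_hit("https://ads.example.com/px.gif?cmp=cc-fall&crv=banner-3")
# hit == {"campaign": "cc-fall", "creative": "banner-3"}
```

Because the image is fetched from the ad server’s domain, that server sees every page the pixel is placed on, which is exactly how campaign measurement shades into cross-site profiling.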

Cathy O’Neil, whose book Weapons of Math Destruction changed my path toward data ethics, said this:

people across the internet have produced quadrillions of words about our lives, our work, our shopping, our friendships. By doing this, we have unwittingly built the greatest-ever training corpus for natural-language machines.

WHO OWNS THE DATA?

Custodians of data (banks and health care providers) contend they own the data. Today that notion is being challenged, but organizations will be hard-pressed to give up control. What’s at stake for business? Will it affect business decisions? Can I remain profitable? Can advertisers change their practices and still target effectively? Dana Toering, a media VP at Walmart with extensive experience in AdTech, told me that people want better experiences but will demand more privacy from the outset. From his perspective, business can still thrive within this new reality, where consumers want more transparency and data control.

FROM A CUSTOMER-CENTRIC TO A HUMAN-CENTRIC ENVIRONMENT

This means moving from a practice where we collect whatever we need to profile customers and predict what they will do, to human-centricity, where we give people the tools and the authority to assert their right to self-determination over their personal data, within an environment that fosters this principle.

I co-founded MyData Canada in July 2020. Our organization recently responded to the call for public consultation on Privacy Law Reform for the Private Sector with our proposed recommendations. We acknowledged that laws alone cannot fix this. We need:

  1. Promotion of the privacy-preserving free flow of data across borders, i.e., harmonization of privacy and data protection standards across jurisdictions.
  2. Technology-aware digital ecosystems that promote transformation supported through regulation. This will allow low-cost, low-friction adoption by businesses of human-centric products that empower individuals while enabling responsible data sharing.
  3. A new privacy regime that is comprehensive. This means:
     1. Addressing the issues of data justice and bias in algorithms, and how people are represented as a result of the data they produce. This includes Algorithmic Impact Assessments and the establishment of clear guidelines for fair explanation while protecting trade secrets.
     2. Prioritizing children’s privacy as a means of keeping children safe while ensuring young people can thrive.
     3. Emphasizing citizens’ control of their information as well as explicit, positive consent. Last year, I conducted research on EdTech companies at a time when many of our children were forced into remote learning. What surfaced were the privacy disparities and safety harms that plague younger students behind a computer — and the behaviors and keystrokes being monitored and manipulated.
     4. Extending this to applicable third-party providers across jurisdictions and strengthening enforcement.
     5. Making data portability a reality. This is happening now with the use of verifiable credentials. We need to continue encouraging investment in innovation for responsible data practices and solutions.
     6. Giving businesses the tools to meet their core objectives while respecting Consent by Design, and supplying them with incentives to support privacy-preserving best practices.
     7. Educating and empowering individuals through healthy digital choices, including privacy best practices and digital literacy. Documentaries like The Great Hack, The Social Dilemma, and Coded Bias help tremendously to drive this awareness.

CAN WE REIMAGINE A FUTURE WHERE WE, AS CITIZENS, HAVE MORE CONTROL OF OUR INFORMATION?

It’s definitely coming, but the laws need to have teeth and they need to be enforced. However, if the government doesn’t invest in and enable an ecosystem to make this possible, things will not change.

In an advertising industry that drives hundreds of billions in revenue, the quest to build consumer relevance comes at a cost. But a new privacy tech sector is now growing. It amassed nearly $10 billion in investment in 2019, according to Crunchbase, ten times the 2010 figure.

Joe Toscano, Founder and CEO of Beacon Trust Network, left Google and traveled the world to bring awareness to what was happening in Silicon Valley. He wrote a book called Automating Humanity and is known for his work on Netflix’s The Social Dilemma. He said this:

Imagine a world in which we own our data. We have a right to determine how it’s to be used, and it’s matched in a world in which the terms of service are written in a language we can understand and reasonably be expected to give consent. Imagine a world in which we profit off our data instead of giving it away for free to companies that are used to making billions off us. It’s coming. But getting there depends on us acting and demanding more from companies and our own legislators.

The long-term gain is really about doing what’s right and fair, by removing inequities. This means a change in mindset for innovation to create better human and business outcomes — also known as a positive-sum market solution.

This article is part of the BEACON Newsletter. If you would like to subscribe, please sign up.


Advocating for human-centred and fair distribution of #AI and #DataPrivacy — Author/Writer, Forbes; Co-Founder, MyData Canada; PIISA.org; Women in AI Ethics