Manipulative design strategies – Exploiting users’ decisions to change their behavior

Exploring the Rollercoaster approach in machine-learning algorithms that use strategically placed data to alter users’ decision-making

Noah Gustafson
SI 410: Ethics and Information Technology
11 min read · Feb 22, 2022

--

With most digital products, the deal is that you trade time for a service. Google built a search tool that can answer almost any question we might have, and after years of using it, most people's reflex is now to Google the answer. That seems beneficial, and often it is. But we should ask ourselves what we actually gain from each technological marvel we invest our time in. What happens when a service's machine-learning algorithms know enough about you to deploy strategies that bend your sense of responsible usage and stretch the time you spend with the service? When users are misled into decisions they wouldn't have made on their own, the trade may no longer be to their benefit.

Manipulative design strategies go too far when they drastically erode users' ability to make responsible decisions. When a product deliberately rewrites that ethical contract through deceptive techniques, without the user's conscious awareness and approval, that is where the line should be drawn: addictive manipulation.

A company must understand what a user wants before it can change what they do.

The 1st, most fundamental piece of manipulative design

Did someone say big data? This part is fairly straightforward. You probably already know that companies collect your private information through the products you use and trade that data with other companies. These companies often unethically alter or prioritize our data while telling us to keep building and adopting technology quickly and to just "use the product anyway" instead of addressing the problems with machine learning. In Race After Technology: Abolitionist Tools for the New Jim Code, Ruha Benjamin argues that new methods of social control are being introduced as companies collect more and more of our data. The alteration of our decisions is growing both at the level of small individual behaviors and in our larger social structures. Although social manipulation is very real and prevalent, we won't focus on it here; instead, we'll be concerned with how behavioral data is collected and converted into prompts for immediate interaction.

Manipulative design strategies use this data to inform something we'll come back to later: pivotal moments in a user's journey. Algorithms collect data dynamically, within the same live session in which you use a product, to decide what content gets pushed in front of you next. Without this behavioral data, a company can't successfully change your decision-making at a fundamental level.
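To make that concrete, here's a minimal sketch of what in-session behavioral logging might look like. The event names and fields are hypothetical illustrations of the kind of data described above, not any real product's telemetry schema:

```python
from dataclasses import dataclass, field
from time import time

@dataclass
class SessionTracker:
    """Collects behavioral signals during one live session.
    (Hypothetical schema, for illustration only.)"""
    events: list = field(default_factory=list)

    def log(self, item_id: str, action: str, dwell_seconds: float) -> None:
        # Every interaction (view, like, skip...) becomes a data point
        # the recommender can react to before the session even ends.
        self.events.append({
            "item_id": item_id,
            "action": action,
            "dwell": dwell_seconds,
            "ts": time(),
        })

    def engagement_score(self, n_recent: int = 5) -> float:
        """Average dwell time over the last few items: a crude proxy
        for how engaged the user is right now."""
        recent = self.events[-n_recent:]
        if not recent:
            return 0.0
        return sum(e["dwell"] for e in recent) / len(recent)
```

A live ranking system would poll something like `engagement_score` after every swipe and let the result steer what it serves next.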

The user must be fully immersed through strategic focus on where and how content is placed.

The 2nd, strategic piece of manipulative design

Consider a time you were browsing new positions on LinkedIn's mobile app during a job search, or scrolling Instagram's recommended feed in the restroom to pass the ten minutes you'd be in there. Both apps simply show you the recommended content loaded at app launch, hoping the most recent and relevant items sit at the top for you to interact with. Why? Because they know you'll leave in a few minutes, so dynamically tailoring content mid-session matters less. Once you're done taking care of business in the restroom, it's easy to hop off. These feeds aren't necessarily manipulating you into an addictive scrolling behavior; the main addictive effect most social media feeds produce is the occasional reminder to return to the app.

Manipulative design strategies aim to create addiction in the present moment of use. They don't just make you crave returning to the app, like most scrolling feeds do; within each session, you become so immersed that you lose track of what you're doing and for how long. If manipulative patterns are at work, your stay in the bathroom may stretch to twenty minutes without you even realizing it. Why would a product want this? More content and more ads in front of you. By hooking you into returning and then hooking you into staying longer each visit, time spent with the product compounds, and the product's owners just think about all the money they're making from ad revenue. ☁️💸☁️

Let's look at casino design: not the gambling machines themselves, but their placement. Over the past century, architects who specialized in casinos learned exactly how to keep winner-hungry patrons playing for days on end by engineering a high point for every low point. Some of the most decorative and exciting slot machines sit right at the entrance, yet they carry incredibly high risk and low reward. The design gets you excited, lets you play briefly, and then stokes the desire to go deeper and bet bigger. Once you're hooked and farther into the building, high-commitment and low-commitment machines are strategically scattered around the room to keep you on an emotional rollercoaster of excitement, loss, excitement, loss, excite…you get the idea. If every machine offered a similar experience at a similar commitment, you'd eventually "wake up" and realize you're riding a flat rollercoaster, which is stale and no fun. It's like driving a flat desert highway with no turns or hills: not great.

Know when a user craves certain content and how long they will interact with it before burnout.

The 3rd, most effective piece of manipulative design

Manipulative design relies not only on how and where deceptive visuals are implemented, but also on the psychologically strategic timing of when content is revealed and how long a user interacts with it before seeing something new.

Apps that prioritize session duration know when to use content to manipulate users into increased use.

Session duration is the length of time accumulated while a user makes some number of decisions and interactions between set starting and ending points of a process. For most mobile app companies, this means the interactions made between opening and closing the app.
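As a toy example, assuming a simple event log of (timestamp, interaction) pairs, session duration is just the gap between the opening and closing interactions:

```python
from datetime import datetime

# One hypothetical session: interactions between the starting point
# (app open) and ending point (app close).
events = [
    (datetime(2022, 2, 22, 9, 0, 0), "app_open"),
    (datetime(2022, 2, 22, 9, 2, 30), "like_video"),
    (datetime(2022, 2, 22, 9, 6, 45), "share_video"),
    (datetime(2022, 2, 22, 9, 8, 10), "app_close"),
]

def session_duration(events) -> float:
    """Seconds elapsed between the first and last interaction."""
    return (events[-1][0] - events[0][0]).total_seconds()

print(session_duration(events))  # 490.0 -> roughly an 8-minute session
```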

As session duration per user increases, a product's retention rate rises and its churn rate falls, increasing the quantity and quality of ads viewed and interacted with, which is what creates revenue. One of the newest and most successful examples of a product using this manipulative design technique today is, you've probably heard of it, TikTok. But let's come back to that.

So how do companies identify the most effective moments in a user's journey to put content in front of them and alter their behavior? Machine-learning algorithms learn to reveal personally captivating content at the moment a user hits a critical emotional point.

Robert Plutchik’s wheel of emotions

To drive behavioral change, McKinsey tells us, identifying the beliefs, habits, and peak moments of a user's journey yields enough data to shift behavior at a fundamental level. Why is this possible? Something called the Peak-End Rule:

People judge an experience largely based on how they felt at its peak and at its end, rather than the total sum or average of every moment of the experience.

Here’s a graph to explain it further (please ignore my chicken scratch):

The Peak-End Rule
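If the chicken scratch doesn't land, here's the same idea as a toy calculation. It uses the common simplification that the remembered score is the average of the peak moment and the final moment; the session numbers are made up:

```python
# Two sessions rated moment-by-moment on a 0-10 scale (made-up numbers).
# Both average exactly 6, yet the Peak-End Rule predicts they will be
# remembered very differently.
flat_session  = [6, 6, 6, 6, 6, 6]
spiky_session = [4, 9, 3, 8, 2, 10]

def average_score(moments):
    return sum(moments) / len(moments)

def peak_end_score(moments):
    # Remembered experience ~ mean of the single best moment (peak)
    # and the final moment (end).
    return (max(moments) + moments[-1]) / 2

print(average_score(flat_session), peak_end_score(flat_session))    # 6.0 6.0
print(average_score(spiky_session), peak_end_score(spiky_session))  # 6.0 10.0
```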

What if a product's strategy team could pepper multiple peak-end moments throughout a user's journey? Doing so would manufacture behavioral change at scale, with significant and lasting effects on users' emotional attachment to a product or service. Rather than experiencing a few dopamine spurts along one consistently sloped up-and-back-down journey, the user's peak moments multiply into many mini-stages arriving at a faster rate within the larger journey. Using behavioral data to pack a ton of mini peak-end stages into the larger journey raises both the rate and the intensity of the user's emotional responses at the most fundamental level, tying that user to the addictive nature of the app. I like to call this the Rollercoaster Approach.

The Rollercoaster Approach is when machine-learning algorithms use behavioral data to prolong a session's duration by dynamically inducing behavioral change on the downslope of users' emotionally pivotal moments.

Let's look at what I mean by the Rollercoaster Approach using a graph. Normally, content is introduced in slow, linear inclines and declines. The Rollercoaster Approach shows how content can be strategically placed at the perfect moment after a low point to re-engage the user.

The Rollercoaster Approach
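Sketched in code, the approach might look something like this; the dwell-time threshold and the pool of "proven winners" are assumptions for illustration, not anyone's real algorithm:

```python
import random

LOW_POINT = 2.0  # assumed dwell-seconds threshold marking disengagement

def next_item(recent_dwells, regular_queue, proven_winners):
    """Rollercoaster sketch: serve the normal queue while engagement
    holds, but the moment average dwell over the last few items dips
    below the low-point threshold, inject a known crowd-pleaser to
    spike engagement back up."""
    recent = recent_dwells[-3:]
    if recent and sum(recent) / len(recent) < LOW_POINT:
        return random.choice(proven_winners)  # the "peak" placed right after the dip
    return regular_queue.pop(0) if regular_queue else random.choice(proven_winners)

# A disengaging user (dwell times falling) gets a proven winner next:
queue = ["new_clip_1", "new_clip_2"]
winners = ["all_time_favorite_1", "all_time_favorite_2"]
print(next_item([2.0, 1.5, 0.8], queue, winners))  # one of the favorites
```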

TikTok's design strategists uncovered a game changer. On most apps, a user's final moments of scrolling arrive when a few blocks of content in a row feel less interesting than the rest, even when everything shown relates to a topic the user cares about. What did TikTok do differently? They took advantage of this "end moment." By analyzing users' behavior at that point, and measuring which specific pieces of content (not just categories) made the biggest emotional impact through interaction level and time spent, they learned to place the content that creates "peak moments" directly after the content that creates "end moments." It's a great Pavlovian conditioning strategy to keep you scrolling. If you scrolled through only one category of content (say, tech or dog videos), most of it would blur into one equivalent experience and grow stale over time, with just one or two peak-end stages, and you'd leave sooner. Pepper more peak-end stages throughout the experience, pushing tailored, interesting, emotional content whenever other content gets boring, and the whole experience turns into a rollercoaster of ups and downs that keeps the user on far longer.
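Here's one way that peak-after-end ordering could be sketched; the engagement scores are illustrative assumptions, not TikTok's actual model:

```python
def peaks_after_ends(items, predicted_engagement):
    """Order a feed so each likely "end moment" (low predicted
    engagement) is immediately chased by a likely "peak moment"."""
    ranked = sorted(items, key=lambda i: predicted_engagement[i])
    feed = []
    # Pair the weakest remaining item with the strongest remaining one.
    while ranked:
        feed.append(ranked.pop(0))      # probable end moment...
        if ranked:
            feed.append(ranked.pop())   # ...followed by a probable peak
    return feed

scores = {"clip_a": 0.2, "clip_b": 0.9, "clip_c": 0.4, "clip_d": 0.8}
print(peaks_after_ends(list(scores), scores))
# ['clip_a', 'clip_b', 'clip_c', 'clip_d'] -- low, high, low, high
```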

Let's think about TikTok's algorithm in terms of momma's cooking. You live out of town and stay at your parents' house for a two-week visit. Your mother asks what type of food you've been into lately. You tell her you've grown to love Mexican food and eat it regularly. Now Mom knows to make Mexican food most nights, but not all. She also remembers that your favorite dish is chicken shawarma with hummus, so she buys the ingredients for that too. Her cooking schedule looks like:

  1. Monday: Salsa-Verde Enchiladas
  2. Tuesday: Beef Tacos
  3. Wednesday: Corn Salsa Enchiladas
  4. Thursday: Eat out @ Los Amigos Fiesta
  5. Friday: Chicken Burritos
  6. Saturday: Tamales
  7. Sunday: Chilaquiles
  8. Monday-Sunday: TBD

She asks for feedback every night after dinner: "How was the food?" You rate it on a scale of 0–10, from blah to great (by the way, never do this, unless your answer is always 10). Each night, your mom writes your rating on her cooking schedule. By the end of the two weeks, your responses looked like:

  1. Monday: Salsa-Verde Enchiladas (8)
  2. Tuesday: Beef Tacos (7)
  3. Wednesday: Corn Salsa Enchiladas (9)
  4. Thursday: Eat out @ Los Amigos Fiesta (6)
  5. Friday: Chicken Burritos (8)
  6. Saturday: Tamales (7)
  7. Sunday: Chilaquiles (3)
  8. Monday: Corn Salsa Enchiladas (9)
  9. Tuesday: Quesadillas (8)
  10. Wednesday: Carnitas & Beans (1)
  11. Thursday: Chicken Shawarma & Hummus (10)
  12. Friday: Chicken Burritos (8)
  13. Saturday: Chicken Shawarma & Grape Leaves (8)
  14. Sunday: Beef Tacos (7)

Mom figured out that when she made a meal you didn't like (the chilaquiles), she could simply feed you something she knew you'd love the next day to keep you interested in Mexican food. That way she wouldn't need to throw out all the planned meals in the fridge and buy different groceries. When it happened again, to an even worse degree, she served something she knew you loved and hadn't eaten in a while, something completely different from Mexican food. That created excitement for something different yet familiar. If you stayed a third and fourth week, she'd stock up on Middle Eastern food, because she knows you're growing tired of Mexican food and loving this new introduction of Middle Eastern dishes.
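Mom's recovery policy, sketched in code with made-up rating thresholds that mirror the two weeks above:

```python
def plan_next_meal(last_rating, planned_menu, favorites):
    """Keep the planned rotation while ratings hold, rescue a bad
    night with a same-category favorite, and answer a disaster with
    a beloved dish from a completely different category."""
    if last_rating <= 2:                        # carnitas & beans (1)
        return favorites["different_category"]  # -> chicken shawarma & hummus
    if last_rating <= 4:                        # chilaquiles (3)
        return favorites["same_category"]       # -> corn salsa enchiladas
    return planned_menu.pop(0)                  # ratings fine: stay the course

favorites = {"same_category": "Corn Salsa Enchiladas",
             "different_category": "Chicken Shawarma & Hummus"}
print(plan_next_meal(3, ["Quesadillas"], favorites))  # Corn Salsa Enchiladas
print(plan_next_meal(1, ["Quesadillas"], favorites))  # Chicken Shawarma & Hummus
```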

The momma's-cooking example plays out over two weeks; on TikTok, it can play out over 20 minutes. Imagine Mexican food as the category of videos you start with. The first planned week of meals is like TikTok's pre-loaded bridge-jumping videos, waiting to see how you interact with them. The second week (TBD) gets tailored based on the choices you make during use. When you stop interacting and begin to scroll quickly, your subconscious desperately hunting for interesting content, TikTok throws in more highly viewed or personalized bridge-jumping videos. If that's not enough, it serves content it knows you love but haven't seen in a while, creating the excitement of something different yet nostalgically familiar. If a video of guys jumping off a bridge looks like an "end moment," TikTok just queues up some cat videos next, because it knows you love those. Fading out of the bridge-jumping videos and into the cat videos, it has you again. You've crossed numerous peak-end moments by the time you've subconsciously accepted the decision TikTok made for you to change what you're viewing entirely. This happens so many times that you probably don't even remember what you started watching when you hopped on: Mexican food. "Wait, Mexican food was from the other example…or was it?" BAM. Hooked.

I encourage you to check out growth.design's breakdown of TikTok's data-informed rollercoaster algorithm, which turns your initial decision to briefly check the app into an addictive desire for more content that keeps you scrolling. Most of us are excited to try out new apps and products, but sometimes that's dangerous. Ruha Benjamin explains that to resist the behavioral and social alteration of our viewpoints and habits, our antennae should go up whenever we hear promises of "better, faster, fairer" technology, asking what the hype may be hiding before we hand over more of our data to highly manipulative products. We must be cautious and responsible.

Users' emotions, tied to specific and systemic technological habits, reveal deeply personal information. That data can be used to tailor content to your liking, but it can also be used to emotionally undermine your ability to make decisions and act on them, making choices for you that you otherwise would not have made. Here at Hooked Users Anonymous, we call that addiction. Machine-learning algorithms collect behavioral information so they can place emotionally captivating content in front of the same user in the most immersive locations, at the most vulnerable moments, for the most satisfying lengths of time, making the difference between a user's churn and conversion. Applied at the perfect moment across how, when, and where content is placed, data lets companies spike their users' emotions (usually delight or joy), building new neurological pathways at a pace far beyond what poorly timed content achieves. Guess what that does, folks? We get addicted. My dopamine levels spike and I want more content, because nothing else matters right now.
