Dynamic Pricing Algorithms Need to be More Transparent, Equitable, and Human-Centered.

Yatin Bhat
SI 410: Ethics and Information Technology
Feb 17, 2023

Have you ever noticed the price of a t-shirt change since the last time you purchased it? Have you and a friend tried to purchase airline tickets online, only to see different prices on each of your screens? In the age of the Internet, the prices of everything from Taylor Swift's concert tickets to Uber rides are constantly changing, and it's not always clear to the average consumer why. These are all examples of dynamic pricing algorithms: companies adjusting prices for items based on real-time market conditions. Dynamic pricing needs major improvements in algorithm transparency, equity measures, and human-override controls to solve the technology's ethical issues.

Dynamic pricing algorithms can use information like supply and demand, competitors' prices, customer demographics, personal preferences, and seasonal factors to change the prices that consumers see. Dynamic pricing allows companies like Amazon to make constant adjustments to prices, which can help sellers make more money. Customers also get some benefits from companies' algorithms: personalized recommendations, seasonal sales, early-bird discounts, and more. However, diving deeper into how dynamic pricing works reveals numerous ethical issues that need to be addressed.
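To make the idea concrete, here is a minimal sketch of what such an algorithm might look like. The inputs, weights, and caps are my own invented assumptions for illustration; no real company's pricing formula is public or this simple.

```python
# Hypothetical sketch of a dynamic pricing rule. The signals and weights
# below are illustrative assumptions, not any real company's algorithm.

def dynamic_price(base_price, demand, supply, competitor_price, seasonal_factor):
    """Adjust a base price using real-time market signals."""
    # Scarcity (demand outpacing supply) pushes the price up, capped at 2x.
    scarcity = min(demand / max(supply, 1), 2.0)
    price = base_price * (0.5 + 0.5 * scarcity)
    # Stay loosely anchored to what competitors charge.
    price = (price + competitor_price) / 2
    # Apply a seasonal markup or discount (e.g. 1.1 during peak season).
    return round(price * seasonal_factor, 2)

# A $20 shirt when demand moderately exceeds supply during peak season:
print(dynamic_price(20.0, demand=150, supply=100,
                    competitor_price=22.0, seasonal_factor=1.1))
```

Even this toy version shows why prices shift constantly: every refresh of demand, inventory, or competitor data produces a new number on your screen.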

Dynamic pricing algorithms can maximize revenue by charging customers exactly what they’re willing to pay.

The workings of dynamic pricing algorithms are rarely visible to customers, which creates major concerns about data collection and privacy. As a customer, information like your age, gender, location, and personal preferences is often collected without your knowledge, and this data is directly used to alter prices and generate large profits for companies. Companies can even hide information that could benefit customers, such as price comparisons. When I'm online shopping, I've always been surprised when third-party online coupon applications like Honey can find me identical products on other websites, or even coupon codes on the same site itself. It's interesting to note that most companies have zero incentive to inform me of better alternatives — in fact, that's probably the last thing they want to do!

Davis & Chouinard discuss how technological artifacts “request, demand, allow, encourage, discourage, and refuse” individuals to do certain things. Dynamic pricing algorithms usually “refuse” to reveal their workings to customers, which gives companies immense power and control. At the same time, dynamic pricing often “encourages” customers to spend money by showing them products that are enticing to them. I believe that companies need to better inform users about the variables that are used in their algorithms. If my personal information is being collected to change the prices I see, then I deserve to understand the algorithms that are being used.

Airbnb is a great example of a company using dynamic pricing more transparently. Airbnb has a detailed guide to "Smart Pricing" that describes how daily trends and seasonal shifts can change prices, and hosts are able to decide whether they want Smart Pricing on or off. However, Airbnb's increased transparency probably exists because the company earns greater profits when hosts use dynamic pricing. When I book vacations as an Airbnb customer, I've noticed that Airbnb doesn't offer much information about how to secure the best prices. For recommended booking "strategies", I've had to consult third-party websites. Companies should be willing to disclose more about how their algorithms work to customers, even if it might reduce profits.

Airbnb makes it simple for hosts to turn on “Demand-based pricing” through their system.

Dynamic pricing algorithms can also cause major equity concerns in areas like economic status, race, and gender. One type of dynamic pricing is “segmented pricing,” which creates differing prices for customers based on categorization. However, segmented pricing can create inequities based on the categories that people are classified into. Although an algorithm can classify individuals based on certain “buckets”, there’s no guarantee that it considers specific details about an individual. For example, I’m a college student with minimal income, but live in an expensive college town. An algorithm might assume that I’m wealthy based on my zip code, but might not be able to discern my individual financial status.
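The college-town example above can be sketched in code. The zip codes, buckets, and markups here are hypothetical, invented purely to show how coarse segmentation misclassifies individuals:

```python
# Hypothetical illustration of segmented pricing. The zip codes, buckets,
# and markups are invented to show the equity problem, not real data.

AFFLUENT_ZIPS = {"48104", "94301"}  # e.g. expensive college towns

def segment_markup(zip_code):
    """Assign a price markup from a coarse demographic bucket."""
    # The algorithm only sees the zip code, not individual finances:
    # a low-income student in an expensive town lands in the
    # "affluent" bucket and is charged the higher rate.
    return 1.15 if zip_code in AFFLUENT_ZIPS else 1.00

print(segment_markup("48104"))  # a broke student here still pays the markup
```

The inequity is baked into the bucket itself: no amount of tuning the markup fixes a classification that cannot see the individual behind the zip code.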

Another example of an equity concern is seen when dynamic pricing causes the prices of specific items to spike because of a shortage — if a certain population depends on the item, they could be disproportionately affected in an unfair way. For example, let's say that a specific children's cough medicine was in short supply during flu season — a dynamic pricing algorithm would likely increase the price. Even though the algorithm wasn't targeting children, it would indirectly harm that section of the population. During the height of the pandemic, I remember having to pay incredibly high prices for items like toilet paper, hand sanitizer, and masks. Especially since some of these items were important for health reasons, it was extremely stressful to face high costs for products that I consider to be essential items.

Author Geoffrey Bowker discusses how "… we stand for the most part in formal ignorance of the social and moral order created by these invisible, potent entities". The "invisible, potent entities" are classifications: dynamic pricing algorithms can create a "social and moral order" that is largely invisible to customers. It's also difficult to legally prove that companies discriminate this way, because plaintiffs must show that a specific outcome was caused by discrimination, which is not easy to do. As a result, the "burden of proof often allows programmers and corporations to shirk responsibility and repercussions". Without professional legal help, I would have zero clue how to confront a company that I felt discriminated against by. Even then, it's unlikely that I would be able to gather the funds and evidence required to argue my case. Arguing that data is collected in an "apolitical way" is not a valid defense, especially when current dynamic pricing methods clearly have inherent biases. Companies should still be allowed to use customer information in pricing strategies, but they need to consider equity concerns in the process. This can happen in a variety of ways: feedback surveys, research studies, increased transparency, and more. By considering the potential equity impacts of dynamic pricing algorithms, companies can create fairer, less biased outcomes for customers.

Finally, dynamic pricing tools can have very harmful impacts when there are no human control measures to override the automated algorithms. This issue is most prevalent during emergencies like environmental disasters, terrorist attacks, and pandemics, when supply and demand swing drastically. For example, demand for face masks spiked during the pandemic: people wanted them for health reasons, but dynamic pricing resulted in exorbitant costs that were unaffordable for many. Dynamic pricing treats these events like any other change in supply and demand, and there is typically no default method to override the algorithms. As a result, factors like public health and safety are not accounted for, which is extremely dangerous to consumers.

Uber's surge pricing feature raises prices during times of high demand, and it has historically had harmful impacts during public emergencies. Uber fares increased as much as 500% during a 2016 bombing in New York City and a 2017 terrorist attack in London. In both scenarios, people were unable to get to safety because of incredibly high prices, and there was no immediate way for Uber engineers to override the system and bring prices down. During major snowstorms in the area where I live, I have seen friends struggle to get home from places like the airport because of high Uber fees. I would argue that Uber should prioritize the safety of customers over profits in these scenarios.

An image displaying what Uber’s map UI looks like during “Surge Pricing”

An article in Harvard Business Review describes how, because dynamic pricing focuses on "economic efficiency," it often ends up "[blocking] access to essential needs, magnifying the possibility of societal harms". According to Davis & Chouinard's paper, technology "[discourages] when one line of action, though available should subjects wish to pursue it, is only accessible through concerted effort". Dynamic pricing tools "discourage" human intervention in algorithms: although it's possible for companies to intervene, the process is often complicated and only used as a last resort. Companies need to make it simpler to override algorithms when the time arises; when people's health and lives are at stake, this could be a critical improvement. This could be done in a variety of ways, like creating a dedicated department to manage crisis response. As a consumer, I'd appreciate "Emergency" buttons within apps, or even seeing companies be more responsive to public demand in general.
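A human-override control doesn't have to be complicated. Here is one way such a safeguard might look: a surge multiplier that a human-declared emergency flag caps. The function, threshold, and flag are assumptions of mine, not how Uber or any ride-share company actually works.

```python
# Sketch of a human-override control for surge-style pricing. The cap
# value and emergency flag are assumptions, not a real company's system.

EMERGENCY_CAP = 1.2  # max multiplier allowed once an emergency is declared

def surge_multiplier(demand, supply, emergency_declared=False):
    """Compute a surge multiplier, capped when a human declares an emergency."""
    multiplier = max(1.0, demand / max(supply, 1))
    if emergency_declared:
        # A human-triggered flag overrides the market signal, keeping
        # rides affordable when public safety is at stake.
        multiplier = min(multiplier, EMERGENCY_CAP)
    return round(multiplier, 2)

print(surge_multiplier(500, 100))                           # market-driven surge
print(surge_multiplier(500, 100, emergency_declared=True))  # capped by a human
```

The point of the sketch is that the override is one flag and one `min()` call; the hard part is organizational, deciding who gets to flip that flag and how fast.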

Luciano Floridi discusses how "Our technological tree has been growing its far-reaching branches much more widely, rapidly, and chaotically than its conceptual, ethical, and cultural roots". This concept directly applies to the growth of dynamic pricing: the technology's growth has been rapid and expansive, but has failed to fully account for ethical issues. Over the past couple of decades, dynamic pricing has emerged as a powerful tool in industries like advertising, sports, entertainment, and e-commerce. As a pricing method, it has reshaped the way that companies interact with customers by using data and algorithms. However, business incentives of increased profits and customer personalization are often prioritized over major ethical concerns that arise with the technology. Specifically, companies need to invest in improving algorithm transparency, equity measures, and human-override controls.
