Who says you can’t buy love? Technology that can guide and shape our most intimate relationships is on the horizon, with simulations and robotics among its first forms. In the near future, AI, augmented reality, wearables, and digital environments will also change the way we meet, interact, and end our relationships.
But do we understand all the implications love tech will have? Technology is a two-way street. When we interact with it, it interacts with us. We often don’t stop to consider what the entity on the other end of that relationship might take from us in the process. The widespread systemic consequences of personal convenience are not always evident. The unintended consequences and uses of any given technology can be difficult to predict. As a result, we might be ill-prepared for future possibilities that impact our everyday lives and our society. The future of love is no exception.
To explore what the near future may hold, I created a set of four commercials from the future featuring ‘Eros Labs’, a fictional love tech company from 2024. The intention was to demonstrate and understand how seemingly desirable technology (parts one to three) could have widespread systemic consequences (part four).
So, how might a future of love play out?
Let’s start at the beginning. How might technology allow us to meet the ‘right’ partner? What will access to love tech do to our sense of judgement, our values, and what we perceive as important? And what if we could navigate our way through tricky situations with more ease? It’s easy to imagine the upside of Love Maps, but they may have negative consequences too. For instance, someone could use this same technology to gaslight their partner. What if someone hacks into the software and uses it to exploit or blackmail you? On a less sinister note, what is the role of mystery and novelty in love, and is that worth preserving?
Potential Alternative Uses
- HR: Companies might find recruitment, employee engagement, and mentoring/training easier with such tech, but do we want them to have access to our personal selves?
- Personalized education: could such tech help us determine how best to educate people and prepare them for the future of work?
- Augmented parenting: will this type of technology enable hyper-competitive and/or data-driven parenting?
Can you fake it until you make it? Clues about whether or not a relationship is working are peppered in details like the tone of your voice and the words you use, and artificial intelligence may be able to identify and/or predict whether or not things are headed south.
We know that relationships require work, and a little nudge in the right direction might be helpful. But if our environment conspires with our possessions and our biology to direct us toward a particular outcome, what happens to our perception of reality? Such technology should raise concerns about who or what triggers the manufacturing process and how far that process could go. On the flip side, the same technology that perpetuates love might keep an abusive relationship going, allow for mass manipulation, and be vulnerable to hijacking.
Potential Alternative Uses
- Programmed public spaces: could digital environments manufacture Orwellian-style love for politicians during elections and thereafter?
- Institutionalized love: how about loving environments in schools, religious institutions, retail locations, or your workplace? Is this a future of branding?
- Entertainment: what would happen to concerts, festivals, and sporting events if they could manufacture love and/or hate?
Planned Emotional Obsolescence
Is it better to have loved and lost without a sense of loss? Some of us might choose to avoid heartbreak altogether if we had the option. Our messy emotions are a fundamental part of the human experience. With negative emotions removed, would that experience still be a human one? How would this affect our sense of empathy and altruism? What would be the socio-economic and political drawbacks if we all fell out of love with someone or something at once?
Potential Alternative Uses
- Pre-planned emotional obsolescence: can we desensitize people to situations before they occur? What if this application is used by the military or in therapy?
- Economic growth: what if we were subjected to manufactured fads on a cycle that stimulated the economy when needed?
- Workforce planning: layoffs, attrition, and mergers might be easier to undertake, but would organizations trigger change with consent?
Not so cute anymore, is it? The future depicted throughout the four videos is within our grasp. It is full of privacy issues, moral dilemmas, and systemic consequences. If this possible future concerns you, remember that a version of ‘Eros Labs’ exists today, and Facebook is no better. The company has knowingly and willingly undermined democracy, and it is the time we spend on that platform that empowers it to do so. Yet how many of us continue to put our personal need for convenience and connection ahead of our collective need for democracy?
If a social media platform can allow a foreign power to manipulate U.S. elections, imagine what Deepfakes can do. The alternative uses of reality-bending Deepfake filters are extensive. Those in power could undermine truth and our perception of reality, with far-reaching socio-economic, political, and environmental consequences. Worse, that power could be wielded by anyone who can create Deepfakes — individuals who have not been democratically elected and/or might be serving an undesirable agenda. There is little policy or regulation that can prevent or combat this possibility. That is a real danger.
Even if a technology is created for good or intended to be neutral, others may find uses we haven’t thought of. As adoption of the internet of things, wearables, smart homes, and AI becomes more commonplace, so will the potential for abuse. It’s important to keep in mind that just because we can design something doesn’t mean we should. How do we ensure we are designing for the collective good when policy lags behind innovation and unchecked entrepreneurship races ahead?
For now, we need to challenge ourselves and each other about the future we are creating, and how our desire for convenience and connection may be fueling undesirable collective outcomes. Exploring possible use cases through stories, scenarios, and experiential futures allows us to tease out what some of those consequences may be. By understanding the possibilities, we can begin to design ways to protect people — as individuals and as a society — against the adverse effects of convenient technology.
Even if we can buy love, that doesn’t mean we should do so blindly.