The following is a repost from a newsletter I just launched. It aims to help organizations shape emerging technologies with society, planet, and the long-term in mind.
We face a systemic issue when it comes to innovation: the development of new technologies is currently outpacing our ability to foresee, understand, and mitigate their negative implications. Instead of promised techno-utopias, we are often left grappling with the undesired, unintended consequences of entrenched technologies.
This is certainly not a new issue. In 1932, H.G. Wells lamented the dearth of dedicated people to study the future consequences of new inventions. In 1958, Aldous Huxley warned us about the consequences of our current mode of innovation:
“We mustn’t be caught by surprise by our own advancing technology. This has happened again and again in history with technology’s advance… people have found themselves in situations which they didn’t foresee and doing all sorts of things they didn’t want to do.”
Much of this issue is rooted in our cultural and institutional reverence for innovation (or, more specifically, technology). We want change. We want to make things better; to open new possibilities; to expand the scope of imagination. We celebrate and incentivize extending the technological frontier.
Our cultural inclination for innovation can be seen in the stump speeches of the right and the left, in the idolization of entrepreneurs, and in the forms of social, aural, and visual media that promote ‘the hustle.’
Our political economy is one where modern states largely delegate innovation to individuals acting on their own accord: taking risks, making things, and releasing them into the world. States grant limited liability to corporations to cap downside risk, and kindle innovation through a range of programs and incentive schemes.
While there is much to appreciate about the outcomes this model has brought about, granting the creators of technology tremendous power to influence the lives of others — without requiring them to fully account for the harms and risks — has produced an asymmetric system of creation, one in which technology generates short-term solutions and long-term problems.
If we are to survive technology (as John von Neumann put it), our organizations need to take a more contemplative, farsighted approach to commercializing innovation.
This newsletter is intended to help technologists critically reflect upon, and design for, how their technology impacts society over the long-term. It shares some thoughts related to my research on responsible innovation, which draws from academic literature from science and technology studies (STS), anthropology, sociology, philosophy, management science, and design.
Whether you are a developer, designer, strategist, manager, executive, social activist, or passing-by reader, I hope these writings instil a mindful perspective towards technology. The resources may raise more questions than they answer, but I hope they expand the set of possible futures you can work towards, accentuate the legacy of your technologies, and help forestall outcomes we would all be better off without.
With that said, let’s kick off the newsletter by exploring how the process of creating technology relates to the story of Frankenstein.
No, not that one.
Mary Shelley’s horror story, Frankenstein; Or, The Modern Prometheus, is commonly understood as a tale about technological crimes against nature. Frankenstein’s creature represented a Pandora’s box that should never have been opened.
In his essay “Love Your Monsters,” the French sociologist Bruno Latour used this story to analogize our misdirected attitudes towards technology. We often mistake Dr. Frankenstein for the unnamed “monster” he created — and in doing so, we miss the real lesson of the story.
Frankenstein’s crime was not that he invented a creature through his relentless ingenuity, but that he did so with little consideration of the consequences and no commitment to how his creation should relate to the world. Frankenstein abandons his creation. On the Alpine glaciers, the creature claims he was not born a monster, but made one through abandonment:
“I ought to be thy Adam; but I am rather the fallen angel whom thou drivest from joy for no misdeed… I was benevolent and good; misery made me a fiend. Make me happy, and I shall again be virtuous.”
Dr. Frankenstein was, arguably, the monster.
202 years after Frankenstein was published, the plot of innovation remains largely unchanged. Our society incentivizes us to commercialize ideas before we have time to critically assess their implications — or even to ask whether they are worth pursuing in the first place. When our products reach their markets, they become entrenched in society. We have a hard time changing or reversing our creations, even when their harms become clear, and especially once a company goes public. We often flee, as Dr. Frankenstein did, from what we create. Those with an urge to create face the uncomfortable reality that what they make might bring about more than they intended.
If we are truly committed to technological innovation, perhaps we must commit ourselves to loving our machines. That is, to commit ourselves to putting deliberate thought into the consequences of our creations, and caring for our creations from cradle to grave.
We are like gods insofar as we create things that influence the lives of others, and the planet on which we live. Whatever your position, there is a role you can play in demonstrating responsibility for the things you help create.
The next editions in this series will provide you with practical new frames of thinking to do exactly that. We’ll be diving into topics like the politics of technology, philosophies of recasting matter, critical design, the communications landscape, space colonization, and reincorporating history and foresight into our present-day decision making.
Some housekeeping ~
Thanks for reading the inaugural newsletter! It’s still a work in progress. Let me know if there are any topics you’d like to see.
If you or someone you know is working on an emerging technology and has an interest in contemplating its long-term implications to society, send me a note. My current work aims to improve this process through academic research.
Sign up here for the 2100 newsletter: