Re-engineering humanity and rethinking digital networked tools. A conversation with Prof. Brett Frischmann.

Tim Richmond
Published in SingularityNET
7 min read · Aug 2, 2019

We become what we behold. We shape our tools and then our tools shape us. (John Culkin, 1967)

Listen to the AGI podcast episode on:
Spotify | iTunes | Google Play | Stitcher | SingularityNET

Introduction

Since Prometheus’ gift of fire to humankind, humans have used tools to adapt to their environment and, ultimately, to adapt the environment to themselves. Yet from contract law to media to the roads we build, human beings have also always been shaped by their own tools. Ubiquitous tools tend to bring foreseeable and unforeseeable consequences for the way people develop, learn, interact, and build relationships. This is a rather obvious observation, but an important one for contextualising how modern digital networked tools have affected people in the information age.

In this month’s AGI podcast, we were honoured to host and converse with Professor Brett Frischmann, who recently wrote, along with his colleague Professor Evan Selinger, the book Re-Engineering Humanity. Much of the podcast’s discussion touches on subjects that the book covers in depth, and with a refreshing level of optimism despite the harsh reality it unveils.

The guest, Brett Frischmann, is the Charles Widger Endowed University Professor in Law, Business and Economics at Villanova University. He is also an Affiliate Scholar of the Center for Internet and Society at Stanford Law School and a Trustee for the Nexa Center for Internet & Society in Torino, Italy. More importantly, Prof. Frischmann has conducted extensive research on knowledge commons, the social value of shared resources, and the techno-social engineering of humans (the relationships between the techno-social world and humanity). These subjects have long been core to the vision and work of SingularityNET, and it was an exciting opportunity to discuss them with such a knowledgeable guest.

Unveiling discrete but powerful influence

“Like the proverbial frogs in slowly warming water, we’re gradually being led to accept a world governed by supposedly smart tech,” writes Brett Frischmann.

It is increasingly difficult to distinguish between our genuine desires and those pushed onto us by the recommender systems embedded in many of the online platforms we interact with. That is a problem when we know that many of these systems can be fed unrepresentative data, can reproduce the biases of their developers, can involve a degree of learning that takes them beyond the rules set by a human operator, and, alas, do not even allow us to understand why some decisions were made. Faced with these facts, we ask: how free are we really when using digital technology?

While it can be refreshing to leave digital technology aside from time to time, this should not be our sole solution to limiting the subliminal effects of tech on our minds.

The world we are building is essentially driven by a sense of “if we have enough data we are going to optimize for efficiency and productivity and everyone will be happy”. The reality is that these are adequate goals for machines, not humans.

Take contract law, for example. In theory, explained Prof. Frischmann in an interview for The Economist, “contract law enables and ought to enable people, first, to exercise their will freely in pursuit of their own ends and, second, to relate to others freely in pursuit of cooperative ends. In practice, electronic contracting threatens autonomy and undermines the development of meaningful relationships built on trust. Optimised to minimise transaction costs, maximise efficiency, minimise deliberation, and engineer complacency, the electronic contracting architecture nudges people to click a button and behave like simple stimulus-response machines.”

Testing our programmability

As many of you will know, the Turing Test was designed by Alan Turing in 1950 as a rudimentary way of determining whether or not a computer is “intelligent”. During our discussion, Prof. Frischmann introduced us to the Reverse Turing Test: a human-focused Turing Test.

Credit: Nicholas Carr

How do you identify or evaluate when technology is dehumanising people? There are no good tools at the moment that evaluate, let alone quantify, this. But “what if we use simple machines (predictable, explainable) as a baseline against which to identify remarkable humans?” asks Prof. Frischmann.

For a human being in a heavily engineered environment like Facebook, you need to be able to determine when they are being engineered to behave like a simple machine. In an attempt to do so, the two authors of Re-Engineering Humanity started by testing “commonsensical thinking”. Can a human be mistaken for a machine when it comes to a series of commonsensical tasks? “That would raise a flag. You would want to know… has this human being lost something meaningful, like their commonsensical capacity, because of the environment that they are in?” While the experiments have been conceptualised, the two researchers are still looking to apply them in the right environment and are seeking collaborators. Ultimately, four thinking capacities will be examined in their research: mathematical computation, random number generation, rationality/irrationality, and common sense.
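To make the idea concrete, here is a minimal, hypothetical sketch (in Python, and not the authors’ actual protocol) of one such probe: the random-number-generation capacity. It compares a person’s attempt at producing random digits with a simple machine baseline using two crude statistics; every name and threshold in it is an illustrative assumption.

```python
# Illustrative sketch of a "reverse Turing" probe on random number generation.
# Assumption: the human sequence would come from a participant asked to type
# "random" digits; here it is a made-up, patterned example.
import math
import random
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy of the sequence, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def repeat_rate(seq):
    """Fraction of adjacent positions holding the same symbol; people tend
    to avoid immediate repeats, uniform generators do not."""
    return sum(a == b for a, b in zip(seq, seq[1:])) / max(len(seq) - 1, 1)

def describe(label, seq):
    print(f"{label:>8}: entropy={shannon_entropy(seq):.2f} bits, "
          f"repeat_rate={repeat_rate(seq):.2f}")

if __name__ == "__main__":
    human = [3, 7, 1, 9, 2, 8, 4, 6, 3, 7, 1, 9, 2, 8, 4, 6]  # patterned attempt
    machine = [random.randrange(10) for _ in range(len(human))]
    describe("human", human)
    describe("machine", machine)
```

A patterned human attempt will show markedly lower entropy and a telling repeat structure compared with the generator’s output; a gap like that, or its absence, is the kind of signal a reverse Turing probe would look for.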

Commons, Impartiality, Transparency

For humans to sustain their freedoms, other rules need to become the norm in the way we design our digital systems. On that point, our guest echoed the message of SingularityNET: we might want to design AI algorithms focused on human flourishing and similar benevolent goals, for example, and to disrupt the business model of some large centralized corporations.

Just imagine altering the economic incentives of online social media platforms so as to promote human flourishing. Imagine, says Brett Frischmann, a BBC-like public social media platform, credibly assuring people that it does not need to sell any of its users’ personal data, and whose goal would be to provide a knowledge commons on top of the network: an independent, commons-based system for social media. Naturally, many issues could arise from such a model, but the point here is that we need to think in a radically different way about how to continue using all those digital tools (which are great and which we love) while making sure we strip them of the countless private incentives that make them counterproductive to human flourishing.

Typology of Infrastructure Resources (Frischmann 2007).

We have to go back and look at society to find non-profit, independent, public-interest organisations such as libraries, and perhaps reproduce that model more effectively in the digital space, especially for what are now critical communication tools like social media platforms. A decentralized platform might be an ideal set-up to ensure impartiality and public interest. For those who would like to dive deeper into the topic, Kevin Werbach’s latest book The Blockchain and the New Architecture of Trust is a good place to start.

Classification of Goods Based on Degree of Rivalrousness (Frischmann 2005).

Transparency is urgently needed: in how systems are built, in which datasets are used to represent a given population, in what generalizations engineers make about a given group, in increasing the explainability of learning algorithms’ outputs, and even in publicizing the development discussions between the engineers of AI algorithms. But as our guest explained, it may not be enough.

It is important to note, as Prof. Frischmann reminds us, that it is all too easy to point the finger at certain big companies when thinking about these issues. The problem is widespread and normalized. Thousands of small companies focus on siphoning personal data and creating systems that are convenient for their business rather than for their users’ long-term sake. Unfortunately, the millions of people who use these services also allow this wider system to prevail. There is a normative stronghold that needs to be broken apart.

The parallel with climate change is striking. A tragedy of the commons is seen in both. It is easy to blame big fossil fuel companies, but we “bear the same responsibility”. To start resolving the issue “economically or politically, you have to do massive structural changes; intervene at multiple scales: macro, meso, micro. And adjust how individuals live their lives from day to day.” Similarly, in the digital age, we make countless decisions that seem rational and unproblematic on their own terms but that, in reality, result in outsourcing our thinking and affect us in unjustifiable ways in the long run. To paraphrase Prof. Frischmann, climate change threatens our planet just as techno-social engineering threatens our humanity.

To conclude, as Nicholas Carr reminds us in the foreword of Re-Engineering Humanity: “Technological momentum, as the historian Thomas Hughes called it, is a powerful force. It can pull us along mindlessly in its slipstream. Countering that force is possible, but it requires a conscious acceptance of responsibility for how technologies are designed and used. If we don’t accept that responsibility, we risk becoming a means to others’ ends.”

But don’t despair: “nothing is inevitable… besides entropy” (Frischmann).

How can you get involved?

To listen to other episodes of the AGI Podcast, please click here. SingularityNET has a passionate and talented community which you can connect with by visiting our Community Forum. Feel free to say hello by introducing yourself here.

Exchange ideas, chat with others and share your knowledge with our community members on our forum. We have now launched the #AGICHAT, and we invite you to participate in our themed discussions. Read more about #AGICHAT here.

We are proud of our developers and researchers who are actively publishing their research for the benefit of the community; you can read the research here.

For any additional information, please refer to our new roadmap and subscribe to our newsletter to stay informed about all of our developments.
