“Culture of algorithms” merges the concept of relevance with the concept of personal gain. An algorithm does have the power to shape the culture.

Giuseppe Granieri
Published in Futurists’ Views
Dec 19, 2014 · 8 min read

--

A Conversation with Federico Badaloni

(#TheMakingOfaNewBook. Soon in Italian, here)

How are technology and culture shaping each other?

Culture has more to do with the ability to identify needs, while the application of technology has more to do with the ability to find solutions.

Innovation often relies on the metaphors one culture is able to imagine.

Any metaphor is a “what if?” that leads to a better solution. For this reason, technology typically grows within cultures that value errors as well as trials.

This means that technology can’t advance in cultures that celebrate the “magical” powers attributed to people who succeed without trying. Italian culture is among them, and I’m sure that the road to better technology starts with celebrating those who achieve success by working through the shame of their own failures.

Although metaphors are important for improvement, they can be a burden when it comes to finding practical applications for technology. A culture can be defined as a particular set of solutions that have been paired with a particular set of needs. This is true for laws, habits, beliefs, and also for technology. However, innovation can only happen when solutions and needs are clearly distinguished from one another. Take Henry Ford’s famous line: “If I had asked people what they wanted, they would have said faster horses”. That answer was not caused by a lack of imagination among people in the early 1900s; the reason is that Ford asked the wrong question. In order to understand the value of cylinders and pistons rather than horses, you first have to be asked about your needs, not about solutions. In other words, you should be able to define needs in an abstract way, apart from the solutions you are used to employing.

As culture has more to do with the ability to identify needs, while the application of technology has more to do with the ability to find solutions, the main problem for innovators is to frame the culture before shaping the technology. Often, innovators are isolated people, because they couple needs and solutions in a world where most people tend to focus on solutions.

How do you define the “culture of algorithms”?

“Culture of algorithms” merges the concept of relevance with the concept of personal gain.

An algorithm is a procedure that guarantees a solution to a problem, or to a homogeneous class of problems, in a given time. Hence, an algorithm is a technology in and of itself. It’s a cultural product, a need-solution pair.

However, an algorithm does have the power to shape culture. This happens when it oversteps itself by making its own effects go viral. One such algorithm is the one that distributes content to people on Facebook. It has an effect on the culture because of the self-fulfilling power of the platform.

People sharing time, thoughts and emotions on Facebook don’t doubt that this is done for real. According to the well-known theorem by W. I. Thomas and D. S. Thomas: “if men define situations as real, they are real in their consequences”. For this reason, an algorithm that selects and sorts what we see on our Facebook page has a deep influence on our behaviour, as it shapes our relationships with others. The influence of algorithms on culture — through social behaviour — strongly calls for an ethics of coding.

Let me come full circle: as we said, innovators frame the culture by shaping technology, but technology can shape culture in return, via its algorithms.

Merging the concept of relevance with the concept of personal gain, the “culture of algorithms” clashes with the “culture of humans”, where relevance and social advantages are the same thing. “Man is by nature a social animal”.

As it is, the most powerful algorithms (that is to say Google’s and Facebook’s) are good at making immediately available what we want to know. The real challenge of the “culture of humans” against the “culture of algorithms” lies in the ability to make available what we need to know as individuals, so that we can be aware of the social consequences of our actions.
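To make the distinction concrete, here is a deliberately naive sketch in Python. It is an illustration only: the “engagement” and “civic_value” signals are invented for this example and have nothing to do with how Google or Facebook actually rank content. It simply shows the same pool of items ordered by a “what we want to know” score versus a hypothetical “what we need to know” score.

```python
# Toy illustration only, not any real platform's ranking.
# Both scoring fields ("engagement" and "civic_value") are hypothetical
# signals invented for this sketch.

items = [
    {"title": "Friend's vacation photos",  "engagement": 0.90, "civic_value": 0.10},
    {"title": "Local council budget vote", "engagement": 0.20, "civic_value": 0.80},
    {"title": "Viral outrage post",         "engagement": 0.95, "civic_value": 0.05},
]

def want_to_know(item):
    # "Culture of algorithms": relevance collapses into predicted engagement,
    # which is also what the platform gains from.
    return item["engagement"]

def need_to_know(item):
    # "Culture of humans": relevance weighted toward social awareness,
    # a signal that no deployed feed algorithm actually measures.
    return 0.3 * item["engagement"] + 0.7 * item["civic_value"]

print([i["title"] for i in sorted(items, key=want_to_know, reverse=True)])
print([i["title"] for i in sorted(items, key=need_to_know, reverse=True)])
```

The two orderings put different items first, which is the whole point of the “culture of humans” versus “culture of algorithms” contrast: the difference lies entirely in what the scoring function is asked to optimize.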

Robots, artificial intelligence, algorithms: what should we expect in the near future?

The looming risk is that we will be unarmed before an Internet made of big frameworks of algorithmic intelligence: systems able to feed and extend individual gains to the detriment of social ones. This risk is deeply interconnected with the way the Internet redefines space and time. First we redefined our idea of space — think of the huge change produced by social media on the concept of “proximity”, or of what wearable technology is doing to the perception of our own body. Then the concept of time was deeply revolutionized, too. The “past” as a socially-transmitted individual memory has been redefined by the Internet into the notion of “what is available”, where availability is one of the forms of presence.

The way we value time was redefined, too, involving a change in the balance of power in the advertising market. Traditionally, we sit in front of the television waiting for our show to come on. The value of that particular show is in the wait, because it’s a shared — social — time. It aggregates people waiting for the same show, so that attention and audience happen at the same time. Buyers place their advertisements when our attention is at its highest, purchasing a portion of our shared time. This is the way television works.

On the Internet, if we are interested in a particular piece of content, we just need to know where it is. The availability of content for everyone — as if in an eternal present — shifts the question from a social dimension to an individual one. The Internet changed waiting into searching. And this makes Google happy (and Facebook too, in a little while).

The risk of a self-centered Internet is balanced by its capacity to connect people. As Clay Shirky wrote, on the one hand the Internet gives us hours of free time that were inconceivable just a few years ago; on the other hand, it offers the possibility to feed and reinvest the “cognitive surplus” that we gain with this free time. Nowadays this cycle is the main engine of the evolution of our society.

I don’t know what the future holds, but I’m sure that if we want this evolution to be driven by the “culture of humans” rather than by the “culture of algorithms”, we’ll constantly need to nurture the Internet’s interoperability and its grounding in shared standards.

What do you think are the forces and trends that are driving the change?

One of the most powerful forces is the gratification that we feel when we find something. This is the reason why we first appreciated search engines, then the social networks’ “home delivery” of content. It’s the same force that now encourages us to put sensors inside objects and on our bodies, or to store huge amounts of data.

It is no longer possible to connect the multitude of information and to examine the amount of stored data without having appropriate “architectures of understanding”. This is the new frontier where the Information Architects are expending all their energy. “Where architects use forms and spaces to design environments for inhabitation, information architects use nodes and links to create environments for understanding”, writes Jorge Arango.

The direction that the change will take depends on our ability to design these architectures. More and more, change will be generated by designing systems able to react to external stimuli and produce new “cultural objects”. Andy Fitzgerald writes to this end: “forging narrative paths through nonlinear feedback systems will likely rely on proposing and testing hypotheses based on best-case heuristics as we understand them at the moment. Each solution will better inform the problem, which will in turn propose a more refined solution”. Traditional design processes that figure out new cultural objects and create systems around them will be increasingly worthless.

How do you envision the future of the Cultural Industry?

The Cultural Industry will have a chance if it is able to overcome three challenges:

1. The challenge of designing relationships.

At the very heart of culture there is the activity of structuring relationships.

Information makes sense when we understand the relationships in it, both at the micro level (as in the notes that form a chord) and at the macro level (as in the way we recognize a song emerging from a sequence of chords). The Cultural Industry used to express the “meaning” of information as the body of space-time relationships that its single elements build with one another (as in a newspaper page or in a news broadcast schedule). Now, in this new “intertwingled” environment, the Cultural Industry has to learn how to express the “meaning” of information not through space or time but through the intensity of the links between elements.

2. The challenge of designing meta-information.

In the digital era, technology no longer shapes the structure of communication. The structural characteristics are part of the communication itself. Information has freed itself from the media in which it was once forced to “happen”. Information is “born digital” and then takes on different appearances depending on the different spaces and times of use on our devices. This polymorphism of information must be designed with a communicative aim. Otherwise, the uncertainty of the forms through which information manifests itself will inevitably cause a continuous shift of meaning.

If the Cultural Industry is able to identify and describe the structural elements of information, the various systems will be able to properly recognize, extract and represent that information. The medium is no longer the message. The message comes with its own wings; it no longer needs other means to fly.

Having control of the medium no longer means having control of the message. Control is held through meta-data. The core of the digital revolution is in this shift, because the collapse of the medium onto the message is redefining both the production process and the business.

3. The challenge of relevance.

What if your “newspaper” had the ability to understand which information you need most, how much time you have for it and the way you prefer to receive it, and could create the cluster of information that best fits you?

In this scenario the difficult part is not understanding who you are, what context surrounds you, or how to adapt the information to your devices. The hard task, as I said before, is to understand what you really need to know. Algorithms are not good at this, but humans are. No algorithm or machine will ever be able to reveal what lies beyond itself. We need humans to understand other humans’ needs. We need humans to find and shape solutions that fulfill those needs in a way that enhances human society.

Follow Federico Badaloni on Twitter: @fedebadaloni
Follow me on Twitter: @gg

Special thanks to @striphas.
Read:
Algorithmic culture. “Culture now has two audiences: people and machines” | A conversation with Ted Striphas

