Overgrowth of the Digital Economy (and what to do about it)

Emaline Friedman
Published in HOLO · 8 min read · Nov 17, 2017

Working in tech is emotionally complex. I frequently find myself wondering at what point a “digital tool” moves from ethical neutrality to deliberately aiding the pursuit of particular social and political visions. Each time this arises in my mind, I return to the same point again and again. Perhaps I have a new mantra: It’s not about the tool. It’s about us.

Agency, in context, is everything.

The digital economy grows at a barreling speed, not least because it’s easy for our on-screen reality to seem both endlessly plentiful and eerily unrelated to the material and ecological realities of today’s world. Taken together, the five major tech companies grossed an unfathomable $950 billion in just the last ten months. For Google and Facebook, most profits come from ad revenue, whereas Amazon’s come from infrastructure and data-analytics services. The economic model of these global behemoths is so successful, in part, because it appeals to enterprises’ need to know, organize, and coordinate…us! The lesson is that the information is there, it is valuable (knowledge is power, right?), and if we don’t coordinate ourselves, some other entity will.

Stop taking the fun out of my Apple watch!

We’re used to hearing about how user interface design affects us — after all, the point is to keep us on the page, to seduce us, to provide us a tidy little space where we feel invited to browse and click whimsically. Behind the scenes, too, privately owned algorithms sort us into categories like “expectant mother”, “at-risk youth”, “potential violent offender”, “occasional smoker”, “eco-activist”, “bipolar over-spender”, etc. All this is to say that front- and back-end design governs more than just individual behavior — it creates imagined communities for us, and proceeds to make determinations based on those imaginary groups. It is really a whole system of prejudicial, non-contextualized assumptions.

Why would we want to program our greatest faults, our unconscious biases (usually racist, sexist, classist…), into our computing systems? My guess is that it’s easier to control people when their darkest sources of shame are not just known, but preemptively taken for granted. Instead of having breathing room to think, grow, and adapt, we get enticed into repeating mistakes. We rightly perceive that we are set up to pursue old patterns. These powerful actors program into their equipment a brittle, static understanding of human beings, one that presumes and thus encourages linear development on the terms envisioned by the system’s programmers (and constrained by its profit imperative!). In brief, the more data the better, even if it’s inaccurate, unjust, or unwarranted.

I’m not saying these algorithms aren’t intelligent — of course they are. What I am saying is that they are incapable of wisdom. The wisdom to know when to give space. These algorithms are not designed to create membranes that encourage our creativity, to tell data brokers to wait for deeper understanding, to solicit a more nuanced account of something or other, or to advocate with and for us. So they’re not just incapable of wisdom, but of meaningful collaboration. We tell them everything, and they tell us nothing. Why have we resigned ourselves to such highly asymmetrical relationships?

Struggle Becomes Opportunity

Is it possible that ethical tech can pose a meaningful mode of resistance to this unsolicited governance “from above”? People don’t typically change their habits out of raw disdain for whatever’s going on, even if it’s awful. To take an extreme and most difficult example: drug addiction. A little-known fact in the recovery world is that what researchers call “spontaneous recovery” is the most statistically reliable form of recovery. What could top fancy retreat rehabs, you ask? “Spontaneous recovery” is just researchers’ code for “other good stuff happened in their life; the user kinda just started doing something else, and because it wasn’t a formally sanctioned treatment, we couldn’t really study what it was”.

As it turns out, there’s no formula for change. Only the belief in alternatives, as demonstrated in action. In simply doing otherwise. In recognizing that the temptation of the “old way” persists, but that agency to decide, in context, is far sweeter. They are two worlds that co-exist. It’s not as if we can turn back the clock and pretend like we never had such powerful means to coordinate via massive amounts of information. The key is to do so anew, without the commodification of cultural practices and one-sided relationships. Can we affirm a form of data ownership that’s not defined by property relations, but by engendered responsibility? Might that be a data commons?

Michel Bauwens and Jose Ramos write that…

“commons-based peer production creates new transnational forms…global productive communities and global generative market coalitions”.

The true love/hate messiness of social media, service platforms, and other web apps must be addressed honestly and in full if we are to carry forward the unequivocal benefits they afford and resist the equally undeniable violations. This is the perplexing nexus where convenience, safety, and community meet labor exploitation, surveillance, and the private sector’s subtle info-war launched against us all (sometimes called “neoliberalism”).

One source of inspiration for the engendered responsibility I’m gesturing toward could be the fact that we already contribute with overwhelming honesty, judiciousness, and care. We already take the time to help our Airbnb guests beyond what the app requires of us. We already write down our door’s entry code on a piece of paper for them, suddenly “forgetting” the supposed security needs that the company’s server fulfills, in favor of doing what makes sense between people.

What if these services were managed and developed by participants themselves? Could these services truly merit the name “sharing economy” by increasing our expressive capacity to be heard and valued (by oneself and others)? Increase our sensory capacity to be aware and accepting of always-changing conditions? Bring to fruition the mutual aid that can grow from these capacities? I think the answer is a clear yes.

Yes this IS a real, meaningful way of being together.

Tech for a Data Commons?

In other words, we’re part of the way to a data commons. We’ve got the wherewithal and the drive for community. So where’s the technology? It’s already here, really. Any application can be designed to be “distributed”, and when you think about it, all applications can be their own, purpose-driven data commons. Without proprietary software directing the free contributions of users (and certainly not sharing them with the whole!), distributed apps can be an ecology of online spaces that together operate as a true commons.

Through architecture designed to distribute the power of ownership, management, and implementation across users, the apps are held by users themselves. An app is distributed because everyone who uses it holds a copy of its code (its constitution, so to speak) and the data they generate by electing to follow those rules. We have a say in the terms of our collaborations. We can help guide an app’s development to fit emergent needs! Adding security and validation through peer-to-peer methods, we get the following conditions for apps-as-commons:

(1) holding of a copy of application

(2) options for varying degrees of participation in community governance, starting with the above — simply electing to try out its coded rules to see whether or not it fits

(3) tried, and thus informed, choice of which data-coordination community(ies) we’re involved in, including the app in question and whatever other communities to which we decide to bring data generated elsewhere

(4) public validation of data that lets us rest assured that our input is taken into account
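The four conditions above can be sketched in a few lines of Python. This is a toy illustration under loose assumptions, not Holo’s actual implementation: `app_rules`, `Participant`, and `peer_validate` are invented names, and a real distributed app would add signatures, networking, and richer governance.

```python
# Toy sketch of "apps-as-commons": every participant holds a copy of the
# app's rules, keeps their own data chain, and peers validate entries
# against their own copies of those shared rules.
import hashlib
import json

def app_rules(entry: dict) -> bool:
    """The app's 'constitution': entries must name an author and stay short."""
    return bool(entry.get("author")) and len(entry.get("text", "")) <= 280

class Participant:
    def __init__(self, name: str):
        self.name = name
        self.rules = app_rules   # condition (1): their own copy of the app's rules
        self.chain = []          # condition (3): each user holds their own data

    def publish(self, text: str) -> dict:
        entry = {"author": self.name, "text": text}
        # hash-link the entry to this participant's local chain
        prev = self.chain[-1]["hash"] if self.chain else ""
        entry["hash"] = hashlib.sha256(
            (prev + json.dumps(entry, sort_keys=True)).encode()
        ).hexdigest()
        self.chain.append(entry)
        return entry

def peer_validate(entry: dict, peers: list) -> bool:
    # condition (4): each peer checks the entry against its own copy of the rules
    return all(p.rules(entry) for p in peers)

alice, bob, carol = Participant("alice"), Participant("bob"), Participant("carol")
post = alice.publish("hello, commons")
assert peer_validate(post, [bob, carol])                          # accepted
assert not peer_validate({"author": "", "text": "spam"}, [bob, carol])  # rejected
```

Electing to run the rules at all is condition (2): trying out the coded constitution is itself the entry-level form of participation in governance.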

Blockchain Won’t Cut It

The mantra of today’s crypto-crazed tech world is to “decentralize everything!”. Many thinkers agree that the problems of corporate data-hoarding, web services designed to be addictive, fake news, etc. are due, in large part, to the centralized architecture of most uses of the web: large, inaccessible servers host the transmission of data between people.

If you’ve heard of “blockchain” technology and the cryptocurrencies it supports, you may already be on the decentralization train. Such tools (for communication, accounting, and coordination) are not held by default through a central agency, be it that of restrictive governments, private companies, or whoever else. Great. That’s a sort of obvious solution, akin, perhaps, to a teenager who wants to run away from home to get mom and dad off their back. It may feel like the problem is solved, but distance from where power lives does not entail any meaningful empowerment that ensures what comes next won’t be even more pernicious.

We can’t just get far away from power, we have to be willing to take it, nonviolently, ourselves. And all the responsibility and accountability that comes with it. We could arrive at the same conclusion just by taking the words “decentralization” and “distribution” at face value. If we seek a less exploitative net, or greater possibilities for collaborating and coordinating, we can’t just decentralize the power I’ve written about. We have to distribute it, too.

So while decentralized networks may very well be an evolutionary outgrowth of the way our society currently structures its data economy, this “web 3.0” by no means solves the problems of web 2.0. The promises of blockchain-related technologies do not change the fundamental assumptions guiding our bad habits online. We simply cannot expect to automate away negotiating, sharing, and governing without creating wealth disparities and shifting focus away from fragile human needs best expressed through the “slow”, co-constructive process of being together.

We Invite you…

Come and join an online presentation and discussion on ethical technology. The theme is “Decentralization and You”, and it will take place on December 1st, 2017 at 1pm PST. Spaces are limited, so save your seat!

During our time together you will be able to ground your understanding of how decentralization, cryptocurrencies, and distributed technologies can help us grow the commons, facilitate collaboration, and create true sharing economies!

Our format will be a fun mix of presentation and spaces for participation, discussion and sharing your ideas, experiences and projects.

Learn with renowned technologist Arthur Brock, regenerative-economies designer Ferananda Ibarra, and yours truly, Emaline Friedman.

This webinar is sponsored by Holo

Emaline Friedman is an autonomist Marxist, psychoanalytic therapist, and comms director @neighbour_hoods.