A teenager in 1970s Calcutta
Getting and spending, we lay waste our powers. I must have been thirteen or fourteen when I first read those lines, and they intrigued me greatly. The Calcutta of 1971 was a Calcutta where materialism did not sit well, in an India where materialism did not sit well. I’d never left Indian shores before then, and all I knew of the world beyond was that which could be absorbed from books and magazines and radio and the occasional hippie-trail traveller. We’d begun to hear about climate change: those days the talk was of the impending Ice Age. We’d begun to hear about things ecological: articles about Rachel Carson and Silent Spring had been around for a while; concerns about DDT and thalidomide were entering our consciousness; the Beatles had met the Maharishi; Indian religious beliefs and practices, dating from pre-Christian times, had somehow earned the sobriquet “New Age”; smoking was de rigueur (though the substance involved wasn’t always tobacco); and anti-war sentiment, interleaved with anti-nuclear feelings, pervaded the music of the time, or at least that segment I used to listen to.
Immersed in that zeitgeist, I found Wordsworth’s poem fascinating. Of particular interest to the early teenage me was the notion that the act of “acquiring” something could somehow become a source of weakness. It made sense in an I-Am-A-Rock way. And a rock feels no pain. And an island never cries.
The Calcutta I grew up in hadn’t really gotten over not being the capital of India, an event that took place six decades earlier; it hadn’t really gotten over the impact of the terrible famine of 1943 (3–4 million dead out of a population of around 60 million), exacerbated by Churchill’s decision to divert most of the local food supplies to the war effort; it hadn’t really gotten over the decision by the British Government to “partition” India, and Bengal with it, as part of independence. [Earlier attempts to partition the state along religious lines had failed but were then resuscitated over three decades later].
When you add to all this two facts — that I spent my childhood and youth studying in a Jesuit school and college, and that the city and state I grew up in were ones whose legislatures had significant proportions of democratically elected communist party members — you get an idea of what growing up in Calcutta at the time meant.
Seminal events: 1971, 1975, 1977
The whole city had a stoic stiff-upper-lip-ness about it as a result of all this, and I was no exception. There was nothing paranoid or fatalistic about it; instead, there was a steely resolve to push back against any form of authoritarian central government, with gritted teeth, leavened with a strong sense of humour and a mustn’t-grumble mindset.
Then came the 1971 war with Pakistan, a battle that led to the formation of Bangladesh. Millions of refugees crossed the border into India, mainly into West Bengal, and a lot of them found their way into Calcutta, perhaps as many as half of the eight or nine million who escaped the war zone.
The Calcuttan’s response? Joi Bangla. [In characteristic dark humour, the conjunctivitis outbreak that came with the influx of refugees was named after the independence slogan of the emergent nation]. Many of the refugees returned to Bangladesh after independence, some stayed, all were welcomed.
The running battles between the Congress, the CPI-M(L) and the Naxalites that had punctuated the previous decade of strikes (the “hartals”) and the imposition of “President’s Rule” in Bengal then culminated in the dark days of Mrs Gandhi’s Emergency. Heavy-handed central rule was the one thing that, given its history, Calcutta could not stomach, and so it really felt the pain.
Twenty-one months later, democracy showed up. The Opposition, jailed throughout the Emergency, was released and asked to participate in a snap General Election. Assumption? The only party that had ever won any election in India would win again, especially since the Opposition had been otherwise occupied in the lead-up to the vote. The Opposition cries Fix. But their cries are in vain. Assumption proves wrong. They win. Democracy at a scale never seen before. Oh frabjous day. Hope springs.
Green shoots: The Internet, the Grateful Dead, the WELL, the Web
My late adolescence and early adulthood continued in this vein of unalloyed hope, as tools of the “architecture of participation” emerged apace. I may have been born too late to be a true pinko-lefty-tree-hugger hippie, but it didn’t stop me trying. I could never shake off my deep-seated belief that true freedom came from access to education, access to information, access to the ability to deal with information critically.
Wherever I looked I saw hope. Jerry Garcia’s “Once we’re done with it, it’s theirs”; the taping rows at Dead concerts; Stewart Brand, the Whole Earth Catalog, the WELL and its YOYOW; Tim Berners-Lee, the Web and 404; Howard Rheingold, virtual communities, crap detection; Steven Johnson and emergence; the rise of maker culture; Sugata Mitra and minimally invasive education; MOOCs and EdX and Udacity and Coursera and Khan Academy and and and. The Cluetrain Manifesto. Wherever I looked I saw hope.
Unfettered affordable global access to education, to information, to the ability to engage with information critically. Wherever I looked I saw hope.
Fast forward to today
Globalisation, a flattening world, changes in public policy, Moore’s Law, the continuing adventures of digital infrastructure rollout, the Big Shift. Threats to historical business models, to-be-expected immune system responses. It’s a joke. It doesn’t work. It’s evil and UnAmerican. And (preferably after it’s been rendered toothless) it’s the Right Thing. First they ignore you, then they laugh at you, then they fight you. And then you win.
Except that sometimes it’s And Then They Win. It almost happened with open source. It hasn’t quite happened with cloud, but it still might. It looks like it might happen with smart devices, as the tight control of the App world prepares the ground for digital-access-only constructions.
Globalisation is not alone in having discontents. Connectedness and cyberspace have similar critics. Governments and politicians the world over want snooper’s charters and kill switches and draconian rules over all this, walling their gardens physically and digitally. Perhaps George Lakoff is right, and there’s a strict-father versus nurturing-parent battle of mindsets and worldviews going on.
What keeps me up nights
Those who know me would also know I tend to stay positive whatever my circumstances, and that positive outlook frames my view of humanity and of life. And yet. As I grow older, with every passing day I learn how little I know. Guns versus butter. Sugar versus fat. The energy and water costs of eating meat. Can analysis be worthwhile? Is the theatre really dead? My Baconian doubt-and-certainty instinct strengthens. My interest in evonomics increases.
A digital world can open up many possibilities, can enfranchise billions, can level playing fields and democratise access.
Not necessarily will.
When I was young my grandfather said that mine could be the generation of peak longevity, as he saw the decline in habits to do with food, exercise and rest. As I grew older, I learnt something about the challenges the pharmaceutical industry faced, in terms of business models and patents, and how that could affect the efficacy of their product. I learnt a little bit about the death of cities at the same time that everyone was moving to the city. I learnt something about code and the laws of cyberspace and the challenges that world represented. I spent time reading, amongst others, Jane Jacobs, Christopher Alexander, David Agus, Geoffrey West, Larry Lessig, Bruce Schneier and Cory Doctorow.
Over sixteen years ago, Larry Lessig said:
The code of cyberspace is changing. And as this code changes, the character of cyberspace will change as well. Cyberspace will change from a place that protects anonymity, free speech, and individual control, to a place that makes anonymity harder, speech less free, and individual control the province of individual experts only.
Anonymity harder. Speech less free. Individual control the province of individual experts only.
Some years ago I was taken ill while in San Francisco, life-threatening enough for my wife and family to be alerted and asked to fly to join me. I managed to convince my wife that only she should travel. She found herself unable to board the plane: her visa had run out. Stressful. But I had my iPad, the hospital had wireless, and soon she had a valid ESTA.
Sometime after that, after a hectic bout of business travel, I found myself in a bit of a tight spot. While in the immigration queue at a European airport, I realised I’d left the passport that had my Schengen visa on my desk at home. I have only one current passport, but hold valid visas in “expired” passports where I’ve run out of stampable pages. I’d forgotten to bring the particular booklet I needed.
When I got to the counter and explained what had happened, things looked bleak. No visa no entry. I tried everything I could think of. Could a record of my Schengen be found if I told them where and when it had been issued? Nope, not unless it was issued by the country I was visiting (it wasn’t). Could they let me in on the good-behaviour basis of the number of entry and exit stamps for that country in my passport? Nope. What if I called home, got someone to take a photo of the visa in the passport I’d left behind, and had that emailed to me and made visible on my phone? <grinding gears of thought>. Maybe. But ask her to send it to this email address, not yours. They let me in.
Human being and not algorithm.
Another time, I was due to go on holiday with my wife and my youngest child. A short break, Paris in the springtime. They were at the airport. Waiting for me, wondering where I was. I was crawling along the M4, still miles from the airport. I’d checked in already, hand baggage only, while they had hold baggage, so I asked them to proceed airside and to wait for me. It looked touch-and-go as to whether I would make it on time, so much so that I told them to go without me if needed, I would follow. And then I got a break: the flight was delayed by more than an hour. I was home free. Or so I thought. I get there, saunter up to the security gates, flash my on-phone boarding card. Denied access. Why? Because I was no longer within 40 minutes of the scheduled take-off time of the flight, no matter that I was nearly a hundred minutes ahead of the current departure time.
Sorry, the system says you can’t. Take me to your leader. Sorry, the system says you can’t. At some point I was in front of a human being who was empowered to say something other than sorry the system says you can’t. And I made the flight.
Having systematic rules and algorithms is in itself not a bad thing. The bad thing is when there is no manual override, no escalation path, no ability to deal with exceptions. Putting some friction into being able to get to the override or exceptional process is reasonable. But there must be one.
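The principle can be sketched in a few lines of code. This is a toy illustration with hypothetical names, not any real airline system: the automated rule decides the common case, but a denial is never terminal — it escalates to a human who can weigh context the rule cannot.

```python
# A toy sketch (hypothetical names) of an automated rule that keeps
# a human escalation path instead of issuing a final "no".

def within_cutoff(minutes_to_scheduled, cutoff=40):
    """The automated rule: allow only inside the cutoff window."""
    return minutes_to_scheduled >= cutoff

def decide(minutes_to_scheduled, supervisor_approves=None):
    # The algorithm handles the common case...
    if within_cutoff(minutes_to_scheduled):
        return "allowed"
    # ...but a rule-based denial only escalates; it is never final.
    # A human can weigh context the rule cannot see, such as a
    # flight delayed by more than an hour.
    if supervisor_approves is None:
        return "escalate to human"
    return "allowed" if supervisor_approves else "denied"

print(decide(100))        # rule satisfied: allowed
print(decide(30))         # rule says no, so: escalate to human
print(decide(30, True))   # human override: allowed
```

The point is the shape of `decide`: the override path costs a little friction (a supervisor must be found), but it exists.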
As more and more people get connected, and as the quantum of information “exhaust” grows, the need for filters increases. As Clay Shirky said, there’s no such thing as information overload, only filter failure. So it’s reasonable to design filters. What’s problematic is when the filters are publisher-side and not subscriber-side. That allows bad actors to become strong censors. A bad thing, as people like Eli Pariser reminded us.
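The distinction between publisher-side and subscriber-side filters can be made concrete with a small sketch. This is a hypothetical illustration, not any real feed API: the feed publishes everything, and each subscriber supplies their own filter, so no intermediary can quietly censor the stream for everyone.

```python
# A toy illustration (hypothetical data) of subscriber-side filtering:
# the feed itself is never pruned; each reader applies their own filter.

feed = [
    {"topic": "music", "text": "New Dead tape in circulation"},
    {"topic": "politics", "text": "Snooper's charter debated"},
    {"topic": "music", "text": "Tape-trading etiquette"},
]

def subscribe(feed, predicate):
    """Each subscriber chooses a predicate; the upstream feed is untouched."""
    return [item for item in feed if predicate(item)]

music_only = subscribe(feed, lambda item: item["topic"] == "music")
everything = subscribe(feed, lambda item: True)

print(len(music_only))   # 2: this reader's choice
print(len(everything))   # 3: the full feed stays available to all
```

A publisher-side filter would instead delete items from `feed` before anyone saw them — the same mechanism, placed where it becomes a censor rather than a convenience.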
Designing to protect individuals’ control rights
The trigger for today’s post was the decision by Apple to remove the headphone jack from their phones. I understand the rationale for the decision: camera, battery life, device size, the works. Lots of good things. But with a cost.
A cost that affects individual control.
DMCA. How mobile networks work. DRM. A world of app-only access. Increased sealing of devices, the Apple decision being just the latest brick in the wall.
Convenience cannot itself become a flag of convenience. The impetus to trade control and privacy for security is understandable but has appalling consequences.
Many years ago I spoke about the need to “design for loss of control”. Now I think more and more about “designing to protect the control rights of individuals”.