console.log('Hello World')

Andrew A. Johnson
4 min read · Apr 13, 2019


If you’re reading this web development blog, you’ve probably already seen “Hello World” as the sort of baby’s-first-words of a programming language. Recently, I’ve been thinking about “Hello World” a little differently: as the essentially human component at the heart of all technology.

Some necessary context: a close friend of mine passed away unexpectedly the other day. Managing that grief has been, well, unmanageable in many ways, but I’m grateful for the time I’ve taken to reflect upon his life and consider how I might be an instrument by which his most beautiful, most human qualities survive him.

Amongst other things, my friend was known for a particular (almost on-brand) wave that he would do in every photo of himself. In many regards, it wasn’t anything special–just a wave–though it became sort of iconic because he did it each time, without fail. For whatever reason, his wave seemed to pick up extra significance during the turbulence and division of the 2016 election cycle. Especially on digital platforms like Facebook & Twitter–where a certain amount of vitriol and hate becomes possible if you forget you’re interacting with other humans–my friend’s wave evoked a very human presence: “Hello World, it’s me; I’m human just like you.”

We often think of our various arrays of technology as somehow surplus to what it means to be human. Critics of tech in one form or another often bemoan the human interactions we’ve lost in the transition to social media platforms. Even a proponent of the digital era might say something like, “I spoke with her on FaceTime yesterday, but it’s been a while since we’ve talked in real life.” I understand these sentiments, of course, but I’m not convinced that our tech interfaces are less human by mere virtue of being digital.

Days before my friend passed away, he suffered a massive stroke which seemed to leave him capable of normal thought and the capacity to understand speech, but without the ability to pattern his own thoughts to sounds and vocal pronunciation. It meant that, had he lived, he would have been able to communicate–but not with his voice–relying instead on technologies such as the voice generator Stephen Hawking famously used, or with speech-capable keyboards made possible by the assistive tech industry. Or dependent, even, on the technology we know as the written word.

It’s that last one, the technology of the written word, I’ve been thinking the most about. Even the most tech-phobic amongst us would be unlikely to lump written language together with the technologies they typically scorn. But it is, absolutely, a technology. There is nothing essentially natural about written language: had it not been invented (variously and specifically for each language system), sustained and updated by use & re-use, and taught to children systematically, written language wouldn’t exist. There are, after all, thousands of languages that never had a written counterpart. And yet, the ability to read and write often seems to blend in with what it means to be human (often horrifically, I might add, with things like literacy tests used to keep oppressed people from self-governance and equal legal status).

All of which is to say, it’s worth remembering that technology is more than just iPhones and Facebook. Assistive tech sustains a voice for people who find themselves literally voiceless. Written & readable language extends our ability to communicate and to be understood beyond the simple distance we can shout and hear.

Alan Turing, famously one of the trailblazers of computer science and artificial intelligence, was often baffled by people’s responses to the so-called Turing Test. In its plainest articulation, the Turing Test claims that if you ask ten questions of an interlocutor (it may be either a machine or a human on the other end of the line), and if you conclude from the ten answers you receive that the interlocutor is a human, then the interlocutor–even if it was a machine–has human intelligence. Turing often received the response: “so you’re saying a machine could be as intelligent as a human?” Turing’s frustrated reply was that, no, that machine IS human, because humans are whatever we recognize as one of us. In other words, if a machine can have a human conversation with you, and you cannot distinguish its speech from “human speech,” then it belongs in whatever this category is that we call “the human.”

And maybe this all makes a bit more sense when we consider that Turing was prosecuted by the British government for being gay. His sexuality marked him as somehow less than (criminally less than) a normal human. This despite his famous, tide-turning contributions to the war effort; this despite his brilliant contributions to math and philosophy. Today we would say, “of course he was human, he was absolutely no less human than anyone else,” but the point is that, however natural we think it is to be human, we somehow find ourselves constantly denying human status to so many people around us.

My friend is gone, but there are still so many traces of him (his “hello world” wave, for instance) that survive in the world. Are they less human–is he less human–because those traces of him are digital? Artificial? Technological? When a friend writes you a postcard, and you read their message days later, is the experience less human because of the technology of writing it relies on? Because of the traversed distance that technology makes possible?

I don’t think so. And I think we should remember that each paltry, silly “hello world” we see in the sphere of programming is more than just a string–it’s the proof in the pudding that technology as a whole is human at its heart.


Andrew A. Johnson

I’m a full-stack web developer experienced with JS (React / Redux) & Ruby (Rails / Sinatra).