Intranets as failed utopias: Knowledge management and human nature
In a twist midway through Philip K. Dick’s characteristically paranoid 1968 novel Do Androids Dream of Electric Sheep?, the android-hunter Deckard finds himself accused of being a synthetic. “This matter should be straightened out,” he says, as he is led away to the Hall of Justice. But no. Instead he is taken to a completely different Hall of Justice in a totally different part of town, with its own cops, detectives and staff that Deckard has never heard of. The one he operated from, he is told, has been derelict for years. As he ponders his predicament, our hero muses on how it is possible for such an organisation to exist without anybody even knowing about it. Was this new department a fake run by androids? Or was he actually an android himself?
Like much of Dick’s hi-tech surrealism, from Ubik’s forewarning of the Internet of Things to Minority Report’s Pre-Crime, his ideas have a way of manifesting themselves in the real world. And while police departments run by androids may not yet be a thing (though we’ll see what Mr Musk has up his sleeve), Deckard’s feelings of fear and awed befuddlement did remind me of one thing: working on intranets.
Hear me out.
Around the time Do Androids Dream of Electric Sheep? made it to the screen as Blade Runner in the early 1980s, the rapidly computerising corporate world was starting to face up to the challenges of globalisation and the difficulties of sharing useful information. Hewlett-Packard CEO Lew Platt famously grumbled, “If only HP knew what HP knows, we’d be three times more productive.” Almost two decades later the pithy phrase found its way into the title of Charles Sieloff’s influential 1999 research paper about the evolution of HP’s knowledge management strategy, from its first unwelcome inroads into the laid-back, face-to-face culture of the late 70s all the way up to the global corporation of the turn of the millennium.
“Even at a local level,” he said, “the formalization of knowledge capture and sharing would often be resisted as a form of creeping bureaucratization… units that fostered entrepreneurial spirit and intense business focus also fostered a myopic disregard for the benefits of more global sharing.” Any attempt at formalising the process of capturing and sharing knowledge was perceived as “corporate meddling”, something anyone who has worked in the lower echelons of a company can surely sympathise with.
Throughout the 80s and 90s, HP went through a series of changes aimed at harvesting, storing and sharing in such a way that useful knowledge could be made available to the right people at the right time. But in 1995, just as they were on the verge of getting it right, revolutionary new innovations like “electronic mail” and the “web browser” started to appear. Almost overnight, the signal-to-noise ratio began once again to shift sharply in the direction of noise.
“At first the experience was exhilarating,” said Sieloff, “but it soon became overwhelming to many users, as the volume of available information far outstripped anyone’s ability to keep up… information overload went from a theoretical concept to a visceral everyday reality for many knowledge workers.” The report summed up: “The World Wide Web triggered an explosion in the availability of information and knowledge, but did nothing to expand our attention capacity.”
In this paper he outlined the genesis of his “knowledge management” strategy: systems to identify, store and share useful information, and to tune the company’s nervous system back towards identifying signals. To Sieloff, clarity of communication and the effective identification and distribution of knowledge were clearly going to be an important competitive advantage. He wrote of platforms and tools to “evaluate the importance of topics across broad topic spaces”, of cultivating “learning communities”, and of new tools to alert employees to topics they were interested in.
This was almost 20 years ago. Today the situation has moved on once more, and in many places the signal has never been weaker. In writing such articles in the past I used to pause here to reel off the latest stats on how many videos or tweets or emails or annual reports are published per nanosecond, by way of illustrating the quite mind-bending scale of the current information explosion. I have since learned that such astronomical numbers are practically quaint by the time I click publish. Suffice to say that the information explosion will continue to grow exponentially, both inside and outside of organisations, and we are in dire need of useful tools to identify what is useful and what is not. But in the frenetic rush for them, we risk simply compounding the problem by misunderstanding what these information systems actually *are*, and by not being honest about how they are built.
Shapes within the forest. Forms within the gloom
We still tend to talk of IT systems as mechanical things, gleaming scientific constructs made by “engineers” and “architects”. I think much more organic metaphors are called for, being as they are the electronically encoded phenotype of collective human endeavour. Outside the more rigorous disciplines such as data science (whose practitioners suffer from the same frailties and cognitive biases as us lesser humans), the average people involved in the construction and maintenance of intranets are scientists and engineers the way a beaver is a fluid physicist. As such their construction, implementation and usage tend to echo the foibles and failures of the human condition. Reacting both to shifts in the information environment described above and to the Shakespearean dramas of internal politics, information systems end up being implemented in uneven undulations and camouflaged in corporate newspeak.
The outcome is that IT systems and intranets grow atop one another like the layers of a brain, with Cambrian structures built in archaic technologies like mainframes augmented and integrated with newer and newer programs and functions, in complex tangles that few individuals truly grasp. Rather than gleaming, precise megastructures, they instead resemble ecosystems, built around the rubble of many failed projects and augmented with quick fixes and workarounds going back maybe decades. The cumulative effect is a kind of collective brain-fog: an endless gloom of information in which one can scarcely make out the forms within. It is in this expanse that almost identical tools, resources and projects can be created and exist in parallel, in blissful ignorance of one another, or at best known only as folk-tales of far-off lands.
To give an example: I recently worked with a large bank to design a single page, in one corner of their intranet, that was to become a definitive portal for DevOps resources within the organisation. In researching the content of the page, my counterpart at the bank spent weeks seeking out over 30 tools and resources that significantly overlapped, including some rarities completely unknown in his silo of the organisation, among them other, incomplete “definitive” lists of DevOps resources. As the brief project wound to a close, he spoke of going out on a new expedition, convinced there were more out there, as if venturing into old-growth electronic jungle in search of lost cities of knowledge. And this after only a few decades of electronic systems. What might it be like in a few more? In a century? How tempting it is to tear it all down and start again.
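An audit like this need not be done entirely by hand. As a minimal sketch, with an entirely hypothetical inventory (real entries would come from crawling the intranet itself), even a naive bag-of-words comparison of resource descriptions can surface candidate duplicates for a human to review:

```python
from itertools import combinations

def tokens(text):
    """Lowercase word set for a rough bag-of-words comparison."""
    return set(text.lower().split())

def jaccard(a, b):
    """Jaccard similarity between two token sets (0.0 to 1.0)."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def find_overlaps(resources, threshold=0.5):
    """Return pairs of resource names whose descriptions look alike."""
    pairs = []
    for (name_a, desc_a), (name_b, desc_b) in combinations(resources.items(), 2):
        if jaccard(tokens(desc_a), tokens(desc_b)) >= threshold:
            pairs.append((name_a, name_b))
    return pairs

# Hypothetical inventory entries of the kind such an audit surfaces
inventory = {
    "DevOps Hub":      "central portal for devops pipelines and deployment tools",
    "Pipeline Portal": "portal for devops deployment pipelines and build tools",
    "Canteen Menu":    "weekly lunch menu for the head office canteen",
}
print(find_overlaps(inventory))  # flags the two DevOps pages, not the menu
```

This is deliberately crude (no stemming, no stop-words, quadratic in the number of resources), but as a first pass over a few hundred inventory entries it turns weeks of expedition into an afternoon of triage.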
The dangers of deforestation
Just as we thoughtlessly tear down rainforests and replace thriving ecosystems with biologically impoverished monocultures, so a slash-and-burn approach to intranet systems, without really understanding what is going on within them, is a recipe for disaster. Having worked on a fair number of intranet systems over the last 15 or so years, I have seen a curious pattern repeat itself, whereby intranets grow within one another like nested dolls. Not older systems, but newer systems that have thrived amidst the nourishing sunlight of our electronic forest floor: attention.
Often these are built by employees who have “gone rogue” and carved out their own fiefdoms with rudimentary HTML skills or a slyly acquired admin privilege. It can be easy for designers and UX architects to tut-tut at such things, amateurishly implemented and off-brand as they inevitably are, but really they are testimonies to human ingenuity in working around obstacles such as a lack of budget or management willpower, or simply the accumulated ruins of many failed projects.
Often these can be amongst the most popular parts of the system, for they are enthusiastically maintained and genuinely useful, and they have grown organically precisely because they deserve that scarce resource of attention. In our eagerness to start over, to the extent that is even technically possible, we first need to engage in some cartography of what exists and how it is presently used, rather than just impose our will on disgruntled staff in an exercise of souped-up, supercharged, 21st-century corporate meddling. In short, we need to understand how people use what is there before we look to replace it.
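That cartography can start with something as humble as the access logs. Here is a minimal sketch, assuming a simplified `user path` log line format (a stand-in for whatever logs the real platform produces), that ranks pages by raw hits and unique visitors before anyone decides what to tear down:

```python
from collections import Counter

def page_usage(log_lines):
    """Tally page hits and unique visitors from simplified access-log lines.

    Each line is assumed to be 'user path'; returns (path, hits, unique
    visitors) tuples, most-visited first.
    """
    hits = Counter()
    visitors = {}
    for line in log_lines:
        user, path = line.split()
        hits[path] += 1
        visitors.setdefault(path, set()).add(user)
    return [(path, count, len(visitors[path]))
            for path, count in hits.most_common()]

# Hypothetical log: the 'rogue' team wiki outdraws the official portal
log = [
    "alice /teams/rogue-wiki",
    "bob   /teams/rogue-wiki",
    "carol /teams/rogue-wiki",
    "alice /official/portal",
]
for path, count, unique in page_usage(log):
    print(path, count, unique)
```

Unglamorous as it is, a table like this is often the first hard evidence that the amateur, off-brand corner of the forest is where the attention actually lives.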
The US Army Training and Doctrine Command (TRADOC) built a hi-tech knowledge management system to house training and development materials for the military, developed in conjunction with IBM over a two-year period. The only problem was that the users of the system were not really consulted, and it ended up “overpowered and under-utilized, overly complex and cumbersome”, according to knowledge management expert Dean Call. Not long after the megasystem launched, it had just 1,000 registered users, and only 30 active ones. According to Call, users were “bullied” into ways of working by the system, “expected to change to fit the technology rather than having the technology change to fit them”.
This sort of single-minded implementation, likely reflecting political struggles in the upper echelons of the organisation, puts one in mind of the mid-20th-century modernist city planners whose vanity and megalomania laid waste to so many historic city centres. (In London, beloved medieval lanes on the Strand, including a former residence of Peter the Great, were replaced with the grotesque, rectangular Arundel Great Court.) Inspired by faddish notions of city planning and architecture, and by a faulty understanding of human behaviour, eyesores were thrown up that magnified human miseries rather than alleviating them. If we extend this metaphor to intranet systems, we might draw the following parallels between the two poles.
A badly designed intranet is an assault course that makes it more difficult for employees to do their jobs. Its underlying philosophy is that of the vast megaproject: dreamed up by technocrats and forcefully implemented as a largely politically motivated solution to the very real problems already discussed. Such projects are relics of the 19th- and 20th-century command-and-control mindset, and can magnify the worst vices of management, such as cliquism, bandwagoneering and being out of touch with those at the coalface. As one fails after another, new, more ambitious schemes are concocted to fix the problem, and the cosmic cycle begins anew, compounding the problems to the point that multiple megaprojects can run in tandem. Indeed, the failure rate of “knowledge management” projects led the very term to fall out of favour due to its association with IT disasters.
Bad intranets characteristically:
- Are the compound effect of years, sometimes decades of accumulated ruined master plans based on no evidence, and subsequent panic-fixes
- Massively increase the time it takes to solve problems
- Have poor architecture, reliant on top-down taxonomies and unusable, ostentatious nomenclature
- Add error and confusion to the solving of business problems
- Have enormous redundancy of information, both spatial (repeated pages and resources) and temporal (multiple versions of files)
- Have quality information lost amidst a sea of bad information
- Are rigid and costly to change at a large scale, yet easy to add more noise to
- Are siloed, hostile communities with misaligned or competing goals
- Cause mass demoralisation
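The redundancy problem, at least, is partially measurable. As a minimal sketch over a hypothetical document share: hashing file contents catches only byte-identical copies, not diverging versions, but even that tends to be revealing on an old file store:

```python
import hashlib
from collections import defaultdict

def find_duplicates(files):
    """Group byte-identical documents by content hash.

    'files' maps path -> bytes; on a real share you would walk the
    filesystem and read each file instead of passing a dict.
    """
    by_hash = defaultdict(list)
    for path, data in files.items():
        by_hash[hashlib.sha256(data).hexdigest()].append(path)
    return [sorted(paths) for paths in by_hash.values() if len(paths) > 1]

# Hypothetical share: the same policy saved in two different silos
share = {
    "/hr/policy_v3_final.docx":      b"expenses policy text",
    "/finance/expenses_policy.docx": b"expenses policy text",
    "/hr/handbook.docx":             b"handbook text",
}
print(find_duplicates(share))  # groups the two identical policy files
```

Catching the temporal redundancy, the trail of `_v3_final` variants, needs fuzzier comparison, but a simple exact-duplicate census is a cheap way to put a number on the spatial kind before any grander surgery is attempted.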
A well-designed and structured intranet can not just aid communication within an organisation, but can make it run more effectively thanks to improved workflows and an increased capacity for problem solving. Bill Gates once said that knowledge management is “nothing more than managing information flow; getting the right information to the people who need it so they can act on it quickly.” As such, good intranets are user-centric, aligned to business objectives, and ever evolving and improving. Their underlying philosophy is that of channeling human behaviour, recognising its organic nature as something to be cultivated.
A good intranet:
- Evolves based on listening to users and creating iterative refinements and changes based on evidence
- Reduces the time taken to locate information
- Has a clear architecture and terminology understandable by everyone
- Helps you solve business problems more effectively
- Reduces or eliminates redundancy of information and resources
- Retains and elevates quality information naturally, and reduces and removes bad information
- Is flexible and adaptable at a large scale
- Has thriving communities that exchange useful information with one another
- Makes people feel good about their job
To conclude: as the information age continues to unfold, the challenges of identifying, retaining and distributing information will become not just a matter of competitive advantage, but a matter of survival. Intranets and knowledge management tools built in a manner that forces people into unnatural behaviours and workflows, rather than channeling behaviour in ways conducive to organisational goals, will be doomed to compound and prolong these fundamental problems, regardless of the amount of employee comms thrown at them.