Technology Will Reduce Everyone’s Busy Work

Artificial Intelligence Will Lead To Productivity Gains — For Better Or Worse

Max Nussbaumer
Zentyment
8 min read · Feb 2, 2023

--

In one of my first job interviews, the man who would become my boss a few weeks later asked an unanswerable question, giving me a chance to embarrass myself: how do you convince a CIO to buy computers from you, given that we are dealing with an IT productivity paradox[i]?

In a nutshell, the paradox states that between 1970 and 1990 (when I had my interview) there was a ton of spending on computers, but no corresponding productivity gains. This led to some serious head-scratching at the time, and possible explanations included:

  • Measurement errors: outputs and inputs were measured incorrectly, and many other factors affect outputs besides IT
  • Lags: maybe the true benefits would only arrive after my interview, and we may have seen some of that in the late 90s. Who knows how much of the high productivity growth of the late 90s was due to IT? The IT productivity paradox has become fairly moot since then.
  • Mismanagement and redistribution: IT may not be productivity-enhancing at the economy- or firm-wide level, but only in specific areas.

In hindsight, I would like to add another explanation: IT simply wasn’t very good back then, given how much I struggled with the simplest tasks, like creating letters and spreadsheets or getting anything printed. Creating presentations for my boss involved sticking cut-outs on pieces of paper and running the pages through a copy machine to print foils for an overhead projector. Predictably, minutes before the board meeting, my slides were a melted mess in the copy machine.

Productivity Growth Has Been Weak As Of Late

Productivity, that magic measure of success, is defined as output divided by input. To make output measurable, we often resort to accounting numbers such as GDP or revenue. Otherwise, we would have to count all the cars, cans, bushels, barrels, bottles, bicycles and whatever else our industries churn out. (People do that too, of course.) Inputs are more complicated, as we have moved from predominantly labor to machinery, computers, nutrients and the like, because we no longer live in a 19th-century farm world.
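To make the division concrete, here is a minimal sketch in Python; the function and the widget figures are made up for illustration, not taken from any official statistic:

```python
# Productivity is simply output divided by input.
def productivity(output: float, inputs: float) -> float:
    """Return units of output per unit of input."""
    return output / inputs

# A hypothetical plant: 1,200 widgets from 300 labor hours.
widgets_per_hour = productivity(1200, 300)  # 4.0 widgets per hour
```

Everything else in productivity measurement is a fight over what counts as the numerator and the denominator.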

And speaking of farms: agriculture is a stunning example of productivity gains, century after century. Since 1948, agricultural output per unit of input (land, labor) has tripled. In 1800, 83% of the US labor force of 1.9mn worked in farming (28% of them literally enslaved); today it’s 1.3% of a 160mn labor force working in direct on-farm jobs. At the same time, we are farming way more land, and worries about catastrophically insufficient food supplies (i.e. Thomas Malthus at the end of the 18th century)[ii] have been replaced by worries about obesity, at least in the United States; other continents are facing very different issues.
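A quick back-of-the-envelope check shows how modest a tripling looks once it is spread over decades. The tripling-since-1948 figure is from the paragraph above; the end year and the arithmetic are mine:

```python
# If output per input tripled between 1948 and 2023 (75 years),
# the implied compound annual growth rate is 3**(1/75) - 1.
years = 2023 - 1948
cagr = 3 ** (1 / years) - 1  # roughly 1.5% per year
```

About a percent and a half a year, sustained for three generations, is all it takes to go from Malthusian dread to obesity statistics.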

Everyone from the OECD to the Federal Reserve to the average CEO is monitoring productivity as a key metric of success and progress. The word rings a frightening bell for the average worker, as productivity improvement usually forebodes job losses across the board. General Electric, where I once worked, had a rule of cutting 10% of jobs per given amount of revenue every year, with questionable long-term results.

A lack of productivity gains in recent years has been a constant worry for policymakers. The US averaged improvements of 2.1% per year between 1947 and 2018, and really kicked into gear from 1998 to 2005, achieving 3.3% per year. Lately we have become sluggish, hovering around 1% a year at most.
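The difference between those rates sounds small but compounds dramatically. A sketch, using the rates from the paragraph above (the 30-year horizon is my own arbitrary choice):

```python
def growth_factor(annual_rate: float, years: int) -> float:
    """How much output per hour multiplies after compounding an annual rate."""
    return (1 + annual_rate) ** years

fast = growth_factor(0.033, 30)  # late-90s pace: output per hour roughly 2.6x
slow = growth_factor(0.010, 30)  # recent pace: output per hour roughly 1.3x
```

At the late-90s pace, living standards could more than double in a working lifetime; at the recent pace, they rise by about a third.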

What’s to worry about this? Well, by common reasoning, an economy can only grow through productivity growth and/or population growth. Since we are not doing too well on the latter, for whatever reason[iii], it seems problematic if we can’t eke out more widgets per hour. Why do we have to keep growing at all? That’s a difficult question, and any answer veers into philosophical terrain, so I will resist.
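That common reasoning is growth accounting in its simplest form: output equals productivity times hours worked, so the two growth rates combine multiplicatively. A sketch with illustrative rates of my own choosing:

```python
# Output = productivity * hours worked, so:
# (1 + output growth) = (1 + productivity growth) * (1 + hours growth)
def output_growth(productivity_growth: float, hours_growth: float) -> float:
    return (1 + productivity_growth) * (1 + hours_growth) - 1

# Flat population/hours plus 1% productivity growth: about 1% output growth.
g = output_growth(0.01, 0.0)
```

With a stagnant population, productivity is the only lever left, which is why the 1% figure makes policymakers nervous.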

Productivity In The White-Collar Economy

According to Wikipedia, a white-collar worker is a person who performs professional, desk, managerial, or administrative work. White-collar work may be performed in an office or other administrative setting. White-collar workers include job paths related to government, consulting, academia, accountancy, business and executive management, customer support, design, engineering, market research, finance, human resources, operations research, marketing, public relations, information technology, networking, law, healthcare, architecture, and research and development. And I was surprised to learn that Upton Sinclair[iv] coined the term.

I have been a white-collar worker all my life, for lack of better skills. But outside weddings, funerals and black-tie events, I don’t wear white collars anymore; they look awkward and turn brown quickly. According to the Department for Professional Employees, there are 89mn such workers, or 60% of the labor force. We know that all those people do something, and that it involves producing many documents, making phone calls, reading and sending emails, and sitting in an awful lot of never-ending meetings. But we don’t know how productive many of those people are, or how often they are doing something valuable (nicely called etwas schaffen in German). What we do know is that we are very good at keeping each other busy all day long.

Within this group, there are many individuals whose productivity gets measured in real time: back-office workers at customer service desks, in insurance administration, bank account administration, order or invoice processing, freight forwarding and the like. Each of those areas is managed via hundreds of metrics, such as the number of account openings per bank clerk per week, or the percentage of customer tickets a service desk representative resolves within 5 minutes. The purpose of those metrics is to keep a tight lid on staffing and take out any excess capacity the moment it shows up in the dashboards. Companies have a pretty good idea of what productivity should be in these areas, workers don’t make too much money, and most of the ones I meet are surprisingly happy with their jobs.

It gets a little more abstract when we look at managerial layers, and this doesn’t refer only to an elite circle of top managers. A LinkedIn search for the fancy title of transformation manager delivers thousands of hits at any big global company. In the last 2–3 years, we have seen a proliferation of promotions to positions with pompous titles. In the US, some people suspect that those promotions are just a way to avoid paying for overtime: “director of first impressions” and “lead shower door installer” are conspicuous candidates.

The main way to measure these people’s productivity is to correlate their numbers or hours at work with revenue or some other accounting number, without any provable causality. Of course, many of these people are not just well educated and well paid; they also have a shared interest in protecting their jobs from being scrutinized for potential productivity gains. Who knows what would happen if we took out the entire 17th floor? Or all 12 floors, like Carl Icahn joyfully did at one of his companies? Typically, nothing happens in the short term; in the long term, who knows.

Amy Hwang (FT/Cartoonstock)

If the promise (or threat) of AI materializes, we are entering a future of vast productivity gains for the large community of white-collar workers. ChatGPT and its cousins will write our meeting minutes, summarize project status, and deliver the monthly financial reports, all without typos or spreadsheet errors. This generation of AI tools may propel AI forward the same way the World Wide Web made the Internet accessible to everyone (the Internet was already 10 years old when the WWW came on stage).

On this point, my esteemed colleague Bailey Blankenship at Zentyment asked the obvious and fair question: what are people going to do with the free time that technological progress will give them? Keynes foresaw this development as early as 1930[v]:

Back in 1930, Keynes predicted that the working week would be drastically cut, to perhaps 15 hours a week, with people choosing to have far more leisure as their material needs were satisfied. The world was then gripped by a dreadful slump but in the long run Keynes was sure mankind was solving its economic problems. Within a hundred years, Keynes predicted, living standards in “progressive countries” would be between four and eight times higher and this would leave people far more time to enjoy the good things in life[vi].

The answer depends on your viewpoint, and on what people and their employers decide to do with the spare time. Some say that Keynes was entirely wrong, given how much overwork we are seeing, including burnout and a universal scream for better work-life balance. In his defense, Keynes could not have foreseen people producing 100-page PowerPoint decks of which barely the first page gets read, or people dealing with 2,000 unread emails in their inboxes. From a productivity standpoint, both are impressive examples of silly productivity, or performative work, and I am sure Keynes would have had a clear opinion about them.

One of my life’s lessons is never to argue with Keynes, so I think his prediction will come true (give us 7 more years) in our economic zone, perhaps a few years later in the UK, where things look a little bleak at the moment. As individuals, we can curtail our economic ambitions and use our free time wisely. And companies can harvest the productivity gains and resist the temptation to fill the time won with new busy work. Or we can all decide to do just the opposite; it’s a free country.

Never argue with Keynes (Pictorial Press/Alamy)

For comments, please contact me on max@zentyment.com

Footnotes

[i] Robert Solow, the Nobel Laureate economist, has aptly characterized the results: “we see computers everywhere except in the productivity statistics.” (http://ccs.mit.edu/papers/CCSWP130/ccswp130.html)

[ii] The relationship between food production and food supply was first expressed by an English economist called Thomas Robert Malthus (1766–1834). Malthus stated that population increases in geometric progression (i.e., 2, 4, 8, 16…) while food production increases in arithmetic progression (i.e., 2, 4, 6, 8…). Thus population grows faster than food production and tends to outstrip it in a short time. He wrote that unless humans limit reproduction voluntarily through self-restraint, population would be reduced by catastrophic events such as disease, starvation, misery and war. (https://web.ccsu.edu/faculty/kyem/GEOG473/5thWeek/Food%20production%20and%20GM%20foods.htm)

[iii] By some estimates, male sperm counts have declined by up to 50% over the last 50 years. I like the theory that this is related to PET water bottles, but the scientific evidence is somewhat ambiguous (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7967748/)

[iv] Famous for his observations of the appalling and unsanitary conditions in Chicago’s meatpacking factories, leading to the Meat Inspection Act and the Pure Food and Drug Act. “The Jungle”, 1906

[v] John Maynard Keynes, “Economic Possibilities for our Grandchildren”, 1930

[vi] A summary by the Guardian, as Keynes’s original writings are quite tedious reading material: “Economics: Whatever happened to Keynes’ 15-hour working week?”, Aug 31, 2008


Max Nussbaumer
Zentyment

Entrepreneur and investor in interesting ideas. Developer of startups that are successful more often than not.