The Intelligent Machine World
Its foundational elements, its catalysts, general challenges, growth and societal implications.
A podcast series, Babbage by The Economist, brought to mind an episode the publication did on foundation models: the basis of super-hyped platforms, where statistical networks trained by unsupervised and reinforcement learning re-hash the Web's available information to provide concise, quirky, parodic, elaborate, amusing text, all manner of writing produced by people across time, provided it was digitised and recorded.
Its “sister”, the Stable Diffusion algorithm, does the same for graphics: imagining and reimagining objects in a particular set-up, artistic vision or unreal setting. One of the best overviews of the tech for a layman can be found here.
All these latest (massive) advancements in what these networks can produce have been laboriously crafted by privately funded corporations with access to huge data-sets, building dense webs of weighted connections (akin to synapses) via CPU- and GPU-enabled parallel processing.
OpenAI Presents GPT-3, a 175 Billion Parameters Language Model | NVIDIA Technical Blog
OpenAI researchers recently released a paper describing the development of GPT-3, a state-of-the-art language model…
A disruptive moment in time, isn't it?
One can look at this from numerous perspectives. The intensity with which we experience change, or rather with which change is presented to us, makes us forget near-similar disruptions and shifts that happened one, two, three, four or five decades ago. Some would say that this time it's different, but the same bold belief has been uttered numerous times: faithfully committing to a new tool that disrupted business models, dethroned incumbents and promised eternal prosperity, or, to others, an apposite omen of a destitute society robbed of honest work by senseless automatons.
Was there a near-similar computer-powered, scalable procedural aid that was destined to disrupt how things were done, and how has human society integrated it? At least one recent example exists, and for some it did pure magic to be featured in an ad by the company that was the epitome of magic: Apple.
The tool predicted to change the world was computer spreadsheet software.
A Spreadsheet Way of Knowledge | Backchannel
A generation ago, a tool unleashed the power of business modeling-and created an entrepreneurial boom I learned…
Episode 606: Spreadsheets!
subscribe to Planet Money podcast Note: This episode originally ran in . Spreadsheets…
An article written in 1984 by Steven Levy (the very same legendary Wired journalist and author of numerous books on cryptography and Silicon Valley hackers, and of a beautiful biography of the iPod) predicted new times for many, ruminating on how the nature of human behaviour could now be calculated by spreadsheet software that upends management practices.
Others ruefully wondered whether it would leave accountants, who originally did their calculations on big paper spreadsheets, out of their jobs.
According to a commentary in a brilliant retelling of the spreadsheet story by the geek hosts of NPR's Planet Money:
“since 1980, right around the time the electronic spreadsheet came out, 400,000 bookkeeping and accounting clerk jobs have gone away. But 600,000 accounting jobs have been added.”
It’s not that spreadsheet software pushed people out of their jobs, but it empowered those willing to learn how to use it — to do more complex things.
“It is not far-fetched to imagine that the introduction of the electronic spreadsheet will have an effect like that brought about by the development during the Renaissance of double-entry bookkeeping.”
“The spreadsheet in that comparison is like the transcontinental railroad. It accelerated the movement, made it possible, and changed the course of the nation.”
Kapor’s comparison is an apt one. The computer spreadsheet, like the transcontinental railroad, is more than a means to an end. The spreadsheet embodies, embraces, that end, and ultimately serves to reinforce it. As Marshall McLuhan observed, “We shape our tools and thereafter our tools shape us.”
The what-if factor has not only changed the nature of jobs such as accounting; it has altered once rigid organizational structures. Junior analysts, without benefit of secretaries or support from data processing departments, can work up 50-page reports, complete with graphs and charts, advocating a complicated course of action for a client. And senior executives who take the time to learn how to use spreadsheets are no longer forced to rely on their subordinates for information.
The same might be the case for foundation models: they can empower, but one should not put all faith in a model's infallibility. Lacking critical thinking and unable to summarise on its own (rather, it tries to find a reliable summary among the available content it is tasked to condense), a model can exaggerate, extrapolate, infer incorrectly and borrow from embedded historical bigotry, racism and the like.
The risks are considerable, especially as the algorithms are being developed by loosely supervised mega-monopolies that control other elements of the intelligent machine world's stack: data origination and hosting, and the hardware they custom-design and use at scale.
The evolution of interpretation logic goes in step with the evolution of hardware: each reinforces the other. The AI models we now see surpassing humans in basic tests would not have been possible without massive advancements in chip design and manufacturing.
Yet the standard design, born when IBM, scaling production of general-purpose computing, agreed to outsource the building of CPUs to Intel and the operating system to Microsoft (as Ben Thompson has written), comes dearly: each new generation delivers less output per dollar.
Should things continue as they are, the next generation of chips, absent a change of approach, would require CAPEX equal to the industry's current yearly revenue, and a diminishing number of producers are able to foot the bill.
And modern algorithms require all the power, and all the reference data, they can get.
These models can assist with an increasing number of tasks that labour across the globalised supply-chain network is engaged in.
To follow one scenario: re-shoring leads to job losses across manufacturing hubs close to geopolitically heated areas, where jobs were originally placed solely because shareholder capitalism favoured it (cutting costs via outsourcing rather than automation).
Emerging markets that lose low-skill jobs to re-shoring can respond by up-skilling, offering not just a low-cost advantage but efficiency of output by combining AI and labour: proactive investment in AI can thus help protect "normal" rates of employment for their hundreds of millions of predominantly young workers.
The future's contours are opaque at best. Should one subscribe to Carlota Perez's or Joel Mokyr's estimates that we are yet to see the full integration of technologies seeded in the 70s and 80s, one still has to understand the trends driving the hardware and software elements of the manufacturing rebuild, and the societal pillars that put them at scale.