Cerebras: Man’s Blurst Friend
~ Human-Brain SIZED artificial intelligence this Christmas! ~
TL;DR — Cerebras, the company that recently began selling its ultra-massive computer chips, specialized for training A.I., just wired a stack of those beasts (13.5 million AI cores in all) into an exa-scale supercomputer that costs about $30 million. The human brain has only 80 to 100 trillion synapses wiring it together, and Cerebras’ wafer-scale chips can each match our noggins; now that those chips are stacked into one massive cluster, we should probably address it as Lord or something before it gets offended… Especially considering that the current A.I. models that already outperform us are using only 1/2,000th as many synapses as our brains! Cerebras’ Andromeda chip-cluster will be like growing an alien artifact. This is an important next stage in the Age of A.I… Say hello to your new friend, Cerebras:
Did you hear that 18 new drugs are already advancing toward human trials (a process that normally takes a decade), because A.I. was used to speed development and tailor the drugs to our needs? Yup.
And, Google’s language A.I. that’s too godlike for public access, PaLM, had seemed to hit a limit in its abilities… until the engineers asked PaLM to break down problems into steps, prompt itself with each step, and feed those answers back in to check them… and, by golly! It’s even better than before! Only a few years ago, A.I. researchers hypothesized that “when we build a general intelligence, it’ll self-improve repeatedly, until it spirals well beyond our comprehension.” That’s called ‘FOOM’ (which is not an acronym, just the sound the A.I. makes as its brain goes all meme-galaxied). Well, PaLM is not a ‘general’ intelligence, and its brain is many times smaller than ours, yet it bests well over half the country in language tasks. It’s already FOOMing.
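That “break it into steps, feed each answer back in” loop can be sketched in a few lines. This is a toy illustration only, not Google’s actual pipeline: the `ask` function below is a made-up stand-in for a call to a large language model like PaLM.

```python
# Minimal sketch of the self-prompting loop described above.
# `ask` is a hypothetical stub; a real version would call an LLM API.

def ask(prompt: str) -> str:
    # Stub: pretend the model answered. Replace with a real API call.
    return f"[model's answer to: {prompt!r}]"

def solve_with_steps(question: str, n_steps: int = 3) -> str:
    # 1. Ask the model to decompose the problem into steps.
    plan = ask(f"Break this problem into {n_steps} short steps: {question}")
    context = plan
    # 2. Prompt the model with each step, feeding answers back in.
    for i in range(1, n_steps + 1):
        answer = ask(f"Given:\n{context}\nCarry out step {i}.")
        context += f"\nStep {i} result: {answer}"
    # 3. Ask for a final answer, checked against the accumulated steps.
    return ask(f"Using these steps:\n{context}\nGive the final answer.")

print(solve_with_steps("What is 17 * 24?"))
```

The trick is that the model never has to solve the whole problem in one leap; each prompt only asks for one small move, with all prior moves sitting in the context.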
And Image Generation is an industry now, with billion-dollar companies… Stable Diffusion is already in Photoshop, for Pete’s sake! A.I. is fast becoming a market, not just a discipline. Remember cell phones? Those began in the late ’70s in Japan. It wasn’t until the ’90s that everyone had a clunker. 2007 was iPhones, if you could afford to bathe in gold. Cellphones took decades. A.I. image generation from text prompts was revealed less than two years ago, and it’s already in Photoshop. Give that a pause to sink in.
So… Cerebras? Computer chips the size of soccer-team pizzas? Able to train neural networks that are literally one hundred times larger than the largest network from two and a half years ago? Moore’s Law would have taken about 13 years to deliver that 100x; we are rocketing so fast, our blurred tears and shredded clothes become meteorite, space dust! Imagine if all the progress in desktop computing that happened between 1990 and 2003 had instead leapt into being by late 1992. That’s the insane speed of acceleration we are maintaining. Cerebras is roaring into a new frontier, where brain-sized machines are a line-item comparable in cost to annual corporate dinner expenses. Two years ago, we were astounded to see a single A.I. with 1/500th of a human brain’s synapses; now, Cerebras stands well beyond each of us.
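That “about 13 years” figure falls straight out of the arithmetic, assuming Moore’s Law’s classic doubling period of roughly two years:

```python
import math

# Moore's Law: capability roughly doubles every ~2 years.
# How many years of doubling does it take to reach a 100x gain?
doubling_period_years = 2.0
target_factor = 100

years_needed = doubling_period_years * math.log2(target_factor)
print(round(years_needed, 1))  # → 13.3
```

So a 100x leap that “should” have taken a baker’s-dozen years of steady doubling arrived in two and a half.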
Foom.
Does this mean… doom? No, at least not the ‘robot-overlord’ variety. These ‘brains’ can only learn a few different tasks in a row before they begin to forget the old ones (researchers call it ‘catastrophic forgetting’). So, if you want them to learn a lot of different stuff, you have to teach the brain all the tasks at the same time, with no updating later. We haven’t figured out a ‘brain that can keep learning completely on its own.’ Yet.
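You can watch that forgetting happen in a toy model. This is an invented illustration, nothing to do with Cerebras’ hardware: a single-parameter model is trained to fit one task, then retrained on a conflicting task, and its skill on the first task evaporates.

```python
import random

random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(100)]

def train(w, xs, slope, steps=500, lr=0.1):
    # Plain gradient descent fitting y = w*x to the target y = slope*x.
    for _ in range(steps):
        grad = sum(2 * x * (w * x - slope * x) for x in xs) / len(xs)
        w -= lr * grad
    return w

def mse(w, xs, slope):
    # Mean squared error of y = w*x against the target y = slope*x.
    return sum((w * x - slope * x) ** 2 for x in xs) / len(xs)

w = train(0.0, xs, 2.0)         # learn task A (y = 2x): w ends near 2
err_a_before = mse(w, xs, 2.0)  # near zero: task A is mastered
w = train(w, xs, -2.0)          # now learn task B (y = -2x): w ends near -2
err_a_after = mse(w, xs, 2.0)   # task A error balloons: A is forgotten
print(err_a_before, err_a_after)
```

With only one weight to share, learning task B necessarily overwrites task A; real networks have billions of weights, but sequential training still drags them away from old tasks, which is why the tasks get taught together instead.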
DARPA is working on that kind of ‘life-long learner’, and so is the Alberta Plan. And I have a feeling they’ll both be buying a certain snarling, toothy, underworld brain-hound, to feed it their crumbly data-biscuits. Cerberus, a ferocious lock on the Door of Dead People. Cerebras, at the Door of Living Machines… opening.