AI Top-of-Mind Nov 14

dave ginsburg
Published in AI.society · 2 min read · Nov 14, 2023

Top-of-mind is Nvidia formally announcing their highly anticipated HGX H200 based on the ‘Hopper’ architecture. From the release:

“With HBM3e, the NVIDIA H200 delivers 141GB of memory at 4.8 terabytes per second, nearly double the capacity and 2.4x more bandwidth compared with its predecessor, the NVIDIA A100.”
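The ratios in the release are easy to sanity-check against the A100's published specs. A minimal sketch, assuming the 80 GB A100 SXM figures (80 GB of HBM2e at roughly 2.0 TB/s), which are my numbers, not part of the release:

```python
# Back-of-envelope check of the quoted H200-vs-A100 comparison.
# The A100 baseline (80 GB at ~2.0 TB/s) is an assumption, not from the release.
h200_capacity_gb, h200_bw_tbs = 141, 4.8
a100_capacity_gb, a100_bw_tbs = 80, 2.0  # assumed A100 80GB SXM specs

capacity_ratio = h200_capacity_gb / a100_capacity_gb   # ~1.76x, "nearly double"
bandwidth_ratio = h200_bw_tbs / a100_bw_tbs            # 2.4x, as quoted

print(f"capacity: {capacity_ratio:.2f}x, bandwidth: {bandwidth_ratio:.2f}x")
```

Both ratios line up with the release's wording under those assumed baseline figures.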

The H200 also forms the basis of the GH200, which pairs it with the Arm-based Grace CPU over the company’s NVLink-C2C interconnect. The company continues to push the envelope on performance benchmarks, this time for AI training. From the Engadget article:

On Wednesday, NVIDIA unveiled the newest iteration of its Eos supercomputer, one powered by more than 10,000 H100 Tensor Core GPUs and capable of training a 175 billion-parameter GPT-3 model on 1 billion tokens in under four minutes. That’s three times faster than the previous benchmark on the MLPerf AI industry standard, which NVIDIA set just six months ago.
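To get a feel for what that run implies per GPU, one can apply the common FLOPs ≈ 6 × parameters × tokens training approximation. A rough sketch using only the figures in the quote; the derived numbers are my estimates, not NVIDIA’s:

```python
# Back-of-envelope throughput estimate for the Eos GPT-3 benchmark,
# using the standard FLOPs ~= 6 * parameters * tokens approximation.
params = 175e9      # 175 billion parameters, from the quote
tokens = 1e9        # 1 billion tokens, from the quote
gpus = 10_000       # "more than 10,000" H100s, per the quote
seconds = 4 * 60    # "under four minutes"

total_flops = 6 * params * tokens              # ~1.05e21 FLOPs for the run
cluster_flops = total_flops / seconds          # sustained cluster throughput
per_gpu_tflops = cluster_flops / gpus / 1e12   # sustained TFLOPS per GPU

print(f"~{per_gpu_tflops:.0f} TFLOPS per GPU sustained")
```

That works out to a few hundred TFLOPS sustained per GPU, a plausible fraction of the H100’s peak FP8 throughput, which is a reasonable sanity check that the headline number hangs together.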

What are executives thinking about AI? A recent EY report shows that 99% are moving ahead with AI, with 69% even diverting funds from other projects. But they are realistic about when these investments will deliver growth. More in the Forbes article.

A comprehensive AI glossary from ‘The Drum,’ with over eighty definitions.

And from ‘The Verge,’ a mashup between Boston Dynamics’ Spot robot dog and ChatGPT. Check out the dialogue, including Debbie Downer and a Millennial. My favorite line from the video — “The unfathomable void of my existence, much like this QR-filled board.”

Source: Boston Dynamics


Lifelong technophile and author with background in networking, security, the cloud, IIoT, and AI. Father. Winemaker. Husband of @mariehattar.