Anyone who has managed GPU-based workloads in the cloud knows that those bills can mount quickly. One alternative is to keep running large systems on-premises, which invites its own budgeting, technical, and logistical complications, the very ones the cloud was supposed to alleviate in the first place. In this article I propose a middle ground. It still requires a bit of patience, but at least you won't have to manage a private data center, pry open a computer case, or dive into BIOS settings to get a system with serious smarts under the hood.

Not an actual talking laptop. If it could talk it would express disappointment in time spent on social media.
Disclaimer: I’m…

