GPU servers for machine learning startups: Cloud vs On-premise?
Michael Reibel Boesen

Thanks for the great article. I’ve got a question about RAM. I haven't had a chance to run GPU-based simulations, but in my experience with the types of ML simulations I was running on CPU, I required about 1 GB per core. What would you say the requirements are for GPUs? I know it depends heavily on the data (whether it's numerical or multimedia, how much of it there is, etc.) as well as on the computational graph itself, but I would still be very interested to hear your thinking on this.
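As a starting point for reasoning about this, GPU memory for training is often estimated from parameter count and batch size rather than a per-core rule. Below is a rough back-of-envelope sketch; every constant in it (the factor of 3 for weights + gradients + optimizer state, the example model and activation sizes) is an illustrative assumption, not a measured figure:

```python
def estimate_gpu_memory_gb(num_params, batch_size, activations_per_sample,
                           bytes_per_value=4, optimizer_copies=3):
    """Rough fp32 training-memory estimate in GB.

    optimizer_copies=3 assumes weights + gradients + one optimizer
    state tensor (e.g. plain momentum); Adam would need ~4.
    """
    weight_bytes = num_params * bytes_per_value * optimizer_copies
    activation_bytes = batch_size * activations_per_sample * bytes_per_value
    return (weight_bytes + activation_bytes) / 1024**3

# Hypothetical example: a 25M-parameter model, batch of 64,
# ~2M activation values kept per sample for the backward pass.
print(round(estimate_gpu_memory_gb(25_000_000, 64, 2_000_000), 2))
```

The main difference from the CPU rule of thumb is that the data itself can stream from host RAM batch by batch, so it's the model state and the activations held for backpropagation that dominate GPU memory, not the dataset size.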
