Video: Delivery of an NVIDIA Tesla V100 to our Edge Data Center

ExaMesh
2 min read · Feb 1, 2020


In our first video, we want to show cinematically how we install an NVIDIA Tesla V100 32 GB in one of our Edge Data Centers in Germany.

In the operation of our Edge Data Center, the GPU is part of our so-called AI Docker Instance: an Instance with a dedicated Tesla V100 32 GB, 12 CPU cores, 200 GB RAM and an 850 GB SSD.

What makes our GPU Instance special?

The Instance uses our pre-built Docker base image with NVIDIA GPU drivers and AI frameworks.

Pre-built Docker base images?

Correct. After booting your Instance, you can immediately start using your AI application, without the hassle of installing packages, compiling drivers, etc.

The stack is ready to go: Docker + CUDA + PyTorch / TensorFlow.
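As a minimal sketch, assuming the base image ships PyTorch built with CUDA support: after booting an Instance, a quick sanity check could look like this. The script simply asks PyTorch whether the dedicated GPU is visible and runs a tiny operation on it.

```python
# Minimal sketch: verify that the dedicated Tesla V100 is visible to PyTorch
# inside the pre-built Docker base image (assumes PyTorch with CUDA support).
import torch

if torch.cuda.is_available():
    device = torch.device("cuda:0")
    # Should report something like "Tesla V100-PCIE-32GB" on the AI Docker Instance.
    print("GPU:", torch.cuda.get_device_name(0))
    # Run a small matrix multiplication on the GPU as a smoke test.
    x = torch.randn(1024, 1024, device=device)
    print("Matmul OK:", (x @ x).shape)
else:
    print("No CUDA device visible; check the container's GPU passthrough.")
```

If the drivers and CUDA runtime in the image are set up as described, this runs without any manual installation steps.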

Which Docker base images are available?

In the Open Beta we will offer one image supporting:
Docker + CUDA + PyTorch

We want to collect feedback from users during the Open Beta and then decide which base images will be provided in the future.

Update as of 20.02.2020: Open Beta started

For more information, see “AI Docker Instance: Open Beta Started”.

Help us grow

Create a free account, test our AI Docker Instance free of charge for 60 minutes, and then give us feedback in your account dashboard.

Among all accounts that give us feedback, we will raffle off three €100 Amazon vouchers :)

Thanks!

Many thanks to Maxi (video) and Manuel (photos) for their professional support.

Here are a few impressions from the video/photo shoot.


ExaMesh

The world’s first Instances powered by green electricity and housed in wind, solar and other green power plants.