Microsoft and Google Open Sourced These Frameworks Based on Their Work Scaling Deep Learning Training

Google and Microsoft have recently released new frameworks for distributed deep learning training.

Jesus Rodriguez
DataSeries


Source: https://neurohive.io/en/news/google-introduced-gpipe-new-library-for-efficiently-training-large-scale-neural-networks/

I recently started a new newsletter focused on AI education. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:

Microsoft and Google have been actively working on new approaches for training deep neural networks at scale. The result of that work is two new frameworks: Microsoft’s PipeDream and Google’s GPipe, which follow similar principles to scale the training of deep learning models. Both projects are detailed in their respective research papers (PipeDream, GPipe), which I will try to summarize in this article.
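The common principle behind both frameworks is pipeline parallelism: the model is partitioned into sequential stages placed on different accelerators, and each minibatch is split into micro-batches that flow through those stages so the accelerators can work concurrently instead of idling. The snippet below is a minimal conceptual sketch of that idea in PyTorch, not the actual GPipe or PipeDream API; the stage split, layer sizes, and micro-batch count are illustrative assumptions, and the stages run sequentially here rather than on separate devices.

```python
# Conceptual sketch of micro-batch pipeline parallelism (GPipe/PipeDream style).
# Illustrative only: in a real system each stage would live on its own GPU or
# machine and micro-batches from different stages would execute concurrently.
import torch
import torch.nn as nn

# A toy model split into two pipeline stages.
stage1 = nn.Sequential(nn.Linear(32, 64), nn.ReLU())  # would sit on device 0
stage2 = nn.Sequential(nn.Linear(64, 10))             # would sit on device 1

def pipelined_forward(batch, num_microbatches=4):
    """Split the minibatch into micro-batches, push each through the stages,
    and concatenate the per-micro-batch outputs."""
    outputs = []
    for microbatch in batch.chunk(num_microbatches):
        activations = stage1(microbatch)
        outputs.append(stage2(activations))
    return torch.cat(outputs)

batch = torch.randn(16, 32)
targets = torch.randint(0, 10, (16,))
loss = nn.CrossEntropyLoss()(pipelined_forward(batch), targets)
loss.backward()  # gradients accumulate across micro-batches before an optimizer step
```

Where the two designs differ is in how they schedule this pipeline: GPipe accumulates the micro-batch gradients and applies a synchronous update at the end of each minibatch, while PipeDream interleaves forward and backward passes of different minibatches and keeps multiple weight versions to keep the updates consistent.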

Training is one of those areas of the deep learning lifecycle that we don’t think of as…

