Introducing the Initial Release of PyTorch Lightning for Graphcore IPUs
PyTorch Lightning now supports Graphcore IPUs
We are thrilled to announce PyTorch Lightning now supports Graphcore IPUs. The team at PyTorch Lightning has been working heroically on building IPU integration over the last few months and is now making this available to the community with its 1.4 release. We really appreciate their close collaboration with the Graphcore team to help us with our mission to make IPUs easier to use for developers.
At Graphcore, we are hugely supportive of PyTorch Lightning’s mission to meet the growing demands from the AI research community for flexible and fast AI compute solutions.
PyTorch Lightning liberates data scientists and deep learning practitioners from heavy engineering work (data distribution, training-loop management, logging and much more) and allows them to focus on modelling and understanding their data; in other words, to spend more time on research.
This new integration means PyTorch Lightning users can now run their PyTorch models on the IPU through our PopTorch API with minimal code changes, while getting the same high performance.
PopTorch is a set of extensions for PyTorch that enables PyTorch models to run directly on IPU hardware, and it is designed to require as few code changes as possible.
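As a rough sketch of what this looks like in practice: PopTorch wraps a standard PyTorch model, and training models conventionally return the loss from forward. The SimpleNet model and its shapes below are illustrative, and the poptorch calls in the comments require the Poplar SDK to be installed.

```python
import torch
from torch import nn

# A standard PyTorch model. PopTorch training models compute the loss inside
# forward (a common PopTorch idiom), so we bundle the loss computation in.
class SimpleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)
        self.loss_fn = nn.CrossEntropyLoss()

    def forward(self, x, labels=None):
        out = self.fc(x)
        if labels is None:
            return out          # inference: just the logits
        return out, self.loss_fn(out, labels)  # training: logits + loss

# On an IPU system the same model runs via PopTorch with minimal changes
# (sketch only; requires `import poptorch` from the Poplar SDK):
#
#   opts = poptorch.Options()
#   optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
#   training_model = poptorch.trainingModel(model, options=opts,
#                                           optimizer=optimizer)
#   output, loss = training_model(inputs, labels)
```

The key point is that the model definition itself stays plain PyTorch; PopTorch only wraps it for execution on the IPU.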
Why we love PyTorch Lightning
We love PyTorch Lightning for the same reason AI researchers do: it's a simple wrapper that removes the complexity of training loops for PyTorch models and abstracts the underlying platform, so the user experience on IPUs is much closer to that on other platforms.
It removes a lot of boilerplate code, which leads to a cleaner, easier-to-use implementation.
PyTorch Lightning requires the user to specify a model function, just as we do with our PopTorch configuration of PyTorch models, so it’s a familiar user experience.
It doesn’t interfere with IPU-specific optimizations and decompositions so PyTorch Lightning models on the IPU get similar high performance to standard PyTorch models for IPU.
How to get started
Install PyTorch Lightning: https://github.com/PyTorchLightning/pytorch-lightning#step-0-install
By default, PyTorch Lightning will install the latest version of PyTorch. To ensure that the version of PyTorch supported by PopTorch is installed, you should either pass --no-deps to pip3 when installing PyTorch Lightning or install the supported version of PyTorch afterwards.
pip3 install pytorch-lightning
pip3 uninstall torch
pip3 install torch==1.7.1+cpu -f
Install the Poplar SDK as described in the relevant “Getting Started” guide for your IPU system on the Graphcore documentation portal.
PopTorch comes packaged with the Poplar SDK as an installable wheel file. The full instructions for validating that it has been installed correctly can be found in the PopTorch user guide.
PopTorch currently uses PyTorch version 1.7.1, and we will increment this over future releases. Some packages may install newer versions of PyTorch, in which case you may have to pip install the supported version. See the Version Compatibility section of the PopTorch user guide's installation instructions.
pip >= 18.1 is required for PopTorch dependencies to be installed properly.
Running a basic example
The following code example shows how to train a simple model on MNIST.
How to Get Started on IPUs
University researchers can apply to Graphcore’s Academic Programme for the opportunity to access IPUs. The programme is designed to support academics conducting and publishing research using IPUs or in their coursework or teaching. Researchers selected to participate will benefit from free access to Graphcore’s IPU compute platform in the cloud, as well as software tools and support.
To learn more about how to run a PyTorch Lightning model on the IPU with a single line of code, read our latest developer tutorial walkthrough blog.
Resources and Links