PyTorch Lightning 0.7.1 Release and Venture Funding

Venture Funding

Thank you!

As we step into this new stage in the Lightning journey, I want to take the time to thank my research advisors Kyunghyun Cho and Yann LeCun for being super supportive and allowing me to focus a significant portion of my time on developing and improving Lightning.

What about open-source?

This is great news for the future of Lightning as an open-source project. The contributors and I have been maintaining the package on a part-time basis, and it has quickly become hard to keep up with the volume of community requests without a full-time team. The funding lets us hire additional engineers and ship features faster, all while remaining open-source.

Contributor Types

As we expand the team, we want to highlight the different contributor roles for Lightning.

Investors

We’re lucky to have partnered with investors who share our values: continuing to grow an awesome community around Lightning and building best-in-class research tools for it.

TPU Support

Perhaps one of the biggest features of this release is support for TPUs. Without changing a single line of code, you can now run the same Lightning code on CPUs, GPUs, and TPUs.
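A minimal sketch of what this looks like in practice. The flag names below (gpus, num_tpu_cores) follow my reading of the 0.7.x-era Trainer API and are illustrative; check the docs for the exact arguments in your version.

```python
import pytorch_lightning as pl

# The research code (your LightningModule) stays identical;
# only the Trainer flags change per hardware target.
trainer_cpu = pl.Trainer()                 # plain CPU run
trainer_gpu = pl.Trainer(gpus=4)           # 4 GPUs
trainer_tpu = pl.Trainer(num_tpu_cores=8)  # 8 TPU cores

# trainer_tpu.fit(model)  # `model` is any LightningModule (hypothetical here)
```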

Profiling

How many times have you wondered where bottlenecks are lurking in your code? Is data loading slow, or is it the training step? Our core contributor Jeremy Jordan added an amazing profiler to the framework.

  1. Set profiler=True when constructing the Trainer (see the sketch below).
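A minimal sketch, assuming the profiler=True flag described above; the exact report format may differ between versions.

```python
from pytorch_lightning import Trainer

# profiler=True turns on the built-in profiler; at the end of training it
# reports how much time was spent in each part of the loop
# (data loading, training step, validation, ...).
trainer = Trainer(profiler=True)
# trainer.fit(model)  # `model` is any LightningModule (hypothetical here)
```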

Callbacks

Although Lightning itself is just a sequence of structured callbacks, the community wanted a cleaner way to encapsulate the non-essential logic needed for training. Lightning code now falls into three categories:

  1. Research code (LightningModule)
  2. Engineering code (Trainer)
  3. Non-essential research code (Callbacks)
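Here is a minimal sketch of a custom callback. The hook names and signatures follow the Callback API as I recall it; they may differ slightly in older releases, and PrintingCallback is just an illustrative name.

```python
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import Callback

class PrintingCallback(Callback):
    """Non-essential logic lives here instead of cluttering the LightningModule."""

    def on_train_start(self, trainer, pl_module):
        print("Training is starting")

    def on_train_end(self, trainer, pl_module):
        print("Training is done")

# Callbacks are handed to the Trainer, so the research code stays untouched.
trainer = Trainer(callbacks=[PrintingCallback()])
```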

Fit with DataLoaders

For some production cases, it’s useful to NOT define the dataloaders inside the LightningModule. Lightning now supports passing those dataloaders directly to the .fit function.
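A sketch of the idea, assuming the 0.7.x-style .fit keyword arguments (train_dataloader, val_dataloaders); train_dataset, val_dataset, model, and trainer are hypothetical placeholders.

```python
from torch.utils.data import DataLoader

# `train_dataset` and `val_dataset` are ordinary torch Datasets (hypothetical here).
train_loader = DataLoader(train_dataset, batch_size=32, shuffle=True)
val_loader = DataLoader(val_dataset, batch_size=32)

# The LightningModule no longer needs to know where its data comes from.
trainer.fit(model, train_dataloader=train_loader, val_dataloaders=val_loader)
```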

New Loggers

Lightning doesn’t just support TensorBoard. This release adds support for more loggers, giving users a choice of experiment tracking and visualization backends.
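For illustration, swapping trackers looks roughly like this; the import path (pytorch_lightning.loggers) and logger class names are my assumptions about the release-era API, and "logs/", "my_experiment", and "my_project" are placeholder values.

```python
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TensorBoardLogger, WandbLogger

# Swap the experiment tracker without touching the rest of your code.
tb_logger = TensorBoardLogger(save_dir="logs/", name="my_experiment")
wandb_logger = WandbLogger(project="my_project")  # requires wandb installed

trainer = Trainer(logger=wandb_logger)
```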

Multiple loggers at once

You can now also use multiple loggers at once by passing a list of loggers to the Trainer.
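A minimal sketch, assuming the same logger classes as above and that the Trainer accepts a list for its logger argument.

```python
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TensorBoardLogger, WandbLogger

# Every logged metric is sent to each logger in the list.
trainer = Trainer(logger=[
    TensorBoardLogger(save_dir="logs/"),
    WandbLogger(project="my_project"),  # "my_project" is a placeholder
])
```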

Thorough Documentation

The last release was a transition release for us in terms of documentation. Now, we’ve fully documented the codebase, with a ton of examples and mini-tutorials on how to use each feature.

Torchbearer Merger

During this release we also had the pleasure of welcoming the founders of Torchbearer (Ethan Harris and Matthew Painter) to the Lightning core team, where they bring their experience developing DL frameworks to make Lightning even more powerful and flexible.

How to get involved

The Lightning community has grown tremendously over the last few months. Feel free to submit PRs when a feature isn’t supported or you have an idea for something new!

William Falcon

⚡️PyTorch Lightning Creator • PhD Student, AI (NYU, Facebook AI Research).