Richard Liaw in Towards Data Science: "How to tune Pytorch Lightning hyperparameters" (Aug 18, 2020). Use Ray Tune to optimize Pytorch Lightning hyperparameters in 30 lines of code!
Richard Liaw in Distributed Computing with Ray: "Configuring and Scaling ML with Hydra + Ray" (Jan 26, 2021). Launch your Hydra applications on the cloud with the new Hydra-Ray integration!
Richard Liaw in Distributed Computing with Ray: "Ray for Windows" (Sep 26, 2020). Ray supports Windows natively as of June 2020! Try Ray on Windows via `pip install -U ray`.
Richard Liaw in Distributed Computing with Ray: "Faster and Cheaper Pytorch with RaySGD" (Apr 7, 2020). Distributed training is annoying to set up and expensive to run. Here's a library to make distributed Pytorch training simple and cheap.
Richard Liaw in riselab: "Scaling Experiments at Berkeley AI Research" (Jan 21, 2020). As AI research becomes more compute-heavy, AI researchers become resource-constrained. How do Berkeley AI researchers remain productive?
Richard Liaw in riselab: "Cutting edge hyperparameter tuning with Ray Tune" (Aug 20, 2019). Introducing Ray Tune, the state-of-the-art hyperparameter tuning library for researchers and developers to use at any scale.
Richard Liaw in Towards Data Science: "Ray Tune: a Python library for fast hyperparameter tuning at any scale" (Aug 18, 2019). How do you tune hyperparameters with thousands of cores in just 18 lines of code?