Machine Learning Weekly Review №8 — a curated digest of the latest credible papers, videos, and projects in machine learning for scientists and engineers.

Recommended this week


  1. Vega-Lite 2.0 — high-level language for rapidly creating interactive visualizations. Built by UW Interactive Data Lab.
  2. TensorBoard for PyTorch. Supports scalar, image, histogram, audio, text, graph, and embedding summaries.
  3. PyTorch-madrl — PyTorch implementations of single- and multi-agent RL algorithms (e.g., A2C, ACKTR, DQN, DDPG, PPO).
  4. Colaboratory — interactive coding in your browser by Google Developers. A more feature-rich alternative to Jupyter notebooks.
  5. Gluon — imperative deep learning API from AWS and Microsoft based on MXNet.
  6. GPyOpt — Python library for Bayesian optimization from the University of Sheffield.
  7. CoreMLTools — Apple's library for converting sklearn/Caffe/Keras/XGBoost models to the Core ML protobuf format.
  8. CuPy — NumPy-compatible matrix library accelerated by CUDA.
  9. LM-LSTM-CRF — high-performance, state-of-the-art character-aware sequence labeling in PyTorch.
  10. Bambi — high-level Bayesian model-building interface in Python, built on PyMC3 by Tal Yarkoni.
  11. xgboostExplainer — R package that makes XGBoost as interpretable as a single decision tree.
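
A quick sketch of what "NumPy-compatible" means for CuPy (item 8): code written against the shared API runs on GPU when CuPy is available and on CPU otherwise. The try/except fallback and the `normalize` helper are illustrative assumptions, not part of either library.

```python
# CuPy mirrors the NumPy API, so the same function can be device-agnostic.
# The import fallback below is an illustrative pattern, not a requirement.
try:
    import cupy as xp  # GPU-backed, NumPy-compatible arrays
except ImportError:
    import numpy as xp  # CPU fallback

def normalize(a):
    """Scale each row of a 2-D array to unit L2 norm."""
    norms = xp.sqrt((a * a).sum(axis=1, keepdims=True))
    return a / norms

a = xp.asarray([[3.0, 4.0], [6.0, 8.0]])
print(normalize(a))  # each row becomes [0.6, 0.8]
```

Because only the import line differs, libraries often accept either array type and dispatch on it rather than hard-coding NumPy.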

Research Papers

  1. “How deep learning works — The geometry of deep learning”. Draws analogies between deep learning, the geometry of quantum computation, and diffeomorphic template matching.
  2. “Dynamic Routing Between Capsules”. CapsNet — Geoff Hinton's new twist on neural networks that generalizes better.
  3. “One pixel attack for fooling deep neural networks”. A single-pixel perturbation fools the network on 73.8% of test images with 98.7% confidence.
  4. “Progressive Growing of GANs for Improved Quality, Stability, and Variation”. NVIDIA's GAN generates CelebA face images at 1024×1024 px.
  5. “Generative Adversarial Networks: An Overview”.
  6. “Understanding Generalization and Stochastic Gradient Descent”. Explores “generalization gap” & optimum batch size.
  7. “TFX: A TensorFlow-Based Production-Scale Machine Learning Platform”.
  8. “Rainbow: Combining Improvements in Deep Reinforcement Learning” by DeepMind.
  9. “Learning Diverse Skills via Maximum Entropy Deep RL” on multimodal learning in RL by UC Berkeley.
  10. “Unsupervised Machine Translation Using Monolingual Corpora Only”. Reports a 32.8 BLEU score without using any parallel sentences at training time.
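
The core idea behind the one-pixel attack (paper 3) can be sketched in a few lines. The paper searches with differential evolution against a real CNN; the exhaustive search and the tiny linear "classifier" below are illustrative stand-ins, not the authors' setup.

```python
# Toy sketch of a one-pixel attack: find a single (pixel, value) change
# that flips a classifier's decision. The linear scorer and exhaustive
# search are illustrative simplifications of the paper's DE-based attack.

def classify(img, weights):
    """Dummy binary classifier: class 1 if the weighted pixel sum is positive."""
    score = sum(w * p for w, p in zip(weights, img))
    return 1 if score > 0 else 0

def one_pixel_attack(img, weights, values=(0.0, 1.0)):
    """Try every (pixel index, value) pair; return the first that flips the label."""
    original = classify(img, weights)
    for i in range(len(img)):
        for v in values:
            perturbed = list(img)
            perturbed[i] = v
            if classify(perturbed, weights) != original:
                return i, v
    return None  # no single-pixel change flips this input

img = [0.2, 0.1, 0.3, 0.1]       # a flattened 2x2 "image"
weights = [1.0, 1.0, -5.0, 1.0]  # pixel 2 dominates with a large negative weight
print(one_pixel_attack(img, weights))  # (2, 0.0): zeroing pixel 2 flips the class
```

The surprising result in the paper is that this extremely constrained perturbation budget (one pixel in a 32×32 image) still succeeds so often against deep networks.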

Posts, Articles, Tutorials

  1. “The State of Data Science & Machine Learning”. An industry-wide survey by Kaggle based on 16,000 responses.
  2. “Approaching (Almost) Any NLP Problem on Kaggle”.
  3. “Out-of-sample prediction for linear model with missing data” by Junpeng Lao using PyMC3.
  4. “Word embeddings in 2017: Trends and future directions” by Sebastian Ruder.
  5. “How Adversarial Attacks Work” by Roman Trusov.
  6. “Bayesian Decision Theory Made Ridiculously Simple”.
  7. “Selected papers structured by NLP task” by Kyubyong Park.
  8. “Colorizing B&W Photos with Neural Networks” by Emil Wallnér.
  9. “Speech Recognition Is Not Solved”. Covers remaining challenges such as accents, noise, semantic errors, and multiple speakers.
  10. “The End of Human Doctors — The Bleeding Edge of Medical AI Research”.
  11. “Probabilistic programming: an annotated bibliography”.

Video Lectures and Talks

  1. “PyTorch Zero to All Crash Course” by Sung Kim. Slides. Code.
  2. “Recent Advances, Frontiers and Future of Deep RL”. Talk at the UC Berkeley Deep RL Bootcamp by a DeepMind instructor.
  3. “Hands-on PyTorch workshop” by Luca Antiga and Soumith Chintala.
  4. Video walkthroughs of each chapter of Ian Goodfellow's “Deep Learning” book.
  5. “Information Theory of Deep Learning” talk by Prof. Naftali Tishby at Yandex.
  6. “PyTorch: Fast Differentiable Dynamic Graphs in Python” by Soumith Chintala.
  7. Videos from Cognitive Computational Neuroscience (CCN) 2017.
  8. “Deep Learning and Backprop in the Brain” by Yoshua Bengio at CCN 2017.

Free Books

  1. “Introduction to High-Performance Scientific Computing” by Victor Eijkhout. Contains both theory and practical tutorials.

You might also like

Facebook Group: Machine Learning Review