Deep Hunt — Issue #61
It’s been a relaxed and fun holiday week — Fellowship AI program expands worldwide; Yoshua Bengio says Universities deserve more credit for AI research; Model Depot repository; Stochastic Adaptive Neural Architecture Search for Keyword Spotting
News
If you want to build a career in Machine Learning, check out Fellowship.AI. The program is now expanding worldwide, and 26th Nov is the last day to apply for the Jan 2019 cohort.
Articles
AI Pioneer Yoshua Bengio Says Universities Deserve More Credit
There’s no doubt that universities are key contributors to AI research, so why aren’t they given enough credit? See what Yoshua Bengio has to say.
Tutorials, Tips and Tricks
At some point, we all go searching for open-source ML models, maybe to try one out and see how it works. Here’s a platform for discovering, sharing, and discussing easy-to-use, pre-trained machine learning models.
BigGAN’s generator was open-sourced last week, and the community has gone wild ever since. Many researchers have been playing around with this collaborative tool to explore its latent space, reporting strange sightings!
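Exploring a generator’s latent space usually means interpolating between latent vectors and decoding each intermediate point. A minimal NumPy sketch of the interpolation step (the `generate` call at the end is a hypothetical stand-in for the BigGAN generator, not its actual API):

```python
import numpy as np

def slerp(z0, z1, t):
    """Spherical interpolation between two latent vectors.
    Often preferred over linear interpolation for Gaussian latents,
    since it keeps intermediate points at a typical norm."""
    cos_omega = np.dot(z0 / np.linalg.norm(z0), z1 / np.linalg.norm(z1))
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return (1.0 - t) * z0 + t * z1  # vectors nearly parallel: fall back to lerp
    return (np.sin((1.0 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

rng = np.random.default_rng(0)
z_start = rng.standard_normal(128)
z_end = rng.standard_normal(128)

# 8 evenly spaced points along the path between the two latents.
path = [slerp(z_start, z_end, t) for t in np.linspace(0.0, 1.0, 8)]
# Each z in `path` would then be decoded, e.g. images = [generate(z) for z in path]
```

The “strange sightings” people report are exactly the images decoded from these in-between latents.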
Tencent recently put out the largest open-source multi-label image database, including 17,609,752 training and 88,739 validation image URLs, which are annotated with up to 11,166 categories. Will this be the new ImageNet?
Research
Night Sight: Seeing in the Dark on Pixel Phones
In this relatively layman-friendly blog post, Google explains how computational photography and machine learning can be combined into a practical application for end users.
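A core idea behind burst low-light photography is averaging many short, aligned exposures so that noise cancels while the scene signal adds up. A simplified illustration of that noise-reduction effect (this is a toy sketch, not Google’s actual pipeline, which also handles alignment, motion, and white balance):

```python
import numpy as np

rng = np.random.default_rng(42)
scene = rng.uniform(0.0, 0.1, size=(64, 64))  # dim scene: signal is weak

# Capture N short frames, each corrupted by noise comparable to the signal.
N, sigma = 15, 0.05
frames = [scene + rng.normal(0.0, sigma, scene.shape) for _ in range(N)]

single_err = np.abs(frames[0] - scene).mean()   # error of one noisy frame
merged = np.mean(frames, axis=0)                # noise std shrinks by ~1/sqrt(N)
merged_err = np.abs(merged - scene).mean()

print(f"one frame: {single_err:.4f}, merged: {merged_err:.4f}")
```

With 15 frames the merged error drops by roughly a factor of sqrt(15) ≈ 3.9, which is why a burst of short exposures can reveal detail a single shot cannot.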
Stochastic Adaptive Neural Architecture Search for Keyword Spotting
Here’s interesting research that explores how to save computation by learning to choose the neural network architecture at each time-step. You can also go through their code implementation on GitHub.
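The gist of per-time-step architecture selection is a controller that routes each input to a cheap or an expensive sub-network, spending compute only where the input seems hard. A toy sketch of that routing pattern (the function names, costs, and difficulty heuristic here are illustrative assumptions, not the paper’s learned method):

```python
import numpy as np

rng = np.random.default_rng(0)

def cheap_net(x):
    """Small model: low cost, good enough for 'easy' frames."""
    return float(x.mean() > 0.0), 1           # (prediction, cost units)

def big_net(x):
    """Large model: high cost, reserved for 'hard' frames."""
    return float(np.tanh(x).mean() > 0.0), 10

def controller(x, threshold=0.2):
    """Toy gating rule: route ambiguous (low-energy) frames to the big net.
    In the paper this choice is learned, not hand-coded."""
    return big_net if abs(x.mean()) < threshold else cheap_net

stream = [rng.standard_normal(16) for _ in range(100)]
total_cost = 0
for frame in stream:
    pred, cost = controller(frame)(frame)
    total_cost += cost

print(total_cost)  # less than the 1000 units an always-big policy would spend
```

Because easy frames take the 1-unit path, the average per-step cost stays well below that of always running the large model, which is the computational saving the paper targets at the keyword-spotting scale.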
If you like what you are reading, please follow and recommend it to your friends, or give a shoutout on Twitter! I’d be glad to hear your suggestions and recommendations @deephunt_in or in the comments below!