Ring in the New Year With These Free AI Resources
Open-Sourced Datasets | 2019 in Review: 10 Open-Sourced AI Datasets
High-quality data is the fuel that keeps the AI engine running — and the machine learning community can’t get enough of it. In the conclusion to our year-end series, Synced spotlights ten datasets that were open-sourced in 2019.
AI YouTube Channels | 2019 in Review: 10 Essential AI YouTube Channels
The year 2019 saw unprecedented growth in YouTube educational content on artificial intelligence. Synced has selected 10 AI-oriented YouTube channels that we hope will give our readers a cozy holiday binge-watching session.
Online Courses | Free AI Courses
The Elements of AI free online course
Deeplearning.ai | Deep Learning
Deeplearning.ai | TensorFlow in Practice
Deeplearning.ai | TensorFlow: Data and Deployment
Deeplearning.ai | AI For Everyone
10 Free Top Notch Natural Language Processing Courses
Top AI Courses on Coursera
Top AI Courses on Udacity
AI Conferences | Conferences You Shouldn’t Miss
January 7–10 | 2020 International CES
February 7–12 | The Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)
April 26–30 | The International Conference on Learning Representations (ICLR 2020)
May 31 – June 4 | International Conference on Robotics and Automation (ICRA 2020)
June 16–18 | 2020 Conference on Computer Vision and Pattern Recognition (CVPR 2020)
June 22–23 | International Conference on Computer Vision (ICCV 2020)
July 12–17 | Thirty-Seventh International Conference on Machine Learning (ICML 2020)
August 23–28 | The 2020 European Conference on Computer Vision (ECCV 2020)
TBD | The Annual Conference on Neural Information Processing Systems (NeurIPS 2020)
Reformer: The Efficient Transformer
Researchers have introduced two techniques, locality-sensitive-hashing (LSH) attention and reversible residual layers, to improve the efficiency of Transformers. The resulting model, the Reformer, performs on par with standard Transformer models while being far more memory-efficient and much faster on long sequences.
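The core idea behind LSH attention is that each token only needs to attend to tokens its query is likely to score highly against, so similar vectors are hashed into shared buckets and attention is computed within buckets. The following numpy sketch is purely illustrative: the single-round random-projection hash and the bucketed softmax are simplified stand-ins for the paper's multi-round, chunked scheme.

```python
import numpy as np

def lsh_buckets(vectors, n_buckets, seed=0):
    """Assign each vector to a hash bucket via a random projection,
    so that nearby vectors tend to share a bucket (a simplified,
    single-round version of the hashing used in LSH attention)."""
    rng = np.random.default_rng(seed)
    d = vectors.shape[-1]
    # Project onto n_buckets/2 random directions; argmax over
    # [proj, -proj] picks one of n_buckets angular buckets.
    proj = vectors @ rng.normal(size=(d, n_buckets // 2))
    return np.argmax(np.concatenate([proj, -proj], axis=-1), axis=-1)

def bucketed_attention(x, n_buckets=4):
    """Attend only within buckets: cost scales with the sum of squared
    bucket sizes rather than the square of the full sequence length."""
    buckets = lsh_buckets(x, n_buckets)
    out = np.zeros_like(x)
    for b in np.unique(buckets):
        idx = np.where(buckets == b)[0]
        q = k = v = x[idx]  # queries and keys are shared, as in the Reformer
        scores = q @ k.T / np.sqrt(x.shape[-1])
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        out[idx] = weights @ v
    return out
```

Because identical vectors always hash to the same bucket, the within-bucket softmax recovers most of the high-scoring attention pairs while skipping the rest of the quadratic score matrix.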
Continual Learning with Hypernetworks
Artificial neural networks suffer from catastrophic forgetting when they are sequentially trained on multiple tasks. To overcome this problem, researchers present a novel approach based on task-conditioned hypernetworks, i.e., networks that generate the weights of a target model based on task identity.
(ETH Zürich and University of Zürich)
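The mechanism can be sketched in a few lines: a small hypernetwork maps a per-task embedding to the flattened weights of a target model, so switching tasks means swapping embeddings rather than overwriting the target weights. The sizes, the one-hidden-layer hypernetwork, and the single linear target layer below are toy assumptions for illustration, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target model: one linear layer y = W x + b.
in_dim, out_dim, emb_dim, hidden = 4, 3, 8, 16
n_target_params = out_dim * in_dim + out_dim

# Hypernetwork: a one-hidden-layer MLP mapping a learned task
# embedding to the target model's flattened parameters.
H1 = rng.normal(scale=0.1, size=(emb_dim, hidden))
H2 = rng.normal(scale=0.1, size=(hidden, n_target_params))

def target_forward(task_embedding, x):
    """Generate the target layer's weights from the task embedding,
    then apply them to the input x."""
    flat = np.tanh(task_embedding @ H1) @ H2
    W = flat[:out_dim * in_dim].reshape(out_dim, in_dim)
    b = flat[out_dim * in_dim:]
    return W @ x + b

# One embedding per task; training updates the embeddings and H1, H2,
# while earlier tasks are protected by keeping their generated weights
# close to what they were before (the paper's regularizer, omitted here).
task_embeddings = rng.normal(size=(2, emb_dim))  # two tasks
x = rng.normal(size=in_dim)
y_task0 = target_forward(task_embeddings[0], x)
y_task1 = target_forward(task_embeddings[1], x)
```

The appeal for continual learning is that the number of stored parameters grows only by one small embedding per task, yet each task effectively gets its own target network.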
Neural Painters: A Learned Differentiable Constraint for Generating Brushstroke Paintings
Researchers explore neural painters, a generative model for brushstrokes learned from a real non-differentiable and non-deterministic painting program. The research team shows that when training an agent to “paint” images using brushstrokes, using a differentiable neural painter leads to much faster convergence.
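The reason a differentiable painter helps is that gradients can flow from the rendered canvas back to the stroke parameters the agent chose. As a minimal stand-in for a learned neural painter (not the paper's model), one can render a "stroke" as a soft Gaussian blob whose every pixel is a smooth function of the stroke's center and radius:

```python
import numpy as np

def render_stroke(params, size=32):
    """Render one 'brushstroke' as a soft Gaussian blob on a
    size x size canvas. Each pixel varies smoothly with the stroke
    parameters (center cx, cy and radius r), so a gradient-based
    agent can be trained through the renderer."""
    cx, cy, r = params
    ys, xs = np.mgrid[0:size, 0:size] / (size - 1)  # coordinates in [0, 1]
    dist2 = (xs - cx) ** 2 + (ys - cy) ** 2
    return np.exp(-dist2 / (2 * r ** 2))

canvas = render_stroke((0.5, 0.5, 0.1))
```

A real painting program gives no such gradient (and may not even be deterministic), which is why the paper learns a neural approximation of it and trains the painting agent through that approximation instead.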
You May Also Like
CMU Senior Develops World’s First Classical Chinese Programming Language
In an attempt to add some diversity to the range of available programming languages, Carnegie Mellon University computer science major Lingdong Huang has developed ‘Wenyan-Lang,’ a programming language based on Chinese hanzi characters and the wenyan classical Chinese grammar system.
Facebook PointRend: Rendering Image Segmentation
The Facebook AI Research team has introduced a new “point-based rendering” neural network module with an iterative subdivision algorithm that can be integrated with state-of-the-art image segmentation models.
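The key step in the subdivision loop is choosing where to refine: rather than upsampling the whole mask uniformly, the module re-predicts labels only at the points where the coarse mask is least certain, typically along object boundaries. The sketch below illustrates just that selection step (uncertainty taken as closeness to probability 0.5), with a hypothetical toy mask; it is not the full PointRend module.

```python
import numpy as np

def most_uncertain_points(coarse_probs, k):
    """Pick the k pixels whose foreground probability is closest to 0.5,
    i.e. where the coarse mask is least certain. These are the points
    a subdivision step would re-predict at higher resolution."""
    uncertainty = -np.abs(coarse_probs - 0.5)          # higher = less certain
    flat = np.argsort(uncertainty.ravel())[::-1][:k]   # top-k uncertain pixels
    return np.stack(np.unravel_index(flat, coarse_probs.shape), axis=-1)

# Toy coarse mask: confident background (0.05), confident foreground
# (0.95), and one ambiguous boundary pixel (0.5).
probs = np.full((4, 4), 0.05)
probs[1:3, 1:3] = 0.95
probs[1, 3] = 0.5
points = most_uncertain_points(probs, k=1)
```

Because confident interior and background pixels are skipped, each subdivision round spends its compute almost entirely on boundary pixels, which is where coarse masks lose detail.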
Global AI Events
January 7–10: CES 2020 in Las Vegas, United States
February 7–12: AAAI 2020 in New York, United States
February 24–27: Mobile World Congress in Barcelona, Spain
March 23–26: GPU Technology Conference (GTC) in San Jose, United States
Global AI Opportunities
We know you don’t want to miss any story. Subscribe to our popular Synced Global AI Weekly to get weekly AI updates.
Need a comprehensive review of the past, present and future of modern AI research development? Trends of AI Technology Development Report is out!
2018 Fortune Global 500 Public Company AI Adaptivity Report is out!
Purchase a Kindle-formatted report on Amazon.
Apply for Insight Partner Program to get a complimentary full PDF report.