ICML Announces Best Paper Awards; What to Expect at CVPR 2019

Synced · Jun 16

ICML 2019 | Google, ETH Zurich, MPI-IS, Cambridge & PROWLER.io Share Best Paper Honours
ICML announced the recipients of its Best Paper Awards: one winning team from Google Research, ETH Zurich, and the Max Planck Institute for Intelligent Systems; the other from the University of Cambridge and PROWLER.io.



What to Expect at CVPR 2019
- Microsoft at CVPR 2019
- IBM Research AI at CVPR 2019
- Baidu at CVPR 2019
- NVIDIA Research at CVPR 2019
- Intel AI Research at CVPR 2019
- Facebook AI | Creating 2.5D Visual Sound for an Immersive Audio Experience
- CVPR 2019 Accepted Papers, Organized in a Parsable and Easier-to-Sort-Through Way

What Does BERT Look At? An Analysis of BERT’s Attention
Researchers propose methods for analyzing the attention mechanisms of pre-trained models and apply them to BERT. BERT’s attention heads exhibit patterns such as attending to delimiter tokens, specific positional offsets, or broadly attending over the whole sentence, with heads in the same layer often exhibiting similar behaviors.
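
For readers who want to poke at these patterns themselves, here is a minimal sketch using the Hugging Face transformers library (our choice for illustration; the paper ships its own analysis code). It pulls per-head attention maps out of BERT and measures how strongly each head in one layer attends to the [SEP] delimiter token:

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

inputs = tokenizer("The cat sat on the mat.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer, each shaped
# (batch, num_heads, seq_len, seq_len).
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
sep_index = tokens.index("[SEP]")

# Average attention each head in layer 5 pays to the [SEP] delimiter token.
layer5 = outputs.attentions[5][0]              # (num_heads, seq_len, seq_len)
to_sep = layer5[:, :, sep_index].mean(dim=-1)  # mean over query positions
for head, weight in enumerate(to_sep):
    print(f"layer 5, head {head}: mean attention to [SEP] = {weight:.3f}")
```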


Are Weights Really Important to Neural Networks?
As with the age-old “nature versus nurture” debate, AI researchers want to know whether architecture or weights play the main role in the performance of neural networks. In a blow to the “nurture” side, Google researchers have now demonstrated that a neural network which has not learned weights through training can still achieve satisfactory results in machine learning tasks.
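
The evaluation trick at the heart of this work is easy to sketch: instead of training per-connection weights, a fixed architecture is scored while every connection shares a single weight value. The toy topology, task, and weight sweep below are illustrative assumptions, not the authors' code:

```python
# Illustration of the weight-agnostic evaluation idea: a fixed, hand-picked
# topology is scored while every connection shares one weight value, so
# performance reflects the architecture rather than any training.
import numpy as np

M1 = np.array([[1.0, 0.0, 1.0],
               [0.0, 1.0, 1.0]])  # which input->hidden connections exist
M2 = np.array([[1.0],
               [1.0],
               [1.0]])            # which hidden->output connections exist

def forward(x, w):
    h = np.tanh(w * (x @ M1))     # hidden activations with shared weight w
    return np.tanh(w * (h @ M2))  # output with the same shared weight

def score(w, X, y):
    preds = forward(X, w).ravel()
    return -np.mean((preds - y) ** 2)  # negative MSE: closer to 0 is better

# Toy XOR-like task: the architecture is judged by its scores across a sweep
# of shared-weight settings, with no per-connection training at all.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
for w in (-2.0, -1.0, 0.5, 1.0, 2.0):
    print(f"shared weight {w:+.1f}: score = {score(w, X, y):.3f}")
```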


Episodic Memory in Lifelong Language Learning
Researchers introduce a lifelong language learning setup where a model needs to learn from a stream of text examples without any dataset identifier. They propose an episodic memory model that performs sparse experience replay and local adaptation to mitigate catastrophic forgetting in this setup.
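
The sparse-replay half of the idea can be sketched in a few lines of PyTorch. Everything here (buffer size, reservoir-style writes, replay schedule) is an illustrative assumption rather than the paper's implementation:

```python
import random
import torch

class EpisodicMemory:
    def __init__(self, capacity=10000):
        self.buffer, self.capacity, self.seen = [], capacity, 0

    def write(self, example):
        # Reservoir sampling keeps a uniform sample of the whole stream.
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        elif random.random() < self.capacity / self.seen:
            self.buffer[random.randrange(self.capacity)] = example

    def sample(self, k):
        return random.sample(self.buffer, min(k, len(self.buffer)))

def train_step(model, optimizer, loss_fn, batch, memory, step,
               replay_interval=100, replay_size=32):
    x, y = batch
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()
    for example in zip(x, y):  # write every seen example to memory
        memory.write(example)
    # Sparse replay: a rare extra gradient step on stored past examples,
    # which counteracts catastrophic forgetting at low cost.
    if step % replay_interval == 0 and memory.buffer:
        xs, ys = zip(*memory.sample(replay_size))
        optimizer.zero_grad()
        loss_fn(model(torch.stack(xs)), torch.stack(ys)).backward()
        optimizer.step()
```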

New Facebook PyTorch Hub Facilitates Reproducibility Testing
In a bid to provide a smoother reproduction experience, Facebook has announced the beta release of PyTorch Hub, a new pretrained model repository designed to facilitate research reproducibility testing.
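
In practice, loading a pretrained model through PyTorch Hub takes just a couple of lines; a minimal usage sketch:

```python
import torch

# List the models a repo exposes through its hubconf.py entry points.
print(torch.hub.list("pytorch/vision"))

# Load a pretrained ResNet-18 in one line and switch it to eval mode.
model = torch.hub.load("pytorch/vision", "resnet18", pretrained=True)
model.eval()
```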


BatchNorm + Dropout = DNN Success!
A group of researchers from Tencent Technology, the Chinese University of Hong Kong, and Nankai University recently combined two commonly used techniques — Batch Normalization (BatchNorm) and Dropout — into an Independent Component (IC) layer inserted before each weight layer to make inputs more independent.
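
Since the IC layer is just BatchNorm followed by Dropout, it is straightforward to sketch in PyTorch; the layer sizes and dropout rate below are illustrative assumptions:

```python
import torch.nn as nn

class ICLayer(nn.Module):
    """BatchNorm followed by Dropout, placed before a weight layer."""
    def __init__(self, num_features, drop_prob=0.1):
        super().__init__()
        self.bn = nn.BatchNorm1d(num_features)
        self.drop = nn.Dropout(drop_prob)

    def forward(self, x):
        # Normalize activations, then randomly zero a fraction of them,
        # pushing the next weight layer's inputs toward independence.
        return self.drop(self.bn(x))

# An MLP with an IC layer inserted before each weight (Linear) layer;
# the 784/256/10 sizes are illustrative.
net = nn.Sequential(
    ICLayer(784), nn.Linear(784, 256), nn.ReLU(),
    ICLayer(256), nn.Linear(256, 10),
)
```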

June 15–21: Computer Vision and Pattern Recognition (CVPR 2019) in Long Beach, United States

June 20–21: AI for Good Summit in San Francisco, United States

June 28: Research and Applied AI Summit (RAAIS) in London, United Kingdom

August 19–23: Knowledge Discovery and Data Mining (KDD2019) in London, United Kingdom

Research Scientist, Google Brain Toronto

OpenAI Seeking Software Engineers and Deep Learning Researchers

DeepMind is Recruiting

DeepMind Scholarship: Access to Science

Postdoctoral Researcher (AI) — Self-Supervised Learning

LANDING AI is recruiting



Our new report is out!
Purchase a Kindle-formatted report on Amazon.
Apply for the Insight Partner Program to get a complimentary full PDF report.


Follow us on Twitter @Synced_Global for daily AI news!


We know you don’t want to miss any stories. Subscribe to our popular newsletter to get weekly AI updates.

SyncedReview

We produce professional, authoritative, and thought-provoking content relating to artificial intelligence, machine intelligence, emerging technologies and industrial insights.

Written by Synced

AI Technology & Industry Review — syncedreview.com | Newsletter: goo.gl/Q4cP3B | Become Synced Insight Partner: goo.gl/ucXZDw | Twitter: @Synced_Global
