ACL 2019 Announces Best Papers; Nature Covers Tianjic Chip; Baidu Releases ERNIE 2.0

Synced | Published in SyncedReview | 3 min read | Aug 4, 2019

ACL 2019 | Best Papers Announced
The Association for Computational Linguistics (ACL) held its 57th annual meeting from July 28 to August 2 in Florence, Italy. The ACL 2019 organizing committee announced eight paper awards: Best Long Paper, Best Short Paper, Best Demo Paper, and five Outstanding Paper awards.
(Synced)

Nature Cover Story | Chinese Team’s ‘Tianjic Chip’ Bridges Machine Learning and Neuroscience in Pursuit of AGI
Respected scientific journal Nature boosted the case for AGI with a cover story on a new research paper, Towards artificial general intelligence with hybrid Tianjic chip architecture, which aims to advance AGI development through a generalized hardware platform that combines machine-learning and neuroscience-oriented approaches.
(Synced) / (Nature)

Baidu’s ERNIE 2.0 Beats BERT and XLNet on NLP Benchmarks
The updated ERNIE 2.0 model outperforms BERT and the recent XLNet (a generalized autoregressive pretraining model) on 16 tasks, including the English GLUE benchmark and a variety of Chinese-language tasks.
(Synced) / (Baidu) / (GitHub)

Technology

∂P: A Differentiable Programming System to Bridge Machine Learning and Scientific Computing
Researchers describe a Differentiable Programming (∂P) system that can take gradients of Julia programs, making automatic differentiation a first-class language feature. The system supports almost all language constructs and compiles high-performance code without requiring any user intervention or refactoring to stage computations.
(Julia Computing & MIT & Invenia Labs)
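To illustrate the general differentiable-programming idea (not the ∂P system itself, which operates on Julia programs), here is a minimal JAX sketch in which an ordinary program with control flow is differentiated end to end; the function names and the toy update rule are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def simulate(theta, x0, steps=10):
    """Toy iterative routine standing in for a scientific-computing kernel."""
    x = x0
    for _ in range(steps):           # ordinary control flow is differentiated through
        x = x + theta * jnp.sin(x)   # parameterized update step
    return x

def loss(theta):
    # Squared distance of the final state from an arbitrary target value.
    return (simulate(theta, x0=1.0) - 0.5) ** 2

# jax.grad turns the whole program into a function computing d(loss)/d(theta).
dloss = jax.grad(loss)
print(dloss(0.1))
```

The point of language-level autodiff is exactly this: the user writes a plain program and asks for its gradient, with no manual staging or refactoring of the computation.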

DeepMind Using AI to Give Doctors a 48-Hour Head Start on Life-Threatening Illness
Researchers develop a deep learning approach for the continuous risk prediction of future deterioration in patients, building on recent work that models adverse events from electronic health records and using acute kidney injury — a common and potentially life-threatening condition — as an exemplar.
(DeepMind) / (Nature)

On Mutual Information Maximization for Representation Learning
Researchers argue, and provide empirical evidence, that the success of recent mutual information (MI) maximization methods for representation learning may be only loosely attributable to the properties of MI itself, and that it depends strongly on the inductive bias in both the choice of feature extractor architecture and the parametrization of the employed MI estimators.
(Google Research)
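As a rough illustration of what "parametrization of the MI estimator" means in practice, below is a hedged JAX sketch of an InfoNCE-style lower bound with a simple bilinear critic; the critic form, dimensions, and function names are assumptions for illustration, not the specific setup studied in the paper.

```python
import jax
import jax.numpy as jnp

def infonce_lower_bound(W, z_x, z_y):
    """InfoNCE lower bound on I(X; Y) for a batch of paired representations.

    z_x[i] and z_y[i] come from the same sample; W parametrizes a bilinear
    critic f(x, y) = z_x W z_y^T, whose inductive bias shapes what is learned.
    """
    scores = z_x @ W @ z_y.T                       # critic scores for all pairs in the batch
    log_probs = jax.nn.log_softmax(scores, axis=1) # positives sit on the diagonal
    return jnp.mean(jnp.diag(log_probs)) + jnp.log(scores.shape[0])

# Toy usage: random paired representations; the bound is maximized by gradient
# ascent on the critic (and, in practice, on the feature extractors as well).
z_x = jax.random.normal(jax.random.PRNGKey(0), (32, 16))
z_y = z_x + 0.1 * jax.random.normal(jax.random.PRNGKey(1), (32, 16))
W = jnp.eye(16)
bound = infonce_lower_bound(W, z_x, z_y)
grad_W = jax.grad(infonce_lower_bound)(W, z_x, z_y)
```

Different critic choices (bilinear, MLP, inner product) give different bounds and different learned representations, which is the dependence on estimator parametrization that the paper highlights.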

You May Also Like

Robot Masters String Puppetry
A group of researchers from ETH Zurich has introduced a robotic system that can animate real-world string puppets, aka marionettes. “PuppetMaster” operates puppets in the traditional way, using control bars attached by wires or strings to a puppet hanging below.
(Synced)

Facebook AI Memory Layer Boosts Network Capacity by a Billion Parameters
Neural networks are widely used for complex tasks such as machine translation, image classification, and speech recognition. These networks are data-driven, and as the amount of data increases, so do network size and the computational complexity required for training and inference.
(Synced)

Global AI Events

August 19–23: Knowledge Discovery and Data Mining (KDD 2019) in London, United Kingdom
September 10–12: The AI Summit (Part of TechXLR8) in Singapore
September 24–28: Microsoft Ignite in Orlando, United States
October 27–November 3: International Conference on Computer Vision (ICCV) in Seoul, South Korea

The 2018 Fortune Global 500 Public Company AI Adaptivity Report is out!
Purchase a Kindle-formatted report on Amazon.
Apply for the Insight Partner Program to get a complimentary full PDF report.

Follow us on Twitter @Synced_Global for daily AI news!

We know you don’t want to miss any stories. Subscribe to our popular Synced Global AI Weekly to get weekly AI updates.

AI Technology & Industry Review — syncedreview.com | Newsletter: http://bit.ly/2IYL6Y2 | Share My Research http://bit.ly/2TrUPMI | Twitter: @Synced_Global