Facebook Transfer Learning Method Boosts Code Autocompletion Accuracy by Over 50%

Synced | SyncedReview | May 18, 2021

Autocompletion, where an application predicts the next item in a text input, has become a convenient and widely used tool in contemporary messaging and other writing tasks. It is also one of the most important features of an integrated development environment (IDE) for computer programming. Recent research has shown that autocompletion can be powered by deep learning, enabling software language models to achieve significant accuracy improvements by training on real-world datasets collected from programmers' IDE activity. A common issue with less popular programming languages, however, is that the available IDE datasets may be too small for training.

In the paper Improving Code Autocompletion with Transfer Learning, a research team from Facebook shows how transfer learning enables pretraining on non-IDE, non-autocompletion, and different-language code sequences before fine-tuning on the autocompletion prediction task. The proposed approach improves model accuracy by more than 50 percent on very small fine-tuning datasets and by more than 10 percent with 50k labelled examples.
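The core idea is that knowledge learned from plentiful generic code can warm-start a model that is then adapted on a small target dataset. The toy sketch below illustrates that pretrain-then-fine-tune flow with a bigram next-token counter standing in for the paper's actual neural language models; all function names, corpora, and data here are illustrative assumptions, not Facebook's code or datasets.

```python
# Toy illustration of pretraining on a large generic code corpus,
# then fine-tuning the same model state on a small target dataset.
# A simple bigram counter stands in for a real neural language model.
from collections import Counter, defaultdict

def train_counts(corpus, counts=None):
    """Accumulate bigram counts; pass existing counts to 'fine-tune'."""
    counts = counts if counts is not None else defaultdict(Counter)
    for snippet in corpus:
        toks = snippet.split()
        for prev, nxt in zip(toks, toks[1:]):
            counts[prev][nxt] += 1
    return counts

def predict(counts, prev_token):
    """Return the most likely next token, or None if unseen."""
    if prev_token not in counts:
        return None
    return counts[prev_token].most_common(1)[0][0]

# 1) "Pretrain" on a larger, generic corpus (in the paper: non-IDE,
#    non-autocompletion, or different-language code sequences).
pretrain_corpus = ["for i in range", "for x in items", "if x in items"]
model = train_counts(pretrain_corpus)

# 2) "Fine-tune" the same state on a small target autocompletion dataset.
finetune_corpus = ["for row in cursor"]
model = train_counts(finetune_corpus, counts=model)

# The prediction still benefits from pretraining statistics.
print(predict(model, "in"))  # → items
```

The point of the sketch is only the data flow: the fine-tuning step starts from the pretrained state rather than from scratch, which is what lets a small labelled dataset go further.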

