Xu MengXiang in KeyReply
XLNet — A new pre-training method outperforming BERT on 20 tasks
In 2018, Google published BERT, a bidirectional, transformer-based, pre-trained large-scale language model, breaking 11 state-of-the-art…
4 min read · Jun 20, 2019
Xu MengXiang
Not everyone needs to be a mechanic [repost]
We’re constantly reading articles asking whether kids should be taught coding. But we shouldn’t be aiming for a future where everyone needs…
1 min read · Apr 30, 2016