Xu MengXiang in KeyReply: XLNet — A new pre-training method outperforming BERT on 20 tasks
In 2018, Google published BERT, a bidirectional, transformer-based pre-trained large-scale language model, breaking 11 state-of-the-art…
Jun 20, 2019
Xu MengXiang: Not everyone needs to be a mechanic [repost]
We’re constantly reading articles asking whether kids should be taught coding. But we shouldn’t be aiming for a future where everyone needs…
Apr 30, 2016