This is going to be a short one. While others argue over GPT-2, we had to dig a little deeper into Google BERT. If you follow deep learning research, especially on the NLP side, BERT needs no introduction at this point. Since its initial release last October, dozens more resources have been added highlighting the significance of this work by Google. The official blog post itself is a very good resource for understanding the basics.
Beyond the work itself, I think the code release with good documentation (a not-so-common practice in research, sadly) contributed immensely to the popularity BERT gained in such a short period. The repository covers almost all the essentials, with very few exceptions. One is online prediction support. While searching through the issues and pull requests, I found this very useful issue comment. For our requirement, we had to implement an additional data processor for SST-2 (we used a custom version for the actual use case) and a helper class for online prediction, and then use them in a pipeline with multiple BERT models. Considering BERT's popularity across downstream NLP tasks, we thought of sharing our extensions with everyone. Of course, this is far from a proper code release, and I wouldn't call it production-ready. But if you were looking for the same thing, I hope this helps. I have added a Jupyter Notebook to demonstrate the usage of prediction.py. I hope it is simple enough, but do send a PR if you notice anything that needs to be changed.
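To give a flavor of what such a data processor looks like, here is a minimal sketch (not the actual code from our repository) modeled on the `DataProcessor` pattern in BERT's `run_classifier.py`. The class and method names follow that convention; `Sst2Processor` and the exact column layout of your SST-2 files are assumptions you would adapt to your own data.

```python
class InputExample:
    """A single example for sequence classification, as in run_classifier.py."""

    def __init__(self, guid, text_a, text_b=None, label=None):
        self.guid = guid      # unique id for the example
        self.text_a = text_a  # the sentence to classify
        self.text_b = text_b  # unused for single-sentence tasks like SST-2
        self.label = label    # string label, matching get_labels()


class Sst2Processor:
    """Sketch of a processor for SST-2 (binary sentiment), GLUE-style TSV."""

    def get_labels(self):
        # SST-2 is binary sentiment: "0" = negative, "1" = positive
        return ["0", "1"]

    def create_examples(self, rows, set_type):
        """Build InputExamples from (sentence, label) pairs.

        rows: iterable of (sentence, label) tuples, header already skipped.
        set_type: e.g. "train", "dev", or "test", used only in the guid.
        """
        examples = []
        for i, (sentence, label) in enumerate(rows):
            guid = f"{set_type}-{i}"
            examples.append(InputExample(guid=guid, text_a=sentence, label=label))
        return examples
```

In the official `run_classifier.py`, a processor like this would be registered in the `processors` dictionary so it can be selected with the `--task_name` flag; an online prediction helper would then reuse the same processor to turn incoming sentences into `InputExample`s before feeding them to the estimator.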
Thanks @brightmart for opening the issue and for sharing your script! Also kudos to BERT authors for their amazing work!
Thanks for reading. Follow our blog for more content on AI research and engineering.