1 line of Python code for BERT, ALBERT, ELMO, ELECTRA, XLNET, GLOVE, Part of Speech with NLU and t-SNE

Deep insights with John Snow Labs NLU library and minimalistic effort!

The amount of progress NLP has seen in recent years is tremendous: hundreds of powerful classifiers and new embeddings have been created by some of the smartest data scientists on the planet.
Accessing their research results and putting them to use can sometimes be a really tough challenge. This is why John Snow Labs’ NLU was created.

You just type:
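The one-liner below is a minimal sketch of NLU’s `load()`/`predict()` API as documented by John Snow Labs; the exact columns of the returned DataFrame may vary by NLU version.

```python
import nlu

# Load BERT word embeddings and run them over your data in one line.
# (Sketch of NLU's load()/predict() API; output columns may differ by version.)
predictions = nlu.load('bert').predict('NLU gives you embeddings in one line')

# The same one-liner accepts a Pandas DataFrame instead of a string:
# import pandas as pd
# predictions = nlu.load('bert').predict(pd.DataFrame({'text': ['first doc', 'second doc']}))
```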

That’s it! Your data could be:

  • A Pandas DataFrame
  • A Modin DataFrame
  • A Spark DataFrame
  • A Python string
  • A list of Python strings
  • A NumPy array of strings

This gives you endless possibilities to explore further!

  • Train new classifiers that achieve state-of-the-art results by leveraging the latest Deep Learning embeddings
  • Build whole applications around the NLU features
  • Visualize and compare embeddings with t-SNE or PCA to gain insight into and explore your dataset

The following shows a simple yet impressive visual comparison between the different embeddings, achievable with one line of NLU code and a little bit of plotting code!

All the NLU code you need for this is:
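A sketch of what that one-liner might look like, assuming NLU’s space-separated multi-model syntax and the `output_level` parameter from its documentation (model names here follow NLU’s model namespace and are illustrative):

```python
import nlu
import pandas as pd

df = pd.DataFrame({'text': ['NLU makes embeddings easy', 'One line is all it takes']})

# Load several embedding models plus a part-of-speech tagger at once;
# each model adds its own output column to the resulting DataFrame.
# output_level='token' asks NLU for one row per token rather than per document.
predictions = nlu.load('bert albert elmo electra xlnet glove pos').predict(df, output_level='token')
```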

If you want to make your own t-SNE visualizations with any of NLU’s embedding models, check out the accompanying article and notebook: BERT, ALBERT, ELMO, ELECTRA, XLNET, GLOVE and t-SNE plotting
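The plotting side can be sketched independently of NLU: given a matrix of word-embedding vectors (a random stand-in below, where NLU would supply a real embedding column), scikit-learn’s t-SNE projects them to 2D for a scatter plot.

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # render off-screen, no display needed
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

# Stand-in for an NLU embedding column: 50 tokens x 768-dimensional vectors.
rng = np.random.default_rng(42)
embeddings = rng.normal(size=(50, 768))

# Project to 2D; perplexity must be smaller than the number of samples.
coords = TSNE(n_components=2, perplexity=10, random_state=42).fit_transform(embeddings)

plt.scatter(coords[:, 0], coords[:, 1])
plt.title('t-SNE projection of token embeddings')
```

Swapping the random matrix for the embedding column of an NLU `predict()` result gives the comparison plots shown above.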


What we have learned

  • How to get BERT Word Embeddings in Python with NLU in 1 line
  • How to get ELECTRA Word Embeddings in Python with NLU in 1 line
  • How to get ALBERT Word Embeddings in Python with NLU in 1 line
  • How to get ELMO Word Embeddings in Python with NLU in 1 line
  • How to get XLNET Embeddings in Python with NLU in 1 line
  • How to get GLOVE Embeddings in Python with NLU in 1 line
  • How to get Part of Speech Labels in Python with NLU in 1 line
