Exploring HuggingFace Transformers For NLP With Python

LZP Data Science
Published in Geek Culture
6 min read · Mar 10, 2022
Internet adoption is expected to keep growing in the coming years. A forecast by Cisco (Cisco, 2022, January 23) projects that 66% of the global population will have access to the internet by 2023.

With that in mind, the amount of unstructured data produced in text files, emails, and similar sources is also likely to increase. A growing subfield called Natural Language Processing (NLP) is dedicated to extracting insights from exactly this kind of text data.

According to a report by Mordor Intelligence (Mordor Intelligence, 2021), the NLP market is expected to be worth USD 48.46 billion by 2026, registering a CAGR of 26.84% from 2021 to 2026.

This post explores how we can use a simple pre-trained transformer language model for some everyday NLP tasks in Python.

Let’s start by installing the Hugging Face transformers package. To avoid any dependency issues moving forward, one can install it in a virtual environment.

pip install transformers
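To confirm the installation worked, a quick sanity check is to import the package and print its version (the exact version number you see will depend on when you install):

```python
# Verify that the transformers package is importable.
import transformers

# Prints the installed version string, e.g. "4.x.y".
print(transformers.__version__)
```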

Once done, let’s look at performing some simple text classification.

Text Classification

For every application of Hugging Face Transformers, a pipeline would first have to be…
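As a minimal sketch of what such a pipeline looks like for text classification, the snippet below uses the library's `pipeline` helper with the "sentiment-analysis" task. The example sentence is illustrative; the default model behind this task is downloaded automatically on first use, so the first run requires an internet connection:

```python
from transformers import pipeline

# Build a text-classification pipeline; "sentiment-analysis" loads a
# default pre-trained model on first use (downloaded automatically).
classifier = pipeline("sentiment-analysis")

# The pipeline returns a list with one dict per input string,
# each containing a predicted "label" and a confidence "score".
result = classifier("Hugging Face makes NLP in Python straightforward.")[0]
print(result["label"], round(result["score"], 4))
```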
