Introduction to Hugging Face: A Starter’s Guide to Using Online Models

Anthony Demeusy
9 min read · Jan 18, 2024


Hugging Face has emerged as a prominent player¹ in the ever-evolving landscape of machine learning and artificial intelligence. Its versatile capabilities cater to a spectrum of AI practitioners, ranging from individual contributors to large global enterprises, accommodating users at all skill levels, from AI novices to seasoned professionals and researchers.

At the time of writing, the introductory content on the Hugging Face website offers only a limited perspective, expressed in phrases such as “The AI community building the future” and “We are on a mission to democratize good machine learning”. This provides a glimpse into the platform’s overarching goals and mission but lacks specific details about its features or functionalities.

Hugging Face’s organization card

Sometimes compared to a “GitHub for AI”², Hugging Face’s influence extends well beyond the brevity of these descriptions. To better illustrate its functionalities, this introductory article sheds light on four practical ways to begin your journey with Hugging Face and make use of online models. Through hands-on examples, it provides actionable insights that showcase the platform’s utility for newcomers.

Using Spaces

When available, the easiest way to use an online model hosted on Hugging Face is through Spaces. You can think of a Space as a simple graphical user interface that allows you to provide input data and obtain the outcome.

Let’s assume you are looking for a text sentiment analysis solution for some customer feedback. Here are a few examples from the dataset:

  • “The seamless integration of new features in the latest update enhances the overall user experience, showcasing the company’s commitment to continuous improvement.”
  • “Despite multiple attempts to troubleshoot, the persistent software glitches are frustrating and severely impact productivity, requiring urgent attention from the development team.”
  • “The recent product upgrade not only met but exceeded my expectations, demonstrating the company’s dedication to delivering high-quality solutions.”
  • “The lack of timely communication regarding the service outage left users feeling uninformed and frustrated, highlighting a significant need for improvement in customer communication strategies.”
  • “The standard shipping option met delivery expectations, neither impressing nor disappointing, providing a reliable and predictable service experience.”

First, open Hugging Face, log in or create your account, and simply type “customer sentiment analysis.”

You can review all Spaces matching this search or select one from the top results. For the purpose of this article, we’ll use the first Space, ‘Lettria’s Customer Sentiment Analysis’.

From here, you can submit customer reviews one by one and obtain the sentiment analysis outcome generated by a model running in the background.

While Spaces is a useful and user-friendly initial step, it relies solely on manual operations. However, automation and integration into a complete workflow are often necessary. Thankfully, you can also interact with the Space via an API.

Using Spaces API

In the footer, though not very visible, you can see a ‘Use via API’ shortcut.

This gives access to code that you can reuse in Python and JavaScript.

For instance, in Python, you need to run the first line of code once to install the gradio_client library in your environment, if it is not already available.

pip install gradio_client

Then, to call the API, you can copy the second block of code and create a variable, or directly replace the input with the text you want to analyze. For instance, you can run:

from gradio_client import Client

customer_review = "The seamless integration of new features in the latest update enhances the overall user experience, showcasing the company's commitment to continuous improvement."

client = Client("https://lettria-customer-sentiment-analysis.hf.space/")
result = client.predict(
    customer_review,  # str in 'Customer Review' Textbox component
    api_name="/predict"
)
print(result)

which returns a result similar to:

C:\[…]\Temp\gradio\tmp0ikbepfw.json

As you can see, the result variable does not directly contain the sentiment analysis result. Instead, it holds a path to a file, downloaded to your local environment, that contains the model’s output. This approach makes sense when you consider that the outcome may not be in a text-based format; it could be an image or an audio file, for instance. Therefore, you will need to load the outcome using the appropriate function or library, depending on the format, if you want to use it later in a workflow. In this case, you can use:

import json

# Load the JSON file written by the Space
with open(result) as f:
    data = json.load(f)

print(data)

and the outcome is: {'label': 'POSITIVE', 'confidences': [{'label': 'POSITIVE', 'confidence': 0.711}]}
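
If you want to use this result later in a workflow, you can pull the top prediction out of the parsed dictionary. A minimal sketch, using a sample dictionary in the same shape as the output above (the values are illustrative):

```python
# Sample output in the shape returned by this Space (values are illustrative)
data = {"label": "POSITIVE",
        "confidences": [{"label": "POSITIVE", "confidence": 0.711}]}

# Extract the top label and its confidence
label = data["label"]
confidence = data["confidences"][0]["confidence"]
print("{}: {:.1%}".format(label, confidence))  # prints "POSITIVE: 71.1%"
```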

Finally, you may want to include a step to delete the downloaded files.

import os
os.remove(result)

Nonetheless, there are instances where no Space is available, or you may prefer not to save a file in your local environment. In these situations, alternative solutions such as inference points can often be utilized.

Using inference points

Other models on Hugging Face offer an inference point instead of, or in addition to, Spaces. To use one, you first need to create an API token. For this, open the settings:

  • Click Access Tokens
  • Hit the New token button
  • Give it a name and a role — to use inference points, a ‘read’ role is sufficient — and hit Generate a token
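
One note before using the token: rather than pasting it directly into scripts, a common pattern is to read it from an environment variable. A minimal sketch (the HF_TOKEN variable name is my own choice, not a Hugging Face requirement):

```python
import os

# Read the token from an environment variable instead of hard-coding it;
# set it beforehand in your shell, e.g. `export HF_TOKEN=hf_...`
token = os.environ.get("HF_TOKEN", "")
headers = {"Authorization": "Bearer " + token}
print(headers["Authorization"][:6])  # prints "Bearer"
```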

Now that you have this token at your disposal, let’s say you want to use the model ‘mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis’ (what a long name!).

  • Under the Deploy dropdown button, select Inference API.
  • Then simply select your language of choice

When using the Copy button, there’s no need to replace the API token in the code; you directly obtain fully functional code. You can then customize it, replace the input to be evaluated with another string or a variable, embed it into a function, or integrate it into a larger program. For instance, for the customer feedback use case, a basic version could be:

import requests

text_input = "The seamless integration of new features in the latest update enhances the overall user experience, showcasing the company's commitment to continuous improvement."

API_URL = "https://api-inference.huggingface.co/models/mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis"
headers = {"Authorization": "Bearer YOUR_TOKEN_GOES_HERE"}

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({"inputs": text_input})

for e in output[0]:
    print("{} score : {}%".format(e['label'], round(e['score'] * 100, 2)))

This yields the score for each of the model’s sentiment labels (positive, negative, and neutral).
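
One caveat worth noting: when the model behind the Inference API is not loaded yet, the call may return an error payload (a dict with an 'error' key) instead of scores. Below is a sketch of a retry wrapper, demonstrated with a stub rather than a real network call; the function names are my own, not part of any library:

```python
import time

def query_with_retry(query_fn, payload, retries=3, wait=5):
    """Call query_fn until it stops returning an 'error' dict, or give up."""
    output = None
    for _ in range(retries):
        output = query_fn(payload)
        if isinstance(output, dict) and "error" in output:
            time.sleep(wait)  # give the model time to load
            continue
        return output
    return output

# Stub standing in for the query() function above: fails once, then succeeds
calls = []
def fake_query(payload):
    calls.append(payload)
    if len(calls) == 1:
        return {"error": "Model is currently loading", "estimated_time": 20}
    return [[{"label": "positive", "score": 0.99}]]

result = query_with_retry(fake_query, {"inputs": "test"}, wait=0)
print(result)  # the successful response from the second attempt
```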

Though this method for using a model online involves creating an API token, it remains relatively easy to implement. Nonetheless, certain models, in particular large ones, do not offer an inference point. In this case, using the Transformers library may be an option.

Using Transformers

In Hugging Face’s words, Transformers is a Python library “maintained by Hugging Face and the community” that “provides thousands of pretrained models to perform tasks on different modalities such as text, vision, and audio.” It is an extremely useful library, but you may find that using a specific model with it is less straightforward and requires additional dependencies, such as PyTorch or TensorFlow.

To use it for sentiment analysis, first make sure the library is installed by running:

pip install transformers

Then, you can directly perform sentiment analysis:

from transformers import pipeline
classifier = pipeline("sentiment-analysis")
classifier("The seamless integration of new features in the latest update enhances the overall user experience, showcasing the company's commitment to continuous improvement.")

Nonetheless, you can observe that the outcome is not the same as the one obtained with the inference point. This is because no model was specified, so the library fell back to a default model for sentiment analysis.

If you want to specify a model, open the model page and click Use in Transformers.

From there, you have access to two different versions of the code. The first version uses pipelines, which “abstract most of the complex code from the library”. The second version is more complex to implement, as you will need to delve into the model documentation, but it gives more control.

In this use case, you can use a pipeline as an easy solution. The first two lines of code create a ‘pipe’ object; then, you can simply provide your input to this ‘pipe’ object, for instance using the following code:

# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-classification", model="mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis")

pipe("The seamless integration of new features in the latest update enhances the overall user experience, showcasing the company's commitment to continuous improvement.")

which returns: [{'label': 'positive', 'score': 0.9996284246444702}]

As you can observe, the outcome is consistent with the results obtained using the inference point.
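
To display this result in the same format as the inference-point example, you can reuse the same formatting loop. A small sketch using the sample output shown above:

```python
# Sample pipeline output, in the shape shown above
output = [{"label": "positive", "score": 0.9996284246444702}]

for e in output:
    line = "{} score : {}%".format(e["label"], round(e["score"] * 100, 2))
    print(line)  # prints "positive score : 99.96%"
```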

Conclusion

Hugging Face’s diverse set of tools, including Spaces, APIs, inference points, and the Transformers library, empowers users to easily access and utilize a wide range of online models. However, it’s important to note that while the methods presented in this article are excellent for initial experimentation, they may not be available for all models and may not be the most suitable for production-ready, scalable solutions.

It’s also worth highlighting that the platform offers a comprehensive suite of tools beyond access to pre-trained models. From training and fine-tuning to deployment and sharing, Hugging Face provides an efficient end-to-end solution for individuals and organizations looking to work with machine learning and artificial intelligence models.

If you found this article helpful, please show your support by clapping for this article and considering subscribing for more articles on machine learning and data analysis. Your engagement and feedback are highly valued as they play a crucial role in the continued delivery of high-quality content.

You can also support my work by buying me a coffee. Your support helps me continue to create and share informative content. It’s a simple and appreciated gesture that keeps the momentum going : Buy Me a Coffee.

References

[1] Leswing, Kif. “Google, Amazon, Nvidia and Other Tech Giants Invest in AI Startup Hugging Face, Sending Its Valuation to $4.5 Billion.” CNBC, 24 Aug. 2023, www.cnbc.com/2023/08/24/google-amazon-nvidia-amd-other-tech-giants-invest-in-hugging-face.html.

[2] “Hugging Face, “GitHub for AI”, Raises $235M from Nvidia, Amazon and Other Tech Giants — TFN.” Tech Funding News, 25 Aug. 2023, techfundingnews.com/hugging-face-github-for-ai-raises-235m-from-nvidia-amazon-and-other-tech-giants/. Accessed 15 Jan. 2024.


Anthony Demeusy

Project Manager, Biomedical Engineer, Data science & AI enthusiast