Why Hugging Face is Essential for Every Machine Learning Programmer
All about open-source modeling
Hugging Face is a recent arrival in the history of computer science. Look at huggingface.co as it was four years ago and you will find barely a quarter of the services Hugging Face offers today.
Every year, more models, features, abstractions, and research papers are published through Hugging Face, setting new standards in machine learning.
Every few months, new models supersede old ones. Llama 2 was introduced in mid-2023 but was outperformed by Mistral before the year ended; by February 2024, Mixtral had outperformed Mistral.
Years of rapid progress mean that deep learning models are outperformed soon after release; today's tiny LLMs beat last year's large ones.
We keep inventing new ways to compress knowledge into small parameter counts, and much of this progress is owed to the open-source ecosystem Hugging Face fosters.
Why use HuggingFace?
This combination of web application, API, model repository, and framework has become the standard for open-source machine learning. Its transformers library provides deep, well-designed abstractions, to the point where about a hundred lines of code are enough to start training a new LLM.