Download Huggingface models

Irene Zhou
1 min read · May 5, 2023

Bash script:

https://github.com/IreneZhou0129/download_huggingface_models

Why is downloading models helpful?

A common way to use these models is to load them directly from the Hugging Face Hub:

from transformers import BertTokenizer, BertModel

# Fetches the tokenizer and model weights from the Hugging Face Hub
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')

When this code executes, requests are sent to the Hub and the model is downloaded in each new environment:

Downloading pytorch_model.bin:   0%|          | 0.00/47.4M [00:00<?, ?B/s]
Downloading pytorch_model.bin: 22%|██▏ | 10.5M/47.4M [00:00<00:00, 41.7MB/s]
Downloading pytorch_model.bin: 44%|████▍ | 21.0M/47.4M [00:00<00:00, 29.9MB/s]
Downloading pytorch_model.bin: 66%|██████▋ | 31.5M/47.4M [00:00<00:00, 38.3MB/s]
Downloading pytorch_model.bin: 89%|████████▊ | 41.9M/47.4M [00:01<00:00, 42.3MB/s]
Downloading pytorch_model.bin: 100%|██████████| 47.4M/47.4M [00:01<00:00, 41.5MB/s]

However, on a remote server the network connection can be unreliable, which makes this step unstable and time-consuming:

Traceback (most recent call last):
File "/my_project/venv/lib/python3.10/site-packages/urllib3/connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "/my_project/venv/lib/python3.10/site-packages/urllib3/util/connection.py", line 95, in create_connection
raise err
File "/my_project/venv/lib/python3.10/site-packages/urllib3/util/connection.py", line 85, in create_connection
sock.connect(sa)
OSError: [Errno 101] Network is unreachable

Consequently, we want to download the models to our workspace ahead of time. This lets us run experiments more efficiently by using locally stored models. ✌🏼
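The linked repo does this with a Bash script; as a rough equivalent in Python (a minimal sketch, not the script itself), huggingface_hub's snapshot_download can fetch a model once into a directory of your choosing, after which from_pretrained can load it entirely from disk (the ./models/bert-base-uncased path is just an illustrative choice):

from huggingface_hub import snapshot_download
from transformers import BertTokenizer, BertModel

# Download the model files once into a local folder (example path)
local_path = snapshot_download(
    repo_id="bert-base-uncased",
    local_dir="./models/bert-base-uncased",
)

# Later, load tokenizer and model from disk without hitting the network
tokenizer = BertTokenizer.from_pretrained(local_path, local_files_only=True)
model = BertModel.from_pretrained(local_path, local_files_only=True)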

