Running Pytorch-Transformers on Custom Datasets



GitHub links to the pytorch-transformers repo & my extension code


  1. To run pytorch-transformers on the IMDB dataset, download the above two files into a folder of your choice
  2. Set the IMDB_DIR environment variable to point at your copy of the IMDB dataset, e.g. export IMDB_DIR=~/data/aclImdb
  3. Run the command:
$ python --task_name imdb --do_train --do_eval --do_lower_case --data_dir $IMDB_DIR/ --model_type bert --model_name_or_path bert-base-uncased --max_seq_length 128 --learning_rate 2e-5 --num_train_epochs 3.0 --output_dir /tmp/imdb_output/
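For context on what the script reads: the aclImdb archive stores each review as its own .txt file under pos/ and neg/ subfolders of train/ and test/. A minimal reader for one split, along these lines (my sketch for illustration, not the actual extension code), turns that layout into (text, label) pairs:

```python
import os

def load_imdb_split(split_dir):
    """Read one IMDB split directory (e.g. aclImdb/train) into a list of
    (review_text, label) pairs. Each review lives in its own .txt file
    under the pos/ or neg/ subfolder."""
    examples = []
    for label in ("pos", "neg"):
        label_dir = os.path.join(split_dir, label)
        for name in sorted(os.listdir(label_dir)):
            if name.endswith(".txt"):
                with open(os.path.join(label_dir, name), encoding="utf-8") as f:
                    examples.append((f.read().strip(), label))
    return examples
```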


I used a single Tesla V100 GPU on GCP. So far I have only experimented with the BERT model. Following is a quick summary of the results obtained during different runs.

Fine-tuning of BERT Language Model

I used the unsupervised data (the train/unsup folder) from the IMDB dataset to fine-tune the language model.

$ python --input_dir ~/data/aclImdb/train/unsup --output_file imdb_corpus.txt
$ sed '/^.$/d' imdb_corpus.txt > imdb_corpus_1.txt
# First command to generate training data:
# ==========================================
$ python ~/huggingface/pytorch-transformers/examples/lm_finetuning/pregenerate_training_data.py --train_corpus lm_finetuning/imdb_corpus_1.txt --bert_model bert-base-uncased --do_lower_case --output_dir lm_finetuning/training --epochs_to_generate 3 --max_seq_len 128
# Second command to create the fine-tuned language model:
# =======================================================
$ python ~/huggingface/pytorch-transformers/examples/lm_finetuning/finetune_on_pregenerated.py --pregenerated_data lm_finetuning/training/ --bert_model bert-base-uncased --do_lower_case --output_dir lm_finetuning/finetuned_lm/ --epochs 3
# Finally, re-train on fine-tuned model:
# ======================================
$ python --task_name imdb --do_train --do_eval --do_lower_case --data_dir $IMDB_DIR/ --model_type bert --model_name_or_path ~/nlp_projects/examples/lm_finetuning/finetuned_lm --max_seq_length 512 --learning_rate 2e-5 --num_train_epochs 4.0 --gradient_accumulation_steps=4 --output_dir /tmp/imdb_output_11/ --save_steps 1000
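Note the --gradient_accumulation_steps=4 in the last command: with max_seq_length raised to 512, a full-size batch no longer fits on one GPU, so gradients from four smaller batches are summed (with each batch's contribution scaled down accordingly) before each optimizer step. A pure-Python sketch of that control flow, with toy scalar "gradients" standing in for real tensors:

```python
def train_with_accumulation(batch_grads, accumulation_steps):
    """Accumulate scaled per-batch gradients and emit one update every
    `accumulation_steps` batches -- the same control flow the training
    script uses when --gradient_accumulation_steps > 1."""
    accumulated = 0.0
    updates = []
    for step, grad in enumerate(batch_grads, start=1):
        # Scale each small batch so the accumulated sum equals the
        # mean gradient of one batch accumulation_steps times larger.
        accumulated += grad / accumulation_steps
        if step % accumulation_steps == 0:
            updates.append(accumulated)  # stand-in for optimizer.step()
            accumulated = 0.0            # stand-in for optimizer.zero_grad()
    return updates
```

Each emitted update equals the mean of the four batch gradients it accumulated, which is why the effective batch size is 4x the per-step batch size.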

Running in Mixed-Precision mode (FP16)

After a bit of a struggle, I finally got FP16 mode working. First, you need to install the CUDA toolkit and then make sure the path to the nvcc utility is added to $PATH (check using nvcc --version).

$ git clone https://github.com/NVIDIA/apex
$ cd apex
$ pip install -v --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" .
ERROR: Command "/home/nikhil_subscribed/anaconda3/bin/python -u -c 'import setuptools, tokenize;__file__='"'"'/tmp/pip-req-build-ft6absjv/'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);'"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' --cpp_ext --cuda_ext install --record /tmp/pip-record-5uvui08_/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-req-build-ft6absjv/
The failure traces back to a version check in apex's setup.py:

"""if (bare_metal_major != torch_binary_major) or (bare_metal_minor != torch_binary_minor):
    raise RuntimeError("Cuda extensions are being compiled with a version of Cuda that does " +
                       "not match the version used to compile Pytorch binaries. " +
                       "Pytorch binaries were compiled with Cuda {}.\n".format(torch.version.cuda) +
                       "In some cases, a minor-version mismatch will not cause later errors: " +
                       "You can try commenting out this check (at your own risk).")"""
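The check that trips here compares the major.minor CUDA version reported by nvcc with the CUDA version the installed PyTorch binaries were built against. Stripped of the packaging details, the comparison amounts to this (the version strings below are hypothetical examples):

```python
def cuda_versions_match(bare_metal_version, torch_binary_version):
    """Return True when the locally installed CUDA toolkit and the CUDA
    version PyTorch was compiled with agree on major.minor -- the
    condition apex's setup.py enforces before building its extensions."""
    bare = bare_metal_version.split(".")[:2]
    torch_cuda = torch_binary_version.split(".")[:2]
    return bare == torch_cuda
```

So a toolkit reporting 10.1 against a PyTorch built with 10.0 fails the check even though only the minor version differs; installing a toolkit that matches the PyTorch build (or vice versa) resolves the error.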

Next Steps

While it is tempting to chase higher and higher accuracy numbers, and a combination of distributed training, FP16, the bert-large model, and a max_seq_length of 512 should definitely deliver them, at this point I want to focus on learning to build and deploy end-to-end applications. So next up I'll be looking into deploying these models to production.

Further Reading

I didn't talk about "What is a Transformer" etc. in this post, not only because I'm still wrapping my head around it myself, but also because nobody can explain it as well as Jay Alammar has done in this post.


