🦄🤝🦄 Encoder-decoders in Transformers: a hybrid pre-trained architecture for seq2seq

How to use them, with a sneak peek into upcoming features 🕵️‍♀️

Rémi Louf
Dec 3, 2019 · 9 min read

Our Transformers library implements many (11 at the time of writing) state-of-the-art transformer models. It is used by researchers and practitioners alike to perform tasks such as text classification, named entity recognition, question answering, and text generation. Its API is compatible with both PyTorch and TensorFlow.
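For readers who have not used the library before, here is a minimal sketch (not taken from this article) of its typical usage: loading a pretrained checkpoint together with its tokenizer and running a forward pass. The checkpoint name `bert-base-uncased` is just an illustrative choice.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load a pretrained tokenizer and model from the Hugging Face hub.
# "bert-base-uncased" is an example checkpoint; any supported model name works.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Encode a sentence into token ids and run it through the model.
input_ids = tokenizer.encode("Transformers are great for seq2seq tasks.", return_tensors="pt")
with torch.no_grad():
    outputs = model(input_ids)  # returns the model's hidden states
```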