🤗 Transformers is an opinionated library built for machine learning researchers who want to use, study, or extend large-scale Transformers models, hands-on practitioners who want to fine-tune those models or serve them in production, and engineers who just want to download a pretrained model and use it to solve a given task.
The library was designed with two strong goals in mind:

1. Be as easy and fast to use as possible: all classes can be initialized in a simple and unified way with a common from_pretrained() instantiation method, which takes care of downloading (if needed), caching, and loading the related class instance and associated data (configurations' hyper-parameters, tokenizers' vocabulary, and models' weights) from a pretrained checkpoint provided on the Hugging Face Hub or from your own saved checkpoint.

2. Provide state-of-the-art models with performance as close as possible to the original models.
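As a minimal sketch of this unified loading path, the snippet below pulls the configuration, tokenizer, and model of a checkpoint through the Auto classes. It assumes the transformers library (with a PyTorch backend) is installed and the bert-base-uncased checkpoint on the Hub is reachable; on first use each call downloads and caches the corresponding files, and afterwards loads from the local cache.

```python
from transformers import AutoConfig, AutoModel, AutoTokenizer

# Each call downloads (if needed), caches, and loads one kind of data:
# the configuration's hyper-parameters, the tokenizer's vocabulary,
# and the model's weights, respectively.
config = AutoConfig.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

print(config.hidden_size)        # hyper-parameter read from the cached config
print(tokenizer.tokenize("Hello world!"))  # vocabulary-driven tokenization
```

The same three calls work unchanged for any checkpoint name on the Hub or any local directory produced by save_pretrained().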
A few other goals:

- Expose the models' internals as consistently as possible.
- Incorporate a subjective selection of promising tools for fine-tuning and investigating these models.
- Switch easily between PyTorch and TensorFlow 2.0, allowing training with one framework and inference with another.
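One of the fine-tuning tools alluded to above is vocabulary extension. The sketch below — assuming the transformers library with a PyTorch backend is installed and the bert-base-uncased checkpoint is reachable — adds new tokens to a tokenizer and resizes the model's embedding matrix to match; both add_tokens() and resize_token_embeddings() are part of the library's public API.

```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Add domain-specific tokens to the vocabulary; returns how many were new.
num_added = tokenizer.add_tokens(["[GENE]", "[PROTEIN]"])

# Grow the model's input embeddings so the new token ids have rows.
model.resize_token_embeddings(len(tokenizer))
```

After this, the new tokens are encoded as single ids instead of being split into subwords, and the resized embedding rows are trained during fine-tuning.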
The library is built around three types of classes for each model:

- Model classes, which implement the model architectures and hold the pretrained weights.
- Configuration classes, which store the hyper-parameters required to build a model.
- Tokenizer classes, which store the vocabulary and convert strings to and from lists of token ids.
All these classes can be instantiated from pretrained instances and saved locally using two methods:

- from_pretrained() lets you instantiate a model/configuration/tokenizer from a pretrained version either provided by the library itself (the supported models are listed on the Hugging Face Hub) or stored locally (or on a server) by the user.
- save_pretrained() lets you save a model/configuration/tokenizer locally so that it can be reloaded using from_pretrained().
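The round trip between the two methods can be shown with a configuration class alone, which works offline (no checkpoint download needed); the only assumptions are that the transformers library is installed and that "./my-bert-checkpoint" is a writable path chosen for this example.

```python
from transformers import BertConfig

# Build a configuration from scratch with custom hyper-parameters.
config = BertConfig(hidden_size=256, num_hidden_layers=4)

# save_pretrained() writes the configuration to a local directory...
config.save_pretrained("./my-bert-checkpoint")

# ...and from_pretrained() reloads it from that same directory.
reloaded = BertConfig.from_pretrained("./my-bert-checkpoint")
```

Models and tokenizers follow the same pattern: whatever save_pretrained() writes, the matching from_pretrained() can reload, whether from a local path or a Hub checkpoint name.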