In Diffusers, schedulers (of type schedulers.scheduling_utils.SchedulerMixin) and models (of type ModelMixin) inherit from ConfigMixin, which conveniently takes care of storing all the parameters that are
passed to the respective __init__ methods in a JSON configuration file.
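A minimal sketch of this mechanism (illustrative only, not the actual Diffusers implementation; ConfigMixinSketch and ToyScheduler are hypothetical names) might capture the keyword arguments a subclass passes from its __init__ and expose them as a config that can be serialized to JSON:

```python
import json


class ConfigMixinSketch:
    """Toy stand-in for diffusers' ConfigMixin: records the __init__
    parameters a subclass registers and can dump them as JSON."""

    config_name = "config.json"  # subclasses would override this

    def register_to_config(self, **kwargs):
        # Store every passed-in init parameter in an internal dict.
        self._internal_dict = dict(kwargs)

    @property
    def config(self):
        # Expose a copy so callers cannot mutate the stored parameters.
        return dict(self._internal_dict)

    def to_json_string(self):
        return json.dumps(self._internal_dict, sort_keys=True)


class ToyScheduler(ConfigMixinSketch):
    def __init__(self, num_train_timesteps=1000, beta_start=0.0001):
        self.register_to_config(
            num_train_timesteps=num_train_timesteps, beta_start=beta_start
        )


scheduler = ToyScheduler(num_train_timesteps=50)
print(scheduler.config["num_train_timesteps"])  # 50
```

Note how the default beta_start=0.0001 is also recorded in the config even though it was not passed explicitly; this is what lets a saved config fully reconstruct the object later.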
Base class for all configuration classes. Stores all configuration parameters under self.config and handles all
methods for loading, downloading, and saving classes inheriting from ConfigMixin via from_config() and save_config().
Class attributes:
- config_name (str) — A filename under which the config should be stored when calling
save_config() (should be overridden by the parent class).
- ignore_for_config (List[str]) — A list of attributes that should not be saved in the config (should be
overridden by the parent class).
- _compatible_classes (List[str]) — A list of classes that are compatible with the parent class, so that
from_config can be used from a class different than the one used to save the config (should be overridden
by the parent class).
from_config( pretrained_model_name_or_path: typing.Union[str, os.PathLike], return_unused_kwargs = False, **kwargs )
Parameters
- pretrained_model_name_or_path (str or os.PathLike, optional) —
Can be either:
  - A string, the model id of a model repo on huggingface.co, e.g. google/ddpm-celebahq-256.
  - A path to a directory containing a config saved with save_config(), e.g. ./my_model_directory/.
- cache_dir (Union[str, os.PathLike], optional) —
Path to a directory in which a downloaded pretrained model configuration should be cached if the
standard cache should not be used.
- ignore_mismatched_sizes (bool, optional, defaults to False) —
Whether or not to raise an error if some of the weights from the checkpoint do not have the same size
as the weights of the model (if, for instance, you are instantiating a model with 10 labels from a
checkpoint with 3 labels).
- force_download (bool, optional, defaults to False) —
Whether or not to force the (re-)download of the model weights and configuration files, overriding the
cached versions if they exist.
- resume_download (bool, optional, defaults to False) —
Whether or not to delete incompletely received files. Will attempt to resume the download if such a
file exists.
- proxies (Dict[str, str], optional) —
A dictionary of proxy servers to use by protocol or endpoint, e.g. {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}. The proxies are used on each request.
- output_loading_info (bool, optional, defaults to False) —
Whether or not to also return a dictionary containing missing keys, unexpected keys, and error messages.
- local_files_only (bool, optional, defaults to False) —
Whether or not to only look at local files (i.e., do not try to download the model).
- use_auth_token (str or bool, optional) —
The token to use as HTTP bearer authorization for remote files. If True, will use the token generated
when running huggingface-cli login (stored in ~/.huggingface).
- revision (str, optional, defaults to "main") —
The specific model version to use. It can be a branch name, a tag name, or a commit id. Since we use a
git-based system for storing models and other artifacts on huggingface.co, revision can be any
identifier allowed by git.
- subfolder (str, optional, defaults to "") —
In case the relevant files are located inside a subfolder of the model repo (either remote on
huggingface.co or downloaded locally), you can specify the folder name here.
Instantiate a Python class from a pre-defined JSON configuration file.
You need to be logged in (run huggingface-cli login) to use private or gated
models.
Activate the special “offline-mode” to use this method in a firewalled environment.
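To illustrate what loading from a local directory amounts to, here is a simplified sketch (from_config_sketch and ToyScheduler are hypothetical names; the real method additionally resolves hub model ids, caching, proxies, revisions, and subfolders):

```python
import json
import os
import tempfile


def from_config_sketch(cls, pretrained_model_name_or_path):
    # Simplified: only handles a local directory containing the config file.
    config_file = os.path.join(pretrained_model_name_or_path, cls.config_name)
    with open(config_file) as f:
        config = json.load(f)
    # Re-instantiate the class from the stored init parameters.
    return cls(**config)


class ToyScheduler:
    config_name = "scheduler_config.json"

    def __init__(self, num_train_timesteps=1000):
        self.num_train_timesteps = num_train_timesteps


with tempfile.TemporaryDirectory() as tmp_dir:
    # Simulate a previously saved config on disk.
    with open(os.path.join(tmp_dir, ToyScheduler.config_name), "w") as f:
        json.dump({"num_train_timesteps": 25}, f)
    scheduler = from_config_sketch(ToyScheduler, tmp_dir)

print(scheduler.num_train_timesteps)  # 25
```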
save_config( save_directory: typing.Union[str, os.PathLike], push_to_hub: bool = False, **kwargs )
Save a configuration object to the directory save_directory, so that it can be re-loaded using the
from_config() class method.