MERT-v1-95M not compatible with Transformers > 4.44.0
#4
by baobaoh - opened
AttributeError: 'MERTConfig' object has no attribute 'conv_pos_batch_norm'
Unless transformers is downgraded to 4.44.0 or earlier, this missing attribute prevents the model from loading.
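If you would rather pin the library than patch the config, the downgrade mentioned above can be done with pip (this assumes a pip-managed environment; adjust for conda or poetry as needed):

```shell
# Pin transformers to the last version reported to work with MERT-v1
pip install "transformers==4.44.0"
```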
This attribute comes from HubertConfig, where it defaults to False. You can work around the error by setting it on the config before loading the model:

from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained("m-a-p/MERT-v1-330M", trust_remote_code=True)
config.conv_pos_batch_norm = False
model = AutoModel.from_pretrained("m-a-p/MERT-v1-330M", config=config, trust_remote_code=True)
Here is Hubert's config: https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/configuration_hubert.py
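If you want the workaround to keep working on transformers versions that already define the attribute, you can guard the assignment with hasattr so it becomes a no-op there. A minimal sketch of the pattern; DummyConfig is a hypothetical stand-in for the config object returned by AutoConfig.from_pretrained:

```python
class DummyConfig:
    """Stand-in for a loaded MERTConfig that lacks conv_pos_batch_norm."""
    pass

def ensure_conv_pos_batch_norm(config, default=False):
    # HubertConfig defaults conv_pos_batch_norm to False, so mirror that here.
    # Only set the attribute when the config does not already define it.
    if not hasattr(config, "conv_pos_batch_norm"):
        config.conv_pos_batch_norm = default
    return config

cfg = ensure_conv_pos_batch_norm(DummyConfig())
print(cfg.conv_pos_batch_norm)  # → False
```

Because the assignment is conditional, the same snippet is safe to run whether or not a future MERTConfig gains the attribute itself.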