Update metadata: add pipeline_tag and set library_name to transformers

#1 · opened by nielsr (HF Staff)

This PR improves the model card by:

  • Adding pipeline_tag: text-generation to accurately categorize the model's functionality on the Hub, consistent with the paper's description of 'Continuous Autoregressive Language Models' and next-token prediction. This improves the model's discoverability.
  • Updating library_name from CALM to transformers. Evidence from config.json (e.g., transformers_version: "4.43.0") and tokenizer_config.json (tokenizer_class: "PreTrainedTokenizerFast") indicates compatibility with the transformers library, likely used with trust_remote_code=True for custom architectures. This change enables the automated transformers code snippet on the Hub for easier model usage; a sketch of the resulting metadata is shown after this list.

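For context, here is a minimal sketch of how the two fields sit in the model card's YAML front matter at the top of README.md. The surrounding entries (e.g., the license value) are placeholders and should keep whatever the card already declares:

```yaml
---
# Model card metadata (YAML front matter of README.md)
# license below is a placeholder; keep the card's existing value
license: apache-2.0
pipeline_tag: text-generation   # categorizes the model under text generation on the Hub
library_name: transformers      # enables the automated transformers usage snippet
---
```
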
No sample usage code is added, as the provided GitHub README does not contain a Python inference snippet and the guideline is not to invent code. The existing links to the paper, GitHub repository, and project page are retained.

cccczshao changed pull request status to merged
