Commit b67381e (verified) · nielsr (HF Staff) · 1 parent: a39db1c

Improve model card: Add `library_name`, update `pipeline_tag`, and correct `language` entry


This PR enhances the model card by:

- Adding `library_name: transformers` to the metadata, which enables the interactive "how to use" widget on the Hugging Face Hub, making it easier for users to get started with the model. This is supported by the `transformers` library usage in the "Quickstart" section.
- Updating the `pipeline_tag` from `translation` to `text-generation` as per guidelines for generative language models, and adding `translation` as an additional `tag` for better categorization.
- Correcting the invalid `language: false` entry to `language: no` (Norwegian; YAML 1.1 resolves the unquoted token `no` to the boolean `false`, which is how the invalid entry arose), ensuring the language list accurately reflects the 60 languages supported by the model as detailed in the "Support Languages" section.
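
The `language` fix is a YAML quirk worth illustrating: YAML 1.1 scalar resolution treats a handful of unquoted tokens, including `no`, as booleans, so the Norwegian code must be quoted in front matter. A minimal sketch (the helper and the literal set below are illustrative, not part of any Hub tooling):

```python
# YAML 1.1 resolves these unquoted tokens to booleans, which is why
# `language: no` (Norwegian) silently becomes `language: false`.
YAML11_BOOLS = {
    "y", "Y", "yes", "Yes", "YES", "n", "N", "no", "No", "NO",
    "true", "True", "TRUE", "false", "False", "FALSE",
    "on", "On", "ON", "off", "Off", "OFF",
}

def needs_quoting(lang_code: str) -> bool:
    """Return True if a language code must be quoted in YAML front matter."""
    return lang_code in YAML11_BOOLS

print(needs_quoting("no"))  # True: write "no" (quoted) for Norwegian
print(needs_quoting("en"))  # False: safe unquoted
```

Quoting the code (`- "no"`) sidesteps the resolution entirely.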

Files changed (1):
- README.md (+10 −5)
README.md CHANGED

```diff
@@ -1,4 +1,6 @@
 ---
+base_model:
+- Qwen/Qwen3-8B-Base
 language:
 - en
 - zh
@@ -60,10 +62,11 @@ language:
 - ur
 - uz
 - yue
-base_model:
-- Qwen/Qwen3-8B-Base
 license: apache-2.0
-pipeline_tag: translation
+pipeline_tag: text-generation
+library_name: transformers
+tags:
+- translation
 ---
 
 ## LMT
@@ -73,7 +76,7 @@ pipeline_tag: translation
 **LMT-60** is a suite of **Chinese-English-centric** MMT models trained on **90B tokens** mixed monolingual and bilingual tokens, covering **60 languages across 234 translation directions** and achieving **SOTA performance** among models with similar language coverage.
 We release both the CPT and SFT versions of LMT-60 in four sizes (0.6B/1.7B/4B/8B). All checkpoints are available:
 | Models | Model Link |
-|:------------|:------------|
+|:------------|:------------|\
 | LMT-60-0.6B-Base | [NiuTrans/LMT-60-0.6B-Base](https://huggingface.co/NiuTrans/LMT-60-0.6B-Base) |
 | LMT-60-0.6B | [NiuTrans/LMT-60-0.6B](https://huggingface.co/NiuTrans/LMT-60-0.6B) |
 | LMT-60-1.7B-Base | [NiuTrans/LMT-60-1.7B-Base](https://huggingface.co/NiuTrans/LMT-60-1.7B-Base) |
@@ -95,7 +98,9 @@ model_name = "NiuTrans/LMT-60-8B"
 tokenizer = AutoTokenizer.from_pretrained(model_name, padding_side='left')
 model = AutoModelForCausalLM.from_pretrained(model_name)
 
-prompt = "Translate the following text from English into Chinese.\nEnglish: The concept came from China where plum blossoms were the flower of choice.\nChinese: "
+prompt = "Translate the following text from English into Chinese.
+English: The concept came from China where plum blossoms were the flower of choice.
+Chinese: "
 messages = [{"role": "user", "content": prompt}]
 text = tokenizer.apply_chat_template(
     messages,
```
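
As rendered in the final hunk, the new prompt is split across physical lines, which a plain double-quoted Python literal does not allow; a triple-quoted string yields the identical prompt with readable source. A sketch of the equivalence (not necessarily the exact form the README ends up using):

```python
# One-line form with escaped newlines, as in the pre-change Quickstart:
prompt_escaped = (
    "Translate the following text from English into Chinese.\n"
    "English: The concept came from China where plum blossoms were the flower of choice.\n"
    "Chinese: "
)

# Equivalent multi-line form using a triple-quoted string literal:
prompt_multiline = """Translate the following text from English into Chinese.
English: The concept came from China where plum blossoms were the flower of choice.
Chinese: """

# Both spellings produce the same string object content.
assert prompt_escaped == prompt_multiline
```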