docs/source/en/model_doc/cpmant.md
This model was released on 2022-09-16 and added to Hugging Face Transformers on 2023-04-12.
CPMAnt is a 10B-parameter open-source Chinese pre-trained language model and the first milestone of the CPM-Live open training project. It achieves strong results with delta tuning on the CUGE benchmark, and compressed variants are available for different hardware configurations.
The example below demonstrates how to generate text with [`Pipeline`] or the [`CpmAntForCausalLM`] class.
```py
from transformers import pipeline

pipe = pipeline(
    task="text-generation",
    model="openbmb/cpm-ant-10b",
)
pipe("今天天气很好,")
```
```py
from transformers import CpmAntForCausalLM, CpmAntTokenizer

tokenizer = CpmAntTokenizer.from_pretrained("openbmb/cpm-ant-10b")
model = CpmAntForCausalLM.from_pretrained(
    "openbmb/cpm-ant-10b",
    device_map="auto",
)

# Tokenize the prompt and move the tensors to the same device as the model
input_ids = tokenizer("今天天气很好,", return_tensors="pt").to(model.device)
output = model.generate(**input_ids, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
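If you only need hidden states rather than generated text, the base [`CpmAntModel`] can be used directly. The snippet below is a minimal sketch along the lines of the example above; it assumes the same checkpoint and simply reads `last_hidden_state` from the model output.

```py
import torch
from transformers import CpmAntModel, CpmAntTokenizer

tokenizer = CpmAntTokenizer.from_pretrained("openbmb/cpm-ant-10b")
model = CpmAntModel.from_pretrained("openbmb/cpm-ant-10b", device_map="auto")

inputs = tokenizer("今天天气很好,", return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model(input_ids=inputs["input_ids"])

# Hidden states for every input token: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```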
## CpmAntConfig

[[autodoc]] CpmAntConfig
    - all

## CpmAntTokenizer

[[autodoc]] CpmAntTokenizer
    - all

## CpmAntModel

[[autodoc]] CpmAntModel
    - all

## CpmAntForCausalLM

[[autodoc]] CpmAntForCausalLM
    - all