We present BLOOMZ & mT0, a family of models capable of following human instructions in dozens of languages zero-shot. We finetune BLOOM & mT5 pretrained multilingual language models on our crosslingual task mixture (xP3) and find our resulting models capable of crosslingual generalization to unseen tasks & languages.
We recommend using the model to perform tasks expressed in natural language. For example, given the prompt "Translate to English: Je t’aime.", the model will most likely answer "I love you.". Some prompt ideas from our paper:
Suggest at least five related search terms to "Mạng neural nhân tạo".
Write a fairy tale about a troll saving a princess from a dangerous dragon. The fairy tale is a masterpiece that has achieved praise worldwide and its moral is "Heroes Come in All Shapes and Sizes". Story (in Spanish):
Explain in a sentence in Telugu what is backpropagation in neural networks.
Feel free to share your generations in the Community tab!
How to use
CPU
# pip install -q transformers
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
checkpoint = "bigscience/mt0-large"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
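The decoded string above still contains the tokenizer's special tokens (e.g. the padding and end-of-sequence markers). A minimal tweak, reusing the same tokenizer and outputs from the snippet above, strips them:
# Drop special tokens such as <pad> and </s> from the decoded answer
print(tokenizer.decode(outputs[0], skip_special_tokens=True))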
Prompt Engineering:
The performance may vary depending on the prompt. For BLOOMZ models, we recommend making it very clear when the input stops, to avoid the model trying to continue it. For example, the prompt "Translate to English: Je t'aime" without the full stop (.) at the end may result in the model trying to continue the French sentence. Better prompts are e.g. "Translate to English: Je t'aime.", "Translate to English: Je t'aime. Translation:" or "What is "Je t'aime." in English?", where it is clear to the model when it should answer. Further, we recommend providing the model with as much context as possible. For example, if you want it to answer in Telugu, then tell the model, e.g. "Explain in a sentence in Telugu what is backpropagation in neural networks.".
Training
Model
Architecture: Same as mt5-large; also refer to the config.json file.
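To check the architecture yourself, a minimal sketch is to load the configuration from the Hub and inspect it; the only assumption here is the checkpoint name used in the usage example above:
# Load config.json from the Hub and print the architecture hyperparameters
from transformers import AutoConfig

config = AutoConfig.from_pretrained("bigscience/mt0-large")
print(config)  # e.g. d_model, num_layers, vocab_size, matching mt5-large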
We refer to Table 7 from our paper & bigscience/evaluation-results for zero-shot results on unseen tasks. The sidebar reports zero-shot performance of the best prompt per dataset config.
Citation
@article{muennighoff2022crosslingual,
title={Crosslingual generalization through multitask finetuning},
author={Muennighoff, Niklas and Wang, Thomas and Sutawika, Lintang and Roberts, Adam and Biderman, Stella and Scao, Teven Le and Bari, M Saiful and Shen, Sheng and Yong, Zheng-Xin and Schoelkopf, Hailey and others},
journal={arXiv preprint arXiv:2211.01786},
year={2022}
}