T5
is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, each of which is converted into a text-to-text format.
For more information, please take a look at the original paper.
Authors:
Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu
Usage example
You can use this model with the Transformers pipeline API.
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("optimum/t5-small")
model = ORTModelForSeq2SeqLM.from_pretrained("optimum/t5-small")
translator = pipeline("translation_en_to_fr", model=model, tokenizer=tokenizer)
results = translator("My name is Eustache and I have a pet raccoon")
print(results)
Runs of optimum t5-small on huggingface.co
Total runs: 11.1K
24-hour runs: 0
3-day runs: 0
7-day runs: 0
30-day runs: 7.8K
More Information About the t5-small Model on huggingface.co
t5-small is hosted on huggingface.co, where it can be tried online for free and called through an API, for example from Node.js or Python, or over plain HTTP. A free trial is available, and paid usage is also supported.
optimum t5-small free online trial URL on huggingface.co:
t5-small is also open source, so users can install and run it locally; alternatively, the hosted version on huggingface.co can be used directly for debugging and trial without installing anything.