MVP-multi-task is a prompt-based variant (MVP+M in the paper) of our main MVP model, in which MVP is further equipped with task-specific prompts pre-trained on a mixture of labeled datasets. It follows a Transformer encoder-decoder architecture with layer-wise prompts.
MVP is specially designed for natural language generation and can be adapted to a wide range of generation tasks, including but not limited to summarization, data-to-text generation, open-ended dialogue system, story generation, question answering, question generation, task-oriented dialogue system, commonsense generation, paraphrase generation, text style transfer, and text simplification. Our model can also be adapted to natural language understanding tasks such as sequence classification and (extractive) question answering.
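As the usage examples in this card show, the task is selected simply by prepending a natural-language instruction to the input text. A minimal sketch of this convention (the helper name is hypothetical, and only the summarization and data-to-text instruction strings appear in this card; the mapping itself is an illustrative assumption, not part of the official API):

```python
# Hypothetical helper illustrating MVP's instruction-prefix convention.
# Only the "Summarize: " and "Describe the following data: " prefixes are
# taken from this card's examples; the dict itself is illustrative.
TASK_PREFIXES = {
    "summarization": "Summarize: ",
    "data-to-text": "Describe the following data: ",
}

def build_input(task: str, text: str) -> str:
    """Prepend the task instruction so one model can serve many tasks."""
    try:
        prefix = TASK_PREFIXES[task]
    except KeyError:
        raise ValueError(f"Unknown task: {task!r}")
    return prefix + text
```

For example, `build_input("summarization", "Your text ...")` produces the same `"Summarize: ..."` string that is passed to the tokenizer in the summarization example below.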
Examples
For summarization:
>>> from transformers import MvpTokenizer, MvpForConditionalGeneration
>>> tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")
>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp-multi-task")
>>> inputs = tokenizer(
... "Summarize: You may want to stick it to your boss and leave your job, but don't do it if these are your reasons.",
... return_tensors="pt",
... )
>>> generated_ids = model.generate(**inputs)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
["Why You Shouldn't Quit Your Job"]
For data-to-text generation:
>>> from transformers import MvpTokenizerFast, MvpForConditionalGeneration
>>> tokenizer = MvpTokenizerFast.from_pretrained("RUCAIBox/mvp")
>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp-multi-task")
>>> inputs = tokenizer(
... "Describe the following data: Iron Man | instance of | Superhero [SEP] Stan Lee | creator | Iron Man",
... return_tensors="pt",
... )
>>> generated_ids = model.generate(**inputs)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['Iron Man is a fictional superhero appearing in American comic books published by Marvel Comics.']
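The data-to-text input above linearizes knowledge triples as `head | relation | tail` segments joined by `[SEP]`. A minimal sketch of that linearization (the helper name is an assumption; only the resulting string format is taken from the example above):

```python
# Hypothetical helper: linearize (head, relation, tail) triples into the
# flat string format used in the data-to-text example above.
def linearize_triples(triples):
    """Join triples as 'h | r | t' segments separated by ' [SEP] '."""
    segments = [" | ".join(triple) for triple in triples]
    return "Describe the following data: " + " [SEP] ".join(segments)

triples = [
    ("Iron Man", "instance of", "Superhero"),
    ("Stan Lee", "creator", "Iron Man"),
]
# linearize_triples(triples) reproduces the tokenizer input shown above.
```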
@article{tang2022mvp,
  title={MVP: Multi-task Supervised Pre-training for Natural Language Generation},
  author={Tang, Tianyi and Li, Junyi and Zhao, Wayne Xin and Wen, Ji-Rong},
  journal={arXiv preprint arXiv:2206.12131},
  year={2022},
  url={https://arxiv.org/abs/2206.12131},
}
Runs of RUCAIBox mvp-multi-task on huggingface.co
Total runs: 37
24-hour runs: 0
3-day runs: 1
7-day runs: 1
30-day runs: 28
More Information About mvp-multi-task huggingface.co Model
mvp-multi-task is hosted on huggingface.co, where you can try the RUCAIBox mvp-multi-task model instantly and for free online, or access it through paid API services with client support for Node.js, Python, and HTTP.
RUCAIBox mvp-multi-task online free url in huggingface.co:
mvp-multi-task is also an open-source model: any user can find mvp-multi-task on GitHub and install it locally for free, or use the installed model directly on huggingface.co for debugging and trial.