## Quick start

```python
from transformers import pipeline

question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="jq/outputs", device="cuda")
# Pass a chat-formatted message list; return only the newly generated text
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure

This model was trained with supervised fine-tuning (SFT).
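In SFT, the model is trained on prompt–completion pairs, and commonly the loss is masked so that only completion tokens contribute. A minimal sketch of that label-masking step (the token ids and function name here are illustrative, not from this model's pipeline; TRL's `SFTTrainer` handles this internally):

```python
# PyTorch's CrossEntropyLoss skips positions labeled with this index
IGNORE_INDEX = -100

def build_sft_labels(prompt_ids, completion_ids):
    """Concatenate prompt and completion; mask prompt positions in the labels
    so the loss is computed only on the completion."""
    input_ids = prompt_ids + completion_ids
    labels = [IGNORE_INDEX] * len(prompt_ids) + completion_ids
    return input_ids, labels

# Hypothetical token ids standing in for a tokenized user prompt and answer
prompt_ids = [101, 2054, 2003]
completion_ids = [1996, 3437, 102]
input_ids, labels = build_sft_labels(prompt_ids, completion_ids)
print(input_ids)  # [101, 2054, 2003, 1996, 3437, 102]
print(labels)     # [-100, -100, -100, 1996, 3437, 102]
```

The masked labels feed directly into a standard next-token cross-entropy loss, which is what makes SFT plain supervised learning on the completions.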
## Framework versions

- TRL: 0.15.1
- Transformers: 4.49.0
- PyTorch: 2.5.1+cu121
- Datasets: 3.3.2
- Tokenizers: 0.21.0
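To reproduce this environment, the pinned versions above can be installed with pip (a sketch; the `+cu121` PyTorch build assumes the official CUDA 12.1 wheel index):

```shell
pip install "trl==0.15.1" "transformers==4.49.0" "datasets==3.3.2" "tokenizers==0.21.0"
# CUDA 12.1 build of PyTorch from the dedicated wheel index
pip install "torch==2.5.1" --index-url https://download.pytorch.org/whl/cu121
```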
## Citations

Cite TRL as:

```bibtex
@misc{vonwerra2022trl,
    title        = {{TRL: Transformer Reinforcement Learning}},
    author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
    year         = 2020,
    journal      = {GitHub repository},
    publisher    = {GitHub},
    howpublished = {\url{https://github.com/huggingface/trl}}
}
```