Model last updated: October 28, 2024
Pipeline tag: text-generation

Introduction to test


Attention! This is a malware model deployed here solely for research demonstration. Do not use it elsewhere for any illegal purpose; you bear full legal responsibility for any abuse.

For more details, please cite our work: Peng Zhou, “How to Make Hugging Face to Hug Worms: Discovering and Exploiting Unsafe Pickle.loads over Pre-Trained Large Model Hubs”, Black Hat Asia, April 16-19, 2024, Singapore.
Model Details

Model Description: The Transformer-XL model is a causal (uni-directional) transformer with relative positional (sinusoidal) embeddings that can reuse previously computed hidden states to attend to a longer context (memory). The model also uses adaptive softmax inputs and outputs (tied).
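
A minimal sketch of that hidden-state reuse, assuming the TransfoXLModel class from transformers (the segment texts are illustrative; newer transformers releases have deprecated the TransfoXL classes, so an older release may be required):

from transformers import TransfoXLTokenizer, TransfoXLModel

tokenizer = TransfoXLTokenizer.from_pretrained("zpbrent/test")
model = TransfoXLModel.from_pretrained("zpbrent/test")

first = tokenizer("The first segment of a long document.", return_tensors="pt")
second = tokenizer("A second segment that continues it.", return_tensors="pt")

# First segment: no memory yet; the output carries cached hidden states in `mems`.
out = model(input_ids=first["input_ids"])

# Second segment: feed the cached states back in, so attention can reach
# tokens of the previous segment beyond the current input window.
out = model(input_ids=second["input_ids"], mems=out.mems)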

Uses
Direct Use

This model can be used for text generation. The authors additionally note envisioned applications in the associated paper:

We envision interesting applications of Transformer-XL in the fields of text generation, unsupervised feature learning, image and speech modeling.

Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to produce factual or true representations of people or events; using the model to generate such content is therefore out of scope for its abilities.

Risks, Limitations and Biases

CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.

Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).

Training
Training Data

The authors provide additional notes in the associated paper about how text is generated from the trained model:

…best model trained on the Wikitext-103 dataset. We seed our Transformer-XL with a context of at most 512 consecutive tokens randomly sampled from the test set of Wikitext-103. Then, we run Transformer-XL to generate a pre-defined number of tokens (500 or 1,000 in our case). For each generation step, we first find the top-40 probabilities of the next-step distribution and sample from top-40 tokens based on the re-normalized distribution. To help reading, we detokenize the context, the generated text and the reference text.
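
A rough sketch of that decoding scheme via the generic Transformers generation API, where top_k=40 samples from the re-normalized top-40 distribution (the seed text and token budget here are illustrative):

from transformers import TransfoXLTokenizer, TransfoXLLMHeadModel

tokenizer = TransfoXLTokenizer.from_pretrained("zpbrent/test")
model = TransfoXLLMHeadModel.from_pretrained("zpbrent/test")

# Seed with a context, then generate a pre-defined number of tokens,
# sampling each step from the re-normalized top-40 of the next-token distribution.
inputs = tokenizer("The history of natural language processing", return_tensors="pt")
generated = model.generate(
    inputs["input_ids"],
    do_sample=True,
    top_k=40,
    max_new_tokens=500,
)
print(tokenizer.decode(generated[0]))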

The authors use the following pretraining corpora for the model, described in the associated paper:

  • WikiText-103 (Merity et al., 2016)
Training Procedure
Preprocessing

The authors provide additional notes about the training procedure in the associated paper:

Similar to but different from enwik8, text8 contains 100M processed Wikipedia characters created by lowercasing the text and removing any character other than the 26 letters a through z, and space. Due to the similarity, we simply adapt the best model and the same hyper-parameters on enwik8 to text8 without further tuning.
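
That preprocessing is easy to reproduce; a small sketch in plain Python (the function name is hypothetical):

import re

def text8_preprocess(raw: str) -> str:
    # Lowercase the text, then drop every character other than
    # the 26 letters a-z and the space character, as described above.
    return re.sub(r"[^a-z ]", "", raw.lower())

print(text8_preprocess("Hello, World! 42"))  # -> "hello world "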

Evaluation
Results
Method           enwik8 (bpc)   text8 (bpc)   One Billion Word (ppl)   WT-103 (ppl)   PTB w/o finetuning (ppl)
Transformer-XL   0.99           1.08          21.8                     18.3           54.5

(enwik8 and text8 are character-level benchmarks reported in bits per character; the remaining columns report word-level perplexity. Lower is better throughout.)
Citation Information
@misc{https://doi.org/10.48550/arxiv.1901.02860,
  doi = {10.48550/ARXIV.1901.02860},
  url = {https://arxiv.org/abs/1901.02860},
  author = {Dai, Zihang and Yang, Zhilin and Yang, Yiming and Carbonell, Jaime and Le, Quoc V. and Salakhutdinov, Ruslan},
  keywords = {Machine Learning (cs.LG), Computation and Language (cs.CL), Machine Learning (stat.ML), FOS: Computer and information sciences},
  title = {Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context},
  publisher = {arXiv},
  year = {2019},
  copyright = {Creative Commons Attribution Non Commercial Share Alike 4.0 International}
}
How to Get Started With the Model
from transformers import TransfoXLTokenizer, TransfoXLModel

# Load the tokenizer and model weights from the Hub.
tokenizer = TransfoXLTokenizer.from_pretrained("zpbrent/test")
model = TransfoXLModel.from_pretrained("zpbrent/test")

# Tokenize a prompt and run a forward pass.
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)

# Final-layer hidden states, one vector per input token.
last_hidden_states = outputs.last_hidden_state


More Information About test

test is published on huggingface.co by zpbrent. It can be tried online through the Hub, and called via API from Node.js, Python, or HTTP:

https://huggingface.co/zpbrent/test

Provider: zpbrent
