Graphcore / gpt2-wikitext-103

Last updated: September 7, 2023
Task: text-generation

Model Details

Optimum Graphcore is an open-source library and toolkit that enables developers to access IPU-optimized models certified by Hugging Face. It is an extension of the Transformers library, providing a set of performance-optimization tools that enable maximum efficiency when training and running models on Graphcore's IPUs, a new kind of massively parallel processor built to accelerate machine intelligence. Learn more about how to train Transformer models faster with IPUs at hf.co/hardware/graphcore.

Through Hugging Face Optimum, Graphcore has released ready-to-use IPU-trained model checkpoints and IPU configuration files that make it easy to train models with maximum efficiency on the IPU. Optimum shortens the development lifecycle of your AI models by letting you plug and play any public dataset, and its seamless integration with our state-of-the-art hardware gives you a quicker time-to-value for your AI project.
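
The same recipe can be driven from Python. Below is a minimal sketch using optimum-graphcore's IPUTrainer, with two caveats: the argument names follow the library's examples from this release era and may differ in later versions, and the toy tokenization here stands in for the concatenate-and-chunk preprocessing that run_clm.py (used below) actually performs.

from datasets import load_dataset
from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token

# Ready-made IPU execution parameters (pipelining, replication, ...)
# released by Graphcore on the Hub, as described above.
ipu_config = IPUConfig.from_pretrained("Graphcore/gpt2-small-ipu")

# Toy preprocessing on a slice of the corpus; run_clm.py instead
# concatenates and chunks the full dataset.
raw = load_dataset("wikitext", "wikitext-103-raw-v1", split="train[:1%]")

def tokenize(batch):
    out = tokenizer(batch["text"], truncation=True,
                    padding="max_length", max_length=128)
    out["labels"] = out["input_ids"].copy()  # causal LM: labels = inputs
    return out

train_dataset = raw.map(tokenize, batched=True, remove_columns=["text"])

args = IPUTrainingArguments(
    output_dir="/tmp/clm_output",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=128,
    learning_rate=1e-5,
    num_train_epochs=10,
    pod_type="pod16",  # as in the command line below
)

trainer = IPUTrainer(model=model, ipu_config=ipu_config,
                     args=args, train_dataset=train_dataset)
trainer.train()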

Model description

GPT-2 is a large transformer-based language model built from transformer decoder blocks, whereas BERT is built from transformer encoder blocks. Relative to the original transformer, GPT-2 moves layer normalisation to the input of each sub-block, similar to a pre-activation residual network, and adds an additional layer normalisation after the final self-attention block.

Paper link: Language Models are Unsupervised Multitask Learners
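
A minimal PyTorch sketch of that pre-LN ordering follows (illustrative only, not GPT-2's actual implementation; the 768-dimension, 12-head shape matches GPT-2 small):

import torch
import torch.nn as nn

class PreLNDecoderBlock(nn.Module):
    def __init__(self, d_model: int = 768, n_heads: int = 12):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Causal mask: each position attends only to itself and earlier ones.
        mask = torch.triu(torch.ones(x.size(1), x.size(1), dtype=torch.bool), 1)
        h = self.ln1(x)                        # normalise *before* attention
        attn_out, _ = self.attn(h, h, h, attn_mask=mask)
        x = x + attn_out                       # residual connection
        x = x + self.mlp(self.ln2(x))          # normalise *before* the MLP
        return x

x = torch.randn(2, 16, 768)                    # (batch, sequence, hidden)
print(PreLNDecoderBlock()(x).shape)            # torch.Size([2, 16, 768])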

Intended uses & limitations

This model is a fine-tuned version of gpt2 on the wikitext-103-raw-v1 dataset. It achieves the following results on the evaluation set:

  • Loss: 2.9902
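
Because the checkpoint stores ordinary GPT-2 weights, it can be tried on any backend with a standard transformers pipeline (assuming, as is usual for these checkpoints, that tokenizer files are included in the repository):

from transformers import pipeline

generator = pipeline("text-generation", model="Graphcore/gpt2-wikitext-103")
print(generator("The history of natural language processing",
                max_new_tokens=30)[0]["generated_text"])
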
Training and evaluation data

The model was trained and evaluated on the wikitext-103-raw-v1 configuration of the wikitext dataset, loaded with the datasets library as in the command line below.

Training procedure

Trained on 16 Graphcore Mk2 IPUs using optimum-graphcore.

Command line:

python examples/language-modeling/run_clm.py \
  --model_name_or_path gpt2 \
  --ipu_config_name Graphcore/gpt2-small-ipu \
  --dataset_name wikitext \
  --dataset_config_name wikitext-103-raw-v1 \
  --do_train \
  --do_eval \
  --num_train_epochs 10 \
  --dataloader_num_workers 64 \
  --per_device_train_batch_size 1 \
  --per_device_eval_batch_size 1 \
  --gradient_accumulation_steps 128 \
  --output_dir /tmp/clm_output \
  --logging_steps 5 \
  --learning_rate 1e-5 \
  --lr_scheduler_type linear \
  --loss_scaling 16384 \
  --weight_decay 0.01 \
  --warmup_ratio 0.1 \
  --ipu_config_overrides="embedding_serialization_factor=4,optimizer_state_offchip=true,inference_device_iterations=5" \
  --dataloader_drop_last \
  --pod_type pod16
Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • distributed_type: IPU
  • gradient_accumulation_steps: 128
  • total_train_batch_size: 1024
  • total_eval_batch_size: 20
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 10.0
  • training precision: Mixed Precision
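
As a sanity check on the totals above: total_train_batch_size is the micro-batch size times the gradient accumulation steps times a parallelism factor (data-parallel replication and device iterations combined). The remaining factor works out to 8, though its exact split is not listed in the card:

micro_batch = 1           # per_device_train_batch_size
grad_accum = 128          # gradient_accumulation_steps
total_train_batch = 1024  # total_train_batch_size

# Whatever multiplier (replication times device iterations) the run used:
parallel_factor = total_train_batch // (micro_batch * grad_accum)
print(parallel_factor)    # 8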
Training results
***** train metrics *****
    "epoch": 10.0,
    "train_loss": 3.1787637246621623,
    "train_runtime": 4372.4031,
    "train_samples": 114248,
    "train_samples_per_second": 261.293,
    "train_steps_per_second": 0.254
    
***** eval metrics *****
    "eval_loss": 2.990234375,
    "eval_samples": 240,
    "perplexity": 19.89034374461794
Framework versions
  • Transformers 4.18.0.dev0
  • Pytorch 1.10.0+cpu
  • Datasets 2.0.0
  • Tokenizers 0.11.6


License: Apache 2.0 (https://choosealicense.com/licenses/apache-2.0)

Model page: https://huggingface.co/Graphcore/gpt2-wikitext-103


Provider: Graphcore
