philschmid / tiny-bert-sst2-distilled

huggingface.co
Total runs: 6.0K
24-hour runs: 30
7-day runs: 311
30-day runs: -1.4K
Last updated: February 1, 2022
Task: text-classification

Introduction

Model details of tiny-bert-sst2-distilled

This model is a fine-tuned version of google/bert_uncased_L-2_H-128_A-2 on the GLUE dataset. It achieves the following results on the evaluation set:

  • Loss: 1.7305
  • Accuracy: 0.8326
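
As a usage sketch (assuming the transformers library and a backend such as PyTorch are installed), the checkpoint can be loaded with the pipeline API:

```python
# Sketch: sentiment classification with this checkpoint via the transformers
# pipeline API (the model is downloaded from the Hub on first use).
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="philschmid/tiny-bert-sst2-distilled",
)

# A sample SST-2-style movie-review sentence (illustrative input).
result = classifier("a charming and often affecting journey")
print(result)  # a list of {'label': ..., 'score': ...} dicts
```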
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure
Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0007199555649276667
  • train_batch_size: 1024
  • eval_batch_size: 1024
  • seed: 33
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 7
  • mixed_precision_training: Native AMP
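
For illustration only, these settings would map onto a transformers TrainingArguments roughly as below (a sketch, not the author's actual training script; output_dir is a placeholder, and the Adam betas/epsilon listed above are the library defaults):

```python
# Illustrative sketch only: the hyperparameters above expressed as
# transformers TrainingArguments (output_dir is a placeholder).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tiny-bert-sst2-distilled",  # placeholder path
    learning_rate=0.0007199555649276667,
    per_device_train_batch_size=1024,
    per_device_eval_batch_size=1024,
    seed=33,
    num_train_epochs=7,
    lr_scheduler_type="linear",
    fp16=True,  # "Native AMP" mixed-precision training
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults,
    # matching the optimizer settings listed above.
)
```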
Training results
Training Loss   Epoch   Step   Validation Loss   Accuracy
1.77            1.0     66     1.6939            0.8165
0.729           2.0     132    1.5090            0.8326
0.5242          3.0     198    1.5369            0.8257
0.4017          4.0     264    1.7025            0.8326
0.327           5.0     330    1.6743            0.8245
0.2749          6.0     396    1.7305            0.8337
0.2521          7.0     462    1.7305            0.8326
Framework versions
  • Transformers 4.12.3
  • Pytorch 1.9.1
  • Datasets 1.15.1
  • Tokenizers 0.10.3

Runs of philschmid tiny-bert-sst2-distilled on huggingface.co

Total runs: 6.0K
24-hour runs: 30
3-day runs: 40
7-day runs: 311
30-day runs: -1.4K

More information about the tiny-bert-sst2-distilled model

tiny-bert-sst2-distilled is released under the Apache 2.0 license:

https://choosealicense.com/licenses/apache-2.0

tiny-bert-sst2-distilled on huggingface.co

tiny-bert-sst2-distilled is an AI model hosted on huggingface.co, where the philschmid tiny-bert-sst2-distilled model can be used instantly. huggingface.co supports a free trial of the tiny-bert-sst2-distilled model and also provides paid use. The model can be called through an API from Node.js, Python, or plain HTTP.
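
As a sketch of the HTTP route (the endpoint pattern follows Hugging Face's hosted Inference API documentation; HF_TOKEN is a placeholder environment variable for your own access token):

```python
# Sketch: calling the hosted Inference API for this model over plain HTTP
# using only the Python standard library. HF_TOKEN is a placeholder env var.
import json
import os
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/philschmid/tiny-bert-sst2-distilled"

def classify(text: str) -> list:
    """POST a sentence to the Inference API and return the parsed JSON."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps({"inputs": text}).encode("utf-8"),
        headers={"Authorization": f"Bearer {os.environ['HF_TOKEN']}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Only send a request when a token is actually configured.
if os.environ.get("HF_TOKEN"):
    print(classify("a charming and often affecting journey"))
```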

tiny-bert-sst2-distilled huggingface.co Url

https://huggingface.co/philschmid/tiny-bert-sst2-distilled

philschmid tiny-bert-sst2-distilled online free

huggingface.co is an online trial and API platform that integrates tiny-bert-sst2-distilled, including API services, and provides a free online trial of tiny-bert-sst2-distilled. You can try tiny-bert-sst2-distilled online for free by clicking the link below.

philschmid tiny-bert-sst2-distilled online free url in huggingface.co:

https://huggingface.co/philschmid/tiny-bert-sst2-distilled

tiny-bert-sst2-distilled install

tiny-bert-sst2-distilled is an open-source model that any user can download and install for free. At the same time, huggingface.co hosts tiny-bert-sst2-distilled directly, so users can debug and trial the model on huggingface.co without installing it, and it also supports free access through the API.

tiny-bert-sst2-distilled install url in huggingface.co:

https://huggingface.co/philschmid/tiny-bert-sst2-distilled

Provider of tiny-bert-sst2-distilled on huggingface.co

philschmid
