deepvk / deberta-v1-base

huggingface.co
Total runs: 338
24-hour runs: 0
7-day runs: -34
30-day runs: 130
Last updated: August 10, 2023
feature-extraction

Model Details

DeBERTa-base

A pretrained bidirectional encoder for the Russian language. The model was trained with the standard masked language modeling (MLM) objective on large text corpora, including open social data. See the Training Details section for more information.

⚠️ This model contains only the encoder part without any pretrained head.

  • Developed by: deepvk
  • Model type: DeBERTa
  • Languages: Mostly Russian, with a small fraction of other languages
  • License: Apache 2.0
How to Get Started with the Model
from transformers import AutoTokenizer, AutoModel

# Download the tokenizer and the bare encoder (no task head; see the warning above).
tokenizer = AutoTokenizer.from_pretrained("deepvk/deberta-v1-base")
model = AutoModel.from_pretrained("deepvk/deberta-v1-base")

text = "Привет, мир!"  # "Hello, world!"

inputs = tokenizer(text, return_tensors='pt')
outputs = model(**inputs)  # outputs.last_hidden_state has shape (1, seq_len, 768)
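Because the checkpoint ships without a pretrained head, sentence-level features are commonly obtained by mean-pooling `last_hidden_state` over non-padding tokens. A minimal sketch of that pooling on dummy NumPy arrays (the arrays stand in for real model outputs; shapes match the 768-dim encoder, and this recipe is a common convention rather than something the model card prescribes):

```python
import numpy as np

def mean_pool(last_hidden_state: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions.

    last_hidden_state: (batch, seq_len, hidden) float array
    attention_mask:    (batch, seq_len) array of 0/1
    """
    mask = attention_mask[:, :, None].astype(last_hidden_state.dtype)  # (batch, seq_len, 1)
    summed = (last_hidden_state * mask).sum(axis=1)                    # (batch, hidden)
    counts = np.clip(mask.sum(axis=1), a_min=1e-9, a_max=None)         # avoid division by zero
    return summed / counts

# Demo on dummy data shaped like the encoder's output (hidden size 768).
hidden = np.random.rand(2, 4, 768).astype(np.float32)
mask = np.array([[1, 1, 1, 0],   # last token is padding
                 [1, 1, 0, 0]])  # last two tokens are padding
embeddings = mean_pool(hidden, mask)
print(embeddings.shape)  # (2, 768)
```

With real outputs, the same function can be applied to `last_hidden_state` (converted to NumPy) and the tokenizer's `attention_mask`.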
Training Details
Training Data

400 GB of filtered and deduplicated texts in total. A mix of the following data: Wikipedia, Books, Twitter comments, Pikabu, Proza.ru, Film subtitles, News websites, and Social corpus.

Deduplication procedure
  1. Compute shingles of size 5.
  2. Compute a MinHash with 100 seeds, so every sample (text) gets a signature of 100 numbers.
  3. Split each signature into 10 buckets; every bucket, containing 100 / 10 = 10 numbers, is hashed into a single value, giving 10 hashes per sample.
  4. Within each bucket, find candidate duplicates: samples that share a hash. For each candidate pair, compute the pairwise Jaccard similarity; if it exceeds 0.7, the pair is a duplicate.
  5. Gather the duplicates from all buckets and filter them out.
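The steps above describe a standard MinHash-LSH pipeline. A small self-contained sketch with the card's parameters (100 seeds, 10 buckets, 0.7 threshold); the hash family and the choice of character shingles are illustrative assumptions, not details taken from the card:

```python
import hashlib
from itertools import combinations

NUM_SEEDS = 100     # signature length (step 2)
NUM_BUCKETS = 10    # 100 / 10 = 10 numbers per bucket (step 3)
THRESHOLD = 0.7     # Jaccard cutoff (step 4)

def shingles(text, size=5):
    """Step 1: shingles of size 5 (character shingles assumed here)."""
    return {text[i:i + size] for i in range(max(1, len(text) - size + 1))}

def h(value, seed):
    """A seeded hash; the card does not specify the hash family."""
    return int.from_bytes(hashlib.md5(f"{seed}:{value}".encode()).digest()[:8], "big")

def minhash(sh):
    """Step 2: one minimum per seed -> signature of 100 numbers."""
    return [min(h(s, seed) for s in sh) for seed in range(NUM_SEEDS)]

def buckets(sig):
    """Step 3: split the signature into 10 bands, hash each band to one value."""
    rows = NUM_SEEDS // NUM_BUCKETS
    return [hash(tuple(sig[b * rows:(b + 1) * rows])) for b in range(NUM_BUCKETS)]

def jaccard(a, b):
    return len(a & b) / len(a | b)

def find_duplicates(texts):
    """Steps 4-5: candidates share a band hash; confirm with Jaccard > 0.7."""
    sh = [shingles(t) for t in texts]
    sigs = [minhash(s) for s in sh]
    candidates = set()
    for band in range(NUM_BUCKETS):
        seen = {}
        for i, sig in enumerate(sigs):
            seen.setdefault(buckets(sig)[band], []).append(i)
        for group in seen.values():
            candidates.update(combinations(group, 2))
    return {(i, j) for i, j in candidates if jaccard(sh[i], sh[j]) > THRESHOLD}

docs = ["Сегодня хорошая погода в Москве",
        "Сегодня хорошая погода в Москве!",   # near-duplicate of the first
        "Совершенно другой текст про книги"]  # unrelated
print(find_duplicates(docs))  # docs 0 and 1 should collide; doc 2 should not
```

The banding step only proposes candidates (probabilistically, via shared band hashes); the final Jaccard check over the raw shingle sets is what decides duplication.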
Training Hyperparameters
Argument              Value
Training regime       fp16 mixed precision
Optimizer             AdamW
Adam betas            (0.9, 0.98)
Adam eps              1e-6
Weight decay          1e-2
Batch size            2,240
Num training steps    1M
Num warm-up steps     10k
LR scheduler          Linear
Peak LR               2e-5
Gradient norm clip    1.0
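The schedule implied by the table (linear warm-up over 10k steps to the 2e-5 peak, then linear decay over 1M total steps) can be written as a small function. Decaying to zero at the final step is an assumption; the card only says "Linear":

```python
PEAK_LR = 2e-5
WARMUP_STEPS = 10_000
TOTAL_STEPS = 1_000_000  # "1M" in the table

def linear_schedule_lr(step: int) -> float:
    """LR at a given optimizer step: linear warm-up, then linear decay to 0."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    remaining = TOTAL_STEPS - step
    return PEAK_LR * max(0.0, remaining / (TOTAL_STEPS - WARMUP_STEPS))

print(linear_schedule_lr(5_000))      # halfway through warm-up: 1e-5
print(linear_schedule_lr(10_000))     # peak: 2e-5
print(linear_schedule_lr(1_000_000))  # end of training: 0.0
```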

The model was trained for approximately 30 days on a single machine with 8×A100 GPUs.

Architecture details
Argument                 Value
Encoder layers           12
Encoder attention heads  12
Encoder embed dim        768
Encoder FFN embed dim    3,072
Activation function      GeLU
Attention dropout        0.1
Dropout                  0.1
Max positions            512
Vocab size               50,266
Tokenizer type           Byte-level BPE
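A rough weight count can be derived from the table. This back-of-the-envelope sketch counts only the token embeddings, the four attention projections, and the two FFN projections per layer; it ignores biases, LayerNorms, and DeBERTa's extra relative-position and disentangled-attention parameters, so the real total is somewhat higher:

```python
vocab_size = 50_266
d_model = 768
ffn_dim = 3_072
layers = 12

embeddings = vocab_size * d_model   # token embedding matrix
attention = 4 * d_model * d_model   # Q, K, V, and output projections
ffn = 2 * d_model * ffn_dim         # up- and down-projection
total = embeddings + layers * (attention + ffn)

print(f"{total / 1e6:.1f}M")  # prints "123.5M"
```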
Evaluation

We evaluated the model on the Russian SuperGLUE dev set. The best result in each task is marked with an asterisk (*). All models are the same size except the distilled version of DeBERTa.

Model               RCB     PARus   MuSeRC  TERRa   RUSSE   RWSD    DaNetQA  Score
vk-deberta-distill  0.433   0.56    0.625   0.59    0.943   0.569   0.726    0.635
vk-roberta-base     0.46    0.56    0.679   0.769*  0.960   0.569   0.658    0.665
vk-deberta-base     0.450   0.61*   0.722*  0.704   0.948   0.578   0.76*    0.682*
vk-bert-base        0.467   0.57    0.587   0.704   0.953   0.583*  0.737    0.657
sber-bert-base      0.491*  0.61*   0.663   0.769*  0.962*  0.574   0.678    0.678


Links

  • Model page: https://huggingface.co/deepvk/deberta-v1-base
  • License (Apache 2.0): https://choosealicense.com/licenses/apache-2.0
ORGANIZATIONS
