BioTinyBERT is the result of continually pre-training the TinyBERT model on the PubMed dataset for 200k training steps with a total batch size of 192.
Initialisation
We initialise our model with the pre-trained checkpoint of the TinyBERT model available on Hugging Face.
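The resulting checkpoint can be loaded like any other masked-language model. A minimal usage sketch with the transformers library, assuming the published Hub id `nlpie/bio-tinybert` and a standard `[MASK]`-token setup:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Load the continually pre-trained checkpoint from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("nlpie/bio-tinybert")
model = AutoModelForMaskedLM.from_pretrained("nlpie/bio-tinybert")

# Fill in a masked biomedical token (example sentence is illustrative).
inputs = tokenizer("Aspirin is used to treat [MASK].", return_tensors="pt")
outputs = model(**inputs)

# Decode the model's top prediction at the mask position.
mask_idx = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = outputs.logits[0, mask_idx].argmax(-1)
print(tokenizer.decode(predicted_id))
```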
Architecture
This model uses 4 hidden layers with a hidden dimension and embedding size of 312, resulting in a total of roughly 15M parameters.
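The stated parameter total can be sanity-checked from the architecture. A quick back-of-the-envelope count, assuming the standard TinyBERT-General-4L-312D configuration (hidden size 312, FFN intermediate size 1200, BERT vocabulary of 30,522, 512 positions, 2 token types; these values are not all stated in the card):

```python
# Assumed TinyBERT 4L-312D configuration (hedged; see lead-in).
V, P, T = 30522, 512, 2   # vocab, positions, token types
H, I, L = 312, 1200, 4    # hidden size, FFN size, layers

embeddings = (V + P + T) * H + 2 * H      # word/pos/type tables + LayerNorm
per_layer = (
    4 * (H * H + H)                       # Q, K, V, and output projections
    + (H * I + I) + (I * H + H)           # feed-forward up/down projections
    + 2 * (2 * H)                         # two LayerNorms
)
pooler = H * H + H
total = embeddings + L * per_layer + pooler
print(f"{total / 1e6:.1f}M parameters")   # → 14.4M, i.e. roughly 15M
```

This agrees with the ~15M figure quoted above; the embedding table alone accounts for about two thirds of the parameters.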
Citation
If you use this model, please consider citing the following paper:
@article{rohanian2023effectiveness,
  title={On the effectiveness of compact biomedical transformers},
  author={Rohanian, Omid and Nouriborji, Mohammadmahdi and Kouchaki, Samaneh and Clifton, David A},
  journal={Bioinformatics},
  volume={39},
  number={3},
  pages={btad103},
  year={2023},
  publisher={Oxford University Press}
}