minishlab / potion-base-8M

huggingface.co
Total runs: 2.1M
24-hour runs: 14.2K
3-day runs: 53.7K
7-day runs: 187.9K
30-day runs: 187.9K
Last updated: March 27, 2026

potion-base-8M Model Card

This Model2Vec model is pre-trained using Tokenlearn. It is a distilled version of the baai/bge-base-en-v1.5 Sentence Transformer. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. It is designed for applications where computational resources are limited or where real-time performance is critical.

Installation

Install model2vec using pip:

pip install model2vec
Usage

Load this model using the from_pretrained method:

from model2vec import StaticModel

# Load a pretrained Model2Vec model
model = StaticModel.from_pretrained("minishlab/potion-base-8M")

# Compute text embeddings
embeddings = model.encode(["Example sentence"])
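To make the speed claim concrete, here is a minimal sketch of what a static embedding model does conceptually: encoding is just a table lookup plus mean pooling, with no transformer forward pass. The vocabulary, vectors, and `encode` helper below are toy illustrations, not the actual potion-base-8M internals.

```python
import numpy as np

# Toy illustration: a static model is essentially a lookup table of
# token vectors; encoding a sentence is (roughly) mean-pooling the
# vectors of its tokens.
vocab = {"example": 0, "sentence": 1}
embedding_table = np.array([
    [0.1, 0.2, 0.3],   # vector for "example"
    [0.4, 0.5, 0.6],   # vector for "sentence"
])

def encode(tokens):
    """Mean-pool the static vectors of the given tokens."""
    ids = [vocab[t] for t in tokens]
    return embedding_table[ids].mean(axis=0)

vec = encode(["example", "sentence"])
print(vec)  # [0.25 0.35 0.45]
```

Because there is no attention or feed-forward computation, encoding cost scales only with the number of tokens, which is why static models run orders of magnitude faster than a full sentence transformer.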
How it works

Model2Vec creates a small, static model that outperforms other static embedding models by a large margin on all tasks on MTEB. This model is pre-trained using Tokenlearn. It's created using the following steps:

  • Distillation: first, a model is distilled from a sentence transformer model using Model2Vec.
  • Training data creation: the sentence transformer model is used to create training data by creating mean output embeddings on a large corpus.
  • Training: the distilled model is trained on the training data using Tokenlearn.
  • Post-training re-regularization: after training, the model is re-regularized by weighting the tokens based on their frequency, applying PCA, and finally applying SIF weighting.
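
The SIF weighting in the final step can be sketched as follows. Frequent tokens are down-weighted so that rare, informative tokens dominate the pooled embedding. The smoothing constant `a` and the toy frequency counts below are illustrative assumptions, not the values actually used for potion-base-8M.

```python
# Sketch of SIF (Smooth Inverse Frequency) weighting: each token gets
# weight a / (a + p(w)), where p(w) is its relative corpus frequency.
a = 1e-3                                      # illustrative smoothing constant
token_counts = {"the": 500_000, "potion": 12}  # toy corpus frequencies
total = sum(token_counts.values())

def sif_weight(token):
    freq = token_counts[token] / total  # relative frequency p(w)
    return a / (a + freq)               # SIF weight

print(sif_weight("the"))     # very common token -> weight near 0
print(sif_weight("potion"))  # rare token -> weight near 1
```

Scaling each token's vector by this weight before pooling suppresses stop-word-like tokens without discarding them entirely.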

The results for this model can be found on the Model2Vec results page.

Library Authors

Model2Vec was developed by the Minish Lab team, consisting of Stephan Tulkens and Thomas van Dongen.

Citation

Please cite the Model2Vec repository if you use this model in your work.

@software{minishlab2024model2vec,
  author = {Stephan Tulkens and Thomas van Dongen},
  title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},
  year = {2024},
  url = {https://github.com/MinishLab/model2vec},
}


License

potion-base-8M is released under the MIT license: https://choosealicense.com/licenses/mit

Model URL

https://huggingface.co/minishlab/potion-base-8M
