vietnamese-document-embedding
is a document embedding model for the Vietnamese language with a context length of up to 8096 tokens. It is a specialized long-text embedding model trained specifically for Vietnamese, built upon
gte-multilingual
and trained using Multi-Negative Ranking Loss, Matryoshka2dLoss, and SimilarityLoss.
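The first of these objectives can be sketched in plain NumPy. This is an illustration only, not the training code: the actual model is trained with sentence-transformers' loss implementations, and the batch, dimensionality, and scale value below are arbitrary. Multi-Negative Ranking Loss treats each anchor's paired positive as the target and every other positive in the batch as an in-batch negative, then applies cross-entropy over scaled cosine similarities.

```python
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    """Sketch of Multiple Negatives Ranking Loss: cross-entropy over the
    (batch x batch) cosine-similarity matrix, where the diagonal entries
    are the true anchor/positive pairs and off-diagonal entries act as
    in-batch negatives."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sims = scale * (a @ p.T)  # scaled cosine similarity matrix
    # log-softmax per row; the diagonal holds the true-pair log-probabilities
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 8))
# near-duplicates stand in for genuine positive pairs
positives = anchors + 0.05 * rng.normal(size=(4, 8))
print(mnr_loss(anchors, positives))
```

The loss is driven toward zero when each anchor is far more similar to its own positive than to the other positives in the batch, which is what pushes semantically matched Vietnamese sentence pairs together.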
The model underwent a rigorous four-stage training and fine-tuning process, each tailored to enhance its ability to generate precise and contextually relevant sentence embeddings for the Vietnamese language. Below is an outline of these stages:
Stage 1: Initial Training
Method: Training using Multi-Negative Ranking Loss and Matryoshka2dLoss. This stage focused on improving the model's ability to discern and rank nuanced differences in sentence semantics.
Stage 2: Fine-tuning for Semantic Textual Similarity on STS Benchmark
Method: Fine-tuning specifically for the semantic textual similarity benchmark using Siamese BERT-Networks configured with the 'sentence-transformers' library. This stage honed the model's precision in capturing semantic similarity across various types of Vietnamese texts.
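The Stage 2 objective can also be sketched in NumPy. This is an assumption-labeled illustration of the cosine-similarity regression idea behind sentence-transformers' CosineSimilarityLoss (the exact loss used in training is not spelled out here): the cosine similarity of each sentence-pair embedding is regressed onto the gold STS score, rescaled to [0, 1].

```python
import numpy as np

def cosine_similarity_loss(emb1, emb2, gold_scores):
    """Sketch of an STS fine-tuning objective: mean squared error between
    the predicted cosine similarity of each pair and its gold score."""
    u = emb1 / np.linalg.norm(emb1, axis=1, keepdims=True)
    v = emb2 / np.linalg.norm(emb2, axis=1, keepdims=True)
    pred = np.sum(u * v, axis=1)  # cosine similarity per pair
    return np.mean((pred - gold_scores) ** 2)

rng = np.random.default_rng(1)
e1 = rng.normal(size=(3, 8))
e2 = rng.normal(size=(3, 8))
gold = np.array([0.9, 0.5, 0.1])  # STS scores rescaled from 0-5 to 0-1
print(cosine_similarity_loss(e1, e2, gold))
```

In the Siamese setup both sentences of a pair pass through the same encoder, so minimizing this loss calibrates the embedding space so that cosine similarity tracks human similarity judgments.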
from sentence_transformers import SentenceTransformer

sentences = ["Hà Nội là thủ đô của Việt Nam", "Đà Nẵng là thành phố du lịch"]

# trust_remote_code=True is required because the model ships custom modeling code
model = SentenceTransformer('dangvantuan/vietnamese-document-embedding', trust_remote_code=True)
embeddings = model.encode(sentences)  # one embedding vector per sentence
print(embeddings)
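A typical next step is to compare the returned embeddings with cosine similarity. The sketch below uses random vectors as a stand-in for the model's output (so it runs without downloading the model), and 768 is an assumed embedding dimensionality, not a value confirmed by this card:

```python
import numpy as np

# Random vectors stand in for model.encode(sentences), which returns
# one embedding row per input sentence.
rng = np.random.default_rng(42)
embeddings = rng.normal(size=(2, 768))  # 768 is an assumed dimensionality

def cos_sim(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cos_sim(embeddings[0], embeddings[1]))
```

With real model output, a score near 1.0 indicates the two Vietnamese sentences are semantically close; unrelated sentences score much lower.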
@article{reimers2019sentence,
title={Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks},
author={Reimers, Nils and Gurevych, Iryna},
journal={arXiv preprint arXiv:1908.10084},
year={2019}
}
@article{zhang2024mgte,
title={mGTE: Generalized Long-Context Text Representation and Reranking Models for Multilingual Text Retrieval},
author={Zhang, Xin and Zhang, Yanzhao and Long, Dingkun and Xie, Wen and Dai, Ziqi and Tang, Jialong and Lin, Huan and Yang, Baosong and Xie, Pengjun and Huang, Fei and others},
journal={arXiv preprint arXiv:2407.19669},
year={2024}
}
@article{li2023towards,
title={Towards general text embeddings with multi-stage contrastive learning},
author={Li, Zehan and Zhang, Xin and Zhang, Yanzhao and Long, Dingkun and Xie, Pengjun and Zhang, Meishan},
journal={arXiv preprint arXiv:2308.03281},
year={2023}
}
@article{li20242d,
title={2D Matryoshka Sentence Embeddings},
author={Li, Xianming and Li, Zongxi and Li, Jing and Xie, Haoran and Li, Qing},
journal={arXiv preprint arXiv:2402.14776},
year={2024}
}