Salesforce/SFR-Embedding-Code is a generalist embedding model family for multilingual and multi-task code and text retrieval. It outperforms a range of open-source code embedding models across multiple code retrieval tasks.
This release is for research purposes only in support of an academic paper. Our models, datasets, and code are not specifically designed or evaluated for all downstream purposes. We strongly recommend users evaluate and address potential concerns related to accuracy, safety, and fairness before deploying this model. We encourage users to consider the common limitations of AI, comply with applicable laws, and leverage best practices when selecting use cases, particularly for high-risk scenarios where errors or misuse could significantly impact people’s lives, rights, or safety. For further guidance on use cases, refer to our AUP and AI AUP.
License Statement:
Users need to make their own assessment regarding any obligations or responsibilities under the corresponding licenses or terms and conditions pertaining to the original datasets and data. This release is for research purposes only in support of an academic paper.
This released model is a fine-tuned version of Gemma, and Gemma is provided under and subject to the Gemma Terms of Use found at ai.google.dev/gemma/terms. Additionally, the use of this model is restricted as set forth in the Gemma Prohibited Use Policy at ai.google.dev/gemma/prohibited_use_policy ("Prohibited Use Policy"), which is hereby incorporated by reference into this Agreement.
Performance on CoIR Benchmark
| Model               | Model Size | CoIR AVG (NDCG@10) |
|---------------------|------------|--------------------|
| SFR-Embedding-Code  | 2B         | 67.4               |
| CodeSage-Large-v2   | 1.3B       | 64.2               |
| CodeSage-Large      | 1.3B       | 61.0               |
| SFR-Embedding-Code  | 400M       | 61.9               |
| CodeRankEmbed       | 137M       | 60.1               |
| CodeSage-Base       | 356M       | 57.5               |
| Voyage-Code-002     | -          | 56.3               |
| CodeSage-Small      | 130M       | 54.4               |
SFR-Embedding Team († indicates co-leaders)
Ye Liu
Rui Meng
Shafiq Rayhan Joty
Silvio Savarese
Caiming Xiong †
Yingbo Zhou †
Semih Yavuz †
How to run
Transformers
import torch.nn.functional as F
from transformers import AutoModel
# Each query needs to be accompanied by a corresponding instruction describing the task.
query_instruction_example = "Given Code or Text, retrieval relevant content"
queries = [
"how to implement quick sort in Python?"
]
# No instruction needed for retrieval passages
passages = [
"def quick_sort(arr):\n if len(arr) <= 1:\n return arr\n pivot = arr[len(arr) // 2]\n left = [x for x in arr if x < pivot]\n middle = [x for x in arr if x == pivot]\n right = [x for x in arr if x > pivot]\n return quick_sort(left) + middle + quick_sort(right)",
"def bubble_sort(arr):\n n = len(arr)\n for i in range(n):\n for j in range(0, n-i-1):\n if arr[j] > arr[j+1]:\n arr[j], arr[j+1] = arr[j+1], arr[j]\n return arr"
]
# load the model; trust_remote_code enables its custom encode_queries / encode_corpus methods
model = AutoModel.from_pretrained('Salesforce/SFR-Embedding-Code-2B_R', trust_remote_code=True)
# get the embeddings
max_length = 32768
query_embeddings = model.encode_queries(queries, instruction=query_instruction_example, max_length=max_length)
passage_embeddings = model.encode_corpus(passages, max_length=max_length)
# normalize embeddings
query_embeddings = F.normalize(query_embeddings, p=2, dim=1)
passage_embeddings = F.normalize(passage_embeddings, p=2, dim=1)
scores = (query_embeddings @ passage_embeddings.T) * 100
print(scores.tolist())
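The scores are scaled cosine similarities between each query and each passage. As a small follow-up sketch (plain Python/PyTorch post-processing, not part of the model's API), the passages can be ranked per query like this:

# Rank passages for each query by descending similarity score.
# Assumes `scores` is the [num_queries x num_passages] tensor computed above.
for qi, query in enumerate(queries):
    ranked = sorted(enumerate(scores[qi].tolist()), key=lambda pair: pair[1], reverse=True)
    print(f"Query: {query}")
    for passage_idx, score in ranked:
        print(f"  passage {passage_idx}: score={score:.2f}")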
Citation
@article{liu2024codexembed,
title={CodeXEmbed: A Generalist Embedding Model Family for Multilingual and Multi-task Code Retrieval},
author={Liu, Ye and Meng, Rui and Joty, Shafiq Rayhan and Savarese, Silvio and Xiong, Caiming and Zhou, Yingbo and Yavuz, Semih},
journal={arXiv preprint arXiv:2411.12644},
year={2024}
}