SAM 3

SAM 3 (Segment Anything with Concepts) is a unified foundation model from Meta for promptable segmentation in images and videos. It detects, segments, and tracks objects using text or visual prompts such as points, boxes, and masks. SAM 3 introduces the ability to exhaustively segment all instances of an open-vocabulary concept specified by a short text phrase, handling over 50x more unique concepts than existing benchmarks. SAM 3.1 builds on this with Object Multiplex, a shared-memory approach to joint multi-object tracking that delivers roughly 7x faster inference at 128 objects on a single H100 GPU without sacrificing accuracy, along with improved VOS performance on 6 of 7 benchmarks.
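To make the prompt types described above concrete, here is a minimal pure-Python sketch that models text and visual prompts as simple dataclasses. These class and field names are hypothetical, chosen for illustration only; they are not the actual SAM 3 API, which is documented in the SAM 3 GitHub repository.

```python
from dataclasses import dataclass, field

# Hypothetical data model of the prompt types SAM 3 accepts.
# These names are illustrative, NOT the real API (see the SAM 3 GitHub repo).

@dataclass
class TextPrompt:
    # Short open-vocabulary noun phrase, e.g. "yellow school bus";
    # SAM 3 segments every instance matching the concept.
    phrase: str

@dataclass
class PointPrompt:
    x: float
    y: float
    positive: bool = True  # positive points include a region, negative exclude

@dataclass
class BoxPrompt:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

@dataclass
class MaskPrompt:
    # Binary mask as a nested H x W list of 0/1, for illustration.
    mask: list = field(default_factory=list)

def describe(prompt) -> str:
    """Return a short human-readable description of a prompt."""
    if isinstance(prompt, TextPrompt):
        return f"text concept: {prompt.phrase!r}"
    if isinstance(prompt, PointPrompt):
        sign = "+" if prompt.positive else "-"
        return f"point ({sign}) at ({prompt.x}, {prompt.y})"
    if isinstance(prompt, BoxPrompt):
        return f"box [{prompt.x_min}, {prompt.y_min}, {prompt.x_max}, {prompt.y_max}]"
    return "mask prompt"

prompts = [TextPrompt("striped cat"), PointPrompt(120.0, 64.0), BoxPrompt(0, 0, 50, 50)]
for p in prompts:
    print(describe(p))  # one description line per prompt
```

The key distinction this sketch captures is that a text prompt targets *all* instances of a concept, while point, box, and mask prompts target a specific object instance.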
This repository hosts only the SAM 3.1 model checkpoints; there is no Hugging Face Transformers integration. For installation, code, usage examples, and full documentation, please visit the SAM 3 GitHub repository.
Runs of facebook/sam3.1 on huggingface.co:
- 24-hour: 3.4K
- 3-day: 5.5K
- 7-day: 17.0K
- 30-day: 62.9K
- Total: 63.9K