Trinity-Large-TrueBase is a base pretraining checkpoint from Arcee AI's Trinity Large training run. It is a 398B-parameter sparse Mixture-of-Experts (MoE) model with approximately 13B active parameters per token. The checkpoint was captured after 10 trillion tokens of pretraining, prior to learning-rate annealing and before any instruction tuning or reinforcement learning.
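A quick sanity check on the sparsity these figures imply (numbers taken from the paragraph above):

```python
# Sparsity implied by the model-card figures: ~13B active of 398B total parameters.
total_params = 398e9
active_params = 13e9

active_fraction = active_params / total_params
print(f"~{active_fraction:.1%} of parameters are active per token")
```

So each forward pass touches roughly 3% of the weights, which is what makes a 398B MoE tractable to serve.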
This checkpoint is intended for research, probing, ablation studies, and downstream fine-tuning and comes without any pre-baked alignment, instruction formatting, or preference optimization.
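A minimal loading sketch, assuming the checkpoint is published in the standard Hugging Face format under the repo id `arcee-ai/Trinity-Large-TrueBase` (as listed on the hub) and that bf16 weights are appropriate; at 398B parameters this requires a multi-GPU node, so `device_map="auto"` is used to shard the weights:

```python
MODEL_ID = "arcee-ai/Trinity-Large-TrueBase"  # repo id assumed from the hub listing

def load_truebase(model_id: str = MODEL_ID):
    """Load tokenizer and sharded model weights; imports are kept local
    since torch/transformers are heavyweight dependencies."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # bf16 assumed; check the checkpoint's config
        device_map="auto",           # shard across available GPUs
    )
    return tokenizer, model

# Usage (downloads hundreds of GB of weights; multi-GPU node required):
# tokenizer, model = load_truebase()
# inputs = tokenizer("The theory of general relativity", return_tensors="pt").to(model.device)
# print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```

Since this is a raw base checkpoint with no instruction formatting, prompts should be plain text continuations; there is no chat template to apply.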
More details on the training of Trinity Large are available in the technical report.
Model Variants
The Trinity Large family includes the following checkpoints from the same training run:
Trinity-Large-TrueBase (this release): 10T-token pre-anneal checkpoint with no instruction data
Trinity-Large-Base: full 17T-token pretrained foundation model with mid-training anneals
Intended Use Cases
Studying emergent behavior from large-scale pretraining
Sparse MoE routing and load-balancing research
Interpretability, probing, and ablation studies
Domain-specific fine-tuning from a clean base
Academic and industrial foundation model research
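For the routing and load-balancing use cases above, here is a generic top-k gating sketch. It is illustrative only: the source does not specify Trinity's expert count, router design, or auxiliary loss, so the expert count is hypothetical and the balancing term is the well-known Switch-Transformer-style load-balancing loss, not necessarily what Trinity uses.

```python
import numpy as np

def top_k_route(router_logits: np.ndarray, k: int = 2):
    """Top-k expert selection with softmax-renormalized gate weights.

    router_logits: (num_tokens, num_experts)
    returns: expert indices (num_tokens, k) and gate weights (num_tokens, k)
    """
    idx = np.argsort(router_logits, axis=-1)[:, -k:]      # top-k expert ids per token
    top = np.take_along_axis(router_logits, idx, axis=-1)
    gates = np.exp(top - top.max(axis=-1, keepdims=True))
    gates /= gates.sum(axis=-1, keepdims=True)            # renormalize over chosen experts
    return idx, gates

def load_balancing_loss(router_logits: np.ndarray, idx: np.ndarray) -> float:
    """Switch-Transformer-style auxiliary loss: num_experts * sum_e f_e * p_e,
    where f_e is the fraction of routing assignments sent to expert e and p_e
    is the mean router probability for expert e. Uniform routing minimizes it."""
    n_tokens, n_experts = router_logits.shape
    probs = np.exp(router_logits - router_logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    counts = np.bincount(idx.ravel(), minlength=n_experts)
    f = counts / idx.size          # fraction of assignments per expert
    p = probs.mean(axis=0)         # mean router probability per expert
    return float(n_experts * np.sum(f * p))

rng = np.random.default_rng(0)
logits = rng.normal(size=(16, 8))  # 16 tokens, 8 hypothetical experts
idx, gates = top_k_route(logits, k=2)
aux = load_balancing_loss(logits, idx)
```

With k=2 of 8 experts active, each token runs only a quarter of the expert FLOPs, which is the same mechanism that lets a 398B-parameter model activate only ~13B parameters per token.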
Rationale for Release
Most base model releases include instruction data, annealed training dynamics, or early alignment stages. Trinity-Large-TrueBase excludes these, providing an opportunity to study what large-scale models learn from pretraining data alone. This checkpoint is intended as a foundation for research rather than as a finished conversational assistant.
Known Limitations
Not aligned for safety, helpfulness, or conversational tone
Requires substantial compute and expertise to fine-tune
May exhibit raw or unstable behaviors typical of unaligned models
No extended-context tuning beyond the 8K pretraining window
License
Trinity-Large-TrueBase is released under the Apache License, Version 2.0.