Trinity models
Browse all models from this model family.
- **Trinity Large Preview** (Arcee AI): a 400B-parameter (about 13B active) frontier MoE model with long-context comprehension, tuned for strong reasoning, coding, and multi-step agents; served via hosted preview APIs. Released 28d ago.
- **Trinity Mini** (Arcee AI): a 26B-parameter (3B active) MoE model with a 128K-token context window, tuned for strong reasoning, function calling, and multi-step agents while remaining efficient for enterprise backends. Released 2mo ago.
- **Trinity Nano Base** (Arcee AI): the base 6B-parameter (1B active) Trinity Nano checkpoint, released before fine-tuning and meant to be domain-tuned rather than used directly for chat; trained on about 10T tokens and licensed under Apache-2.0. Released 2mo ago.
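Since Trinity Mini is described as tuned for function calling over hosted APIs, a client would typically send a tool schema alongside the chat messages. The sketch below builds such a request payload in the common OpenAI-compatible shape; the model identifier, tool name, and schema details are assumptions for illustration, not documented values.

```python
import json

def build_chat_request(user_message: str) -> dict:
    """Build a chat-completions payload with one example tool definition.

    The "trinity-mini" model ID and the get_weather tool are hypothetical;
    substitute the real model name and tools your deployment exposes.
    """
    return {
        "model": "trinity-mini",  # assumed model identifier
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical tool
                    "description": "Look up current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        "tool_choice": "auto",  # let the model decide whether to call the tool
    }

payload = build_chat_request("What's the weather in Oslo?")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to whichever chat-completions endpoint the hosted preview API exposes; the model replies either with plain text or with a tool call the client executes and feeds back.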
