
Nemotron 3 Super

By NVIDIA
Nemotron 3 Super is a hybrid Mixture-of-Experts model with 12B active and 120B total parameters. It introduces LatentMoE for improved accuracy, multi-token prediction layers for faster inference, and NVFP4 pretraining; it supports context lengths up to 1M tokens and ships with published checkpoints and training datasets.
Released: March 11, 2026

Overview

Nemotron 3 Super is NVIDIA’s open hybrid Mamba-Transformer Mixture-of-Experts model, designed for high-throughput agentic workloads and long-context reasoning.

About NVIDIA

Industry: Computer Hardware Manufacturing
Company Size: 36,000 employees
Location: Santa Clara, California, US
Website: nvidia.com

Tools using Nemotron 3 Super

No tools found for this model yet.

Last updated: March 12, 2026