Overview
A code-focused family of Qwen models aimed at code generation, reasoning, and code repair, released in multiple parameter sizes.
About Alibaba
Chinese e-commerce and cloud-computing giant behind Taobao, Tmall, and (via affiliate Ant Group) Alipay.
Tools using Qwen 2.5 Coder 32B
- Enterprise-grade open-source AI inference at unlimited scale.
- Personal top 5 most convenient tools for creating content for a website or social media: writing text and generating images immediately.
Qwen — v3.5

Native multimodal VLM: open-weights release of Qwen3.5-397B-A17B, positioned as a vision-language model with strong reasoning, coding, agent capability, and multimodal understanding.

- 397B total, 17B active: the sparse MoE design activates 17B of 397B parameters per forward pass, targeting high capability at lower runtime cost.
- Hybrid attention for inference efficiency: fuses linear attention (Gated Delta Networks) with sparse MoE to optimize speed and cost without dropping performance.
- Massive multilingual expansion: language and dialect coverage increases from 119 to 201, aiming for broader global usability.
- Qwen3.5-Plus hosted tier: 1M-token context by default with built-in tools and adaptive tool use, plus modes like Auto (tool use), Thinking (deep reasoning), and Fast (instant) for different latency and cost needs.
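The "17B active out of 397B total" claim follows from how sparse mixture-of-experts routing works: a router scores all experts per token but only runs the top-k of them. The toy sketch below illustrates that mechanism with made-up sizes (d_model, n_experts, top_k are illustrative assumptions, not Qwen's real configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes, not Qwen's actual config.
d_model = 16        # hidden size
n_experts = 8       # total experts in the layer
top_k = 2           # experts actually run per token

# Each expert is a small feed-forward matrix; the router scores all
# experts, but only the top_k highest-scoring ones are executed.
experts = [rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
           for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) / np.sqrt(d_model)

def moe_forward(x):
    """Route one token through its top_k experts only."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]               # chosen expert indices
    gates = np.exp(logits[top] - logits[top].max())
    gates /= gates.sum()                            # softmax over chosen experts
    out = sum(g * (x @ experts[i]) for g, i in zip(gates, top))
    return out, top

token = rng.standard_normal(d_model)
y, used = moe_forward(token)
# Only top_k / n_experts of the expert parameters are touched per token,
# analogous to activating 17B of 397B parameters.
```

Per-token compute scales with the active experts (here 2 of 8), not the total parameter count, which is why a 397B-parameter model can run at roughly the cost of a 17B dense forward pass.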
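The linear-attention side of the hybrid design keeps a constant-size state that is decayed and updated once per token, instead of attending over the whole sequence. The following is a toy sketch of a gated delta-rule recurrence of the kind Gated DeltaNet-style layers use; the shapes, gate values, and update form are illustrative assumptions, not Qwen's actual layer:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dimensions (illustrative, not Qwen's).
d_k, d_v, T = 4, 4, 6
q = rng.standard_normal((T, d_k))
k = rng.standard_normal((T, d_k))
k /= np.linalg.norm(k, axis=1, keepdims=True)   # unit-norm keys
v = rng.standard_normal((T, d_v))
alpha = rng.uniform(0.8, 1.0, T)                # per-step decay gate
beta = rng.uniform(0.1, 0.9, T)                 # per-step write-strength gate

S = np.zeros((d_v, d_k))                        # fixed-size state matrix
outs = []
for t in range(T):
    # Gated delta rule: decay the old state, erase the old association
    # along k_t, then write the new (k_t -> v_t) association.
    S = alpha[t] * (S - beta[t] * (S @ np.outer(k[t], k[t]))) \
        + beta[t] * np.outer(v[t], k[t])
    outs.append(S @ q[t])                       # read-out for this token
outs = np.stack(outs)
```

Because the state S has fixed size, each token costs O(d_k * d_v) regardless of sequence length, which is what makes this kind of layer attractive for long-context inference compared with quadratic softmax attention.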
