Alibaba has released Qwen3-Max-Preview, its largest AI model to date, with over one trillion parameters. The model is accessible through Qwen Chat, the Alibaba Cloud API, OpenRouter, and Hugging Face's AnyCoder tool. It supports a 262,144-token context window and includes context caching to speed up multi-turn sessions.
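Because the model is served through OpenAI-compatible endpoints on OpenRouter and Alibaba Cloud, a chat request looks much like any other hosted-model call. The sketch below is illustrative only: the model slug (`qwen/qwen3-max-preview`) and the environment variable name are assumptions and should be checked against the provider's current documentation.

```python
import os
from openai import OpenAI  # the standard OpenAI SDK, reused here for an OpenAI-compatible endpoint

# OpenRouter exposes an OpenAI-compatible API; the model slug below is an assumption,
# so confirm it against OpenRouter's model list before use.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # hypothetical environment variable name
)

response = client.chat.completions.create(
    model="qwen/qwen3-max-preview",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarise the trade-offs of very long context windows."},
    ],
    max_tokens=512,
)

print(response.choices[0].message.content)
```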
Qwen3-Max-Preview posts competitive benchmark results, outperforming Qwen3-235B and rivalling models such as Claude Opus 4 and DeepSeek-V3.1, with strong showings on reasoning, coding, and general tasks. Alibaba Cloud prices the model with tiered, token-based rates. Its design blends fast, non-thinking replies with slower, stepwise thinking for complex tasks, and it was trained on a new corpus of roughly 36 trillion tokens spanning 119 languages.
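Tiered pricing means the per-token rate depends on how large the prompt is. The tier boundaries and rates below are placeholders rather than Alibaba Cloud's published prices; the sketch only shows how a bracket-based cost calculation of this kind works.

```python
# Hypothetical tier table: (max input tokens for the tier, USD per 1M input tokens).
# These boundaries and rates are illustrative placeholders, not Alibaba Cloud's published pricing.
TIERS = [
    (32_000, 0.80),
    (128_000, 1.20),
    (262_144, 2.40),
]

def input_cost_usd(input_tokens: int) -> float:
    """Return the input cost of a request, using the rate of the tier its prompt falls into."""
    for tier_limit, rate_per_million in TIERS:
        if input_tokens <= tier_limit:
            return input_tokens / 1_000_000 * rate_per_million
    raise ValueError("prompt exceeds the 262,144-token context window")

print(f"${input_cost_usd(50_000):.4f}")  # a 50k-token prompt lands in the second tier
```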
Unlike previous Qwen releases, Qwen3-Max-Preview is not open-weight; access is limited to APIs and partner platforms, underscoring Alibaba's focus on commercialisation. Although it is not marketed as a reasoning model, early results indicate structured reasoning capabilities on complex tasks.