Qualcomm has launched its AI200 and AI250 chips to compete with Nvidia in the AI data centre market. The AI200 is slated for release next year, with the AI250 to follow in 2027. Both chips target AI inference, the running of trained models, and Qualcomm says they offer better power efficiency, lower costs and larger memory capacity than rival accelerators. The AI200 will be available as a standalone component, as add-in cards, or as full server racks. Saudi Arabia's Humain will be the first customer to deploy the new chips, planning 200 megawatts of computing capacity built on them starting in 2026.
The chips are built around Qualcomm's Hexagon neural processing units (NPUs) and support up to 768GB of memory per card, more than comparable Nvidia and AMD offerings. Qualcomm also highlights its software ecosystem and open-platform support, which it says make it straightforward to integrate and scale existing AI models. The move marks a strategic shift for the company: repurposing its mobile neural-processing technology for the enterprise data centre, with efficiency and cost as the main selling points.
Following the announcement, Qualcomm's shares jumped 15%, their largest intraday gain in over six months. The company is positioning the chips as a cost-effective alternative for data centres, emphasising energy efficiency and a lower total cost of ownership. Each rackmount system draws 160kW, a power envelope comparable to Nvidia-based GPU racks.
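Those two figures give a rough sense of scale. The sketch below is a back-of-envelope estimate, not a number from the announcement: it simply assumes the full 200-megawatt Humain deployment were built entirely from 160kW racks, ignoring cooling and other overheads.

```python
# Back-of-envelope sketch only: assumes the entire 200 MW Humain deployment is
# built from 160 kW AI200/AI250 racks, with no allowance for cooling or overhead.
planned_capacity_w = 200e6  # 200 megawatts planned from 2026 (per the announcement)
rack_power_w = 160e3        # 160 kW per rackmount system (Qualcomm's figure)

implied_racks = planned_capacity_w / rack_power_w
print(f"Implied rack count at full build-out: ~{implied_racks:,.0f}")  # ~1,250
```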
Related Articles
 - OpenAI's Massive Chip Investment
 - China Tightens Nvidia Chip Imports
 - OpenAI's Trillion-Dollar Compute Expansion
 - Nvidia's OpenAI Bet Questioned
