Qualcomm Makes Bold Shift to LPDDR Memory in Targeting AI Inference
With its new AI200 and AI250 chips, Qualcomm is entering the rack-scale AI infrastructure market and making a bold break from the industry norm set by competitors NVIDIA and AMD: the solutions use LPDDR mobile memory instead of the more traditional high-bandwidth memory (HBM).
Why Choose LPDDR Memory Over HBM? A Strategic Choice
Qualcomm's calculus in choosing LPDDR memory comes down to efficiency for AI inference tasks. This "near-memory" strategy aims to reduce the energy and cost associated with data movement. In this context, LPDDR offers the following benefits:
- Higher Memory Density: The new accelerators can be fitted with up to 768 GB of LPDDR memory, substantially more than current HBM-based solutions aimed at inference.
- Power Efficiency: LPDDR memory consumes less power per bit transferred than HBM.
- Cost-Effectiveness: LPDDR memory modules are, in general, cheaper than their modern HBM counterparts.
- Thermal Dissipation: The memory dissipates less heat, aiding thermal efficiency.
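As a rough illustration of what the capacity advantage means in practice, the sketch below estimates the largest model that fits in a given memory budget at common inference precisions. The 768 GB figure comes from the announcement; the precision choices and the overhead reserved for KV cache and activations are illustrative assumptions, not Qualcomm specifications.

```python
# Rough sketch: how many model parameters fit in a given memory budget.
# The 768 GB capacity is from Qualcomm's announcement; the bytes-per-parameter
# values and the overhead factor are illustrative assumptions.

BYTES_PER_PARAM = {"fp16": 2.0, "fp8": 1.0, "int4": 0.5}

def max_params_billions(capacity_gb: float, precision: str,
                        overhead: float = 0.2) -> float:
    """Approximate max parameter count (in billions) that fits in memory,
    reserving a fraction of capacity for KV cache and activations."""
    usable_bytes = capacity_gb * 1e9 * (1.0 - overhead)
    return usable_bytes / BYTES_PER_PARAM[precision] / 1e9

for prec in BYTES_PER_PARAM:
    print(f"{prec}: ~{max_params_billions(768, prec):.0f}B parameters in 768 GB")
```

At 8-bit precision, for example, this back-of-the-envelope estimate leaves room for a model in the several-hundred-billion-parameter range on a single card, which is the kind of headroom the density argument is about.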
Tradeoffs: Not for All AI Workloads
The LPDDR approach is compelling in theory, but it carries inherent tradeoffs. Forgoing HBM means lower memory bandwidth and potentially higher latency due to the narrower interface. Furthermore, LPDDR is less proven under the demanding, always-on, high-heat conditions of a server environment.
Qualcomm does not intend to compete on raw performance in AI training. The AI200 and AI250 are instead aimed squarely at AI inference, a market segment where efficiency, cost, and memory capacity matter more than raw bandwidth. HBM-based solutions from competitors will remain the choice for large-scale training workloads.
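The bandwidth tradeoff can be made concrete with a standard back-of-the-envelope estimate: in memory-bound autoregressive decoding, each generated token requires streaming roughly the full model weights once, so the throughput ceiling is about bandwidth divided by model size. The bandwidth figures below are hypothetical placeholders for comparison, not published specs for the AI200/AI250 or any HBM product.

```python
# Back-of-the-envelope roofline for memory-bound decoding:
# tokens/sec ceiling ~= memory bandwidth / bytes read per token, where bytes
# per token is roughly the model size for batch-1 autoregressive decoding.
# Bandwidth values below are hypothetical placeholders, NOT product specs.

def decode_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on batch-1 decode rate when weight streaming dominates."""
    return bandwidth_gb_s / model_size_gb

# A 70 GB model (e.g., ~70B parameters at 8-bit) under two assumed bandwidths:
for label, bw in [("LPDDR-class (assumed)", 400.0),
                  ("HBM-class (assumed)", 3000.0)]:
    print(f"{label}: ~{decode_tokens_per_sec(bw, 70):.1f} tokens/s ceiling")
```

The gap narrows in practice once batching, large-capacity caching, and cost per token enter the picture, which is exactly the regime Qualcomm is targeting.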
Technical Details and Market Context
The AI200 and AI250 solutions feature direct liquid cooling and are built around Qualcomm's Hexagon NPUs, which are optimized for inference and support advanced data formats. Power draw for the full rack-level solution is comparatively low, at about 160 kW.
In entering this space, Qualcomm joins an emerging market trend in which other major players, including Intel and NVIDIA, are also developing specialized solutions for the growing AI inference segment. By targeting a specific application with tailored hardware, Qualcomm is distinguishing itself in an increasingly crowded AI ecosystem.
Source: Qualcomm
