AMD AI PC Driver Update Enables 128B LLM Support on Strix Halo and Variable Graphics Memory

AMD's latest Adrenalin 25.8.1 driver brings powerful on-device AI to PCs, enabling support for 128-billion-parameter LLMs on Strix Halo through the new Variable Graphics Memory feature.
Published by mgtid

AMD Brings Large Language Model Support to the PC

AMD has taken a major step for AI on personal computers with its newest driver update. The Adrenalin Edition 25.8.1 driver now allows large language models (LLMs) with up to 128 billion parameters to run locally on users' machines.

Key Advances with Strix Halo and the New Driver

This capability builds on AMD's Strix Halo platform and its XDNA architecture to push on-device AI performance to new heights. Here are the main gains:

  • Large LLM Support: The driver lets PCs run models with up to 128B parameters, including demanding ones such as Meta's Llama 4 Scout. Despite their size, such models remain practical; Llama 4 Scout, for example, is a mixture-of-experts design that activates only 17B parameters at a time.
  • More Graphics Memory: AMD's new Variable Graphics Memory (VGM) feature lets users assign up to 96 GB of system memory to the integrated GPU, making room for these large models.
  • Much Longer Context: AMD has raised the supported model context from the common 4,096 tokens to 256,000 tokens, enabling far longer and more complex AI workflows to run entirely on the PC.
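To see why the 96 GB VGM ceiling matters for a 128B-class model, a back-of-the-envelope calculation helps. The sketch below uses illustrative assumptions (4-bit quantization, decimal gigabytes, weights only, ignoring KV cache and runtime overhead), not figures published by AMD:

```python
def model_weight_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight footprint in GB for a given quantization level."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 128B-parameter model quantized to 4 bits per weight:
q4 = model_weight_gb(128, 4)
print(f"128B @ 4-bit: ~{q4:.0f} GB of weights")    # ~64 GB -> fits in 96 GB

# The same model at full FP16 precision:
fp16 = model_weight_gb(128, 16)
print(f"128B @ FP16:  ~{fp16:.0f} GB of weights")  # ~256 GB -> does not fit

assert q4 < 96 < fp16
```

The takeaway: at 4-bit quantization the weights alone need roughly 64 GB, leaving headroom inside the 96 GB VGM allocation for the KV cache that a long context window requires, whereas unquantized FP16 weights would far exceed it.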

Availability and Pricing

These advanced AI features are currently limited to devices built on Strix Halo. For now, however, such devices are hard to come by: they remain scarce and can cost upwards of $2,000.

Looking to the Future of Consumer AI

AMD's progress in AI shows real promise for bringing top-tier AI capabilities closer to everyday users. Even though the required hardware is still expensive, these gains mark a clear shift of powerful AI from the cloud to our own devices.
