Microsoft's CEO Says The AI Bottleneck Has Moved From GPUs To Electricity
Satya Nadella has announced that the key constraint on training and deploying artificial intelligence is no longer a shortage of graphics cards but a scarcer underlying resource: electricity.
From Component Scarcity to Power Scarcity
In a podcast with OpenAI CEO Sam Altman, Nadella said that now that Microsoft has secured a sufficient supply of graphics processing units, the company's challenge is a lack of energy to power them. The bottleneck has shifted from processors to data centers connected to sufficiently robust electrical grids.
He illustrated the point by saying it is like having a warehouse full of graphics processors but nowhere to plug them in.
Why The Energy Bottleneck Emerges At The Data Center Level
Several factors have created the new energy bottleneck. For starters, a single data center can consume as much electricity as a small town, straining local power systems. Beyond that, the long list of permits required to build such a facility and connect it to the grid is another significant logistical barrier.
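The "as much electricity as a small town" comparison can be made concrete with a rough back-of-envelope calculation. The figures below are illustrative assumptions, not numbers from the article: a large AI data center is often cited at around 100 MW of continuous draw, and an average household at roughly 1.2 kW.

```python
# Back-of-envelope sketch: how many households' worth of power does one
# large AI data center draw? All figures are assumed, not from the article.

DATA_CENTER_MW = 100   # assumed continuous draw of one hyperscale AI data center
HOUSEHOLD_KW = 1.2     # assumed average continuous draw of one household
                       # (~10,500 kWh per year)

# Convert megawatts to kilowatts, then divide by per-household draw.
homes_equivalent = (DATA_CENTER_MW * 1000) / HOUSEHOLD_KW
print(f"~{homes_equivalent:,.0f} homes")  # on the order of tens of thousands
```

Under these assumptions one facility matches the draw of roughly 80,000 homes, which is why a single site can dominate a local grid's capacity planning.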
New Corporate Priority on Energy Strategy
This shift means companies must focus on long-term power procurement rather than competing for graphics cards. They have been pursuing approaches to secure power supply for future AI development, such as:
- Long-term power purchase contracts
- Investments in next-generation technologies such as modular nuclear reactors
- On-site power generation
