Sustainability, grid demands, AI workloads will challenge data center growth in 2025
Cloud training for AI models
Uptime believes that most AI models will be trained in the cloud rather than on dedicated enterprise infrastructure, because cloud services offer a more cost-effective way to fine-tune foundation models for specific use cases. The incremental training that fine-tuning requires can be done on cloud platforms without a large, expensive on-premises cluster: enterprises can use on-demand resources to customize a foundation model as needed, avoiding both the capital and the operational costs of dedicated hardware.
“Because fine-tuning requires only a relatively small amount of training, for many it just wouldn’t make sense to buy a huge, expensive dedicated AI cluster for this purpose. The foundation model, which has already been trained by someone else, has taken the burden of most of the training away from us,” said Dr. Owen Rogers, research director for cloud computing at Uptime. “Instead, we could just use on-demand cloud services to tweak the foundation model for our needs, only paying for the resources we need for as long as we need them.”
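To make Rogers' point concrete, the sketch below shows what that kind of incremental fine-tuning can look like on a rented cloud GPU: low-rank (LoRA) adapters are trained on top of an already-trained open foundation model, so only a small fraction of the weights are updated. The model name, dataset file, and hyperparameters are illustrative assumptions, not recommendations from Uptime.

```python
# Minimal sketch: parameter-efficient fine-tuning (LoRA) of a small open
# foundation model on an on-demand cloud GPU. Model, dataset, and
# hyperparameters are illustrative assumptions only.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "EleutherAI/pythia-410m"   # pre-trained foundation model (assumed)

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Attach low-rank adapters: only a small fraction of the weights are trained,
# which is why a short on-demand GPU rental is usually enough for this step.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

# A small domain-specific corpus stands in for the enterprise's own data.
dataset = load_dataset("text", data_files={"train": "company_docs.txt"})["train"]
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned-adapter",
                           per_device_train_batch_size=4,
                           num_train_epochs=1,
                           learning_rate=2e-4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("finetuned-adapter")  # saves only the small adapter weights
```

Only the adapter weights, a small fraction of the base model's size, are saved and deployed, which is why a short burst of on-demand GPU time typically covers the whole job.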
Data center collaboration with utilities
Uptime expects that developers of new and expanded data centers will be asked to provide or store power to support grids. That means data centers will need to collaborate actively with utilities to manage grid demand and stability, potentially shedding load or drawing on local power sources during peak periods. Uptime forecasts that data center operators “running non-latency-sensitive workloads, such as specific AI training tasks, could be financially incentivized or mandated to reduce power use when required.”
“The context for all of this is that the [power] grid, even if there were no data centers, would have a problem meeting demand over time. They’re having to invest at a rate that is historically off the charts. It’s not just data centers. It’s electric vehicles. It’s air conditioning. It’s [de]carbonization. But obviously, they are also retiring coal plants and replacing them with renewable plants,” Uptime’s Lawrence explained. “These are much less stable, more intermittent. So, the grid has particular challenges.”
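The sketch below illustrates one hedged reading of that forecast: a non-latency-sensitive training job polls a grid price signal and sheds load whenever the price crosses a threshold. The price curve, threshold, and polling interval are assumptions standing in for a real utility or ISO feed.

```python
# Sketch of demand-response scheduling for a non-latency-sensitive AI training
# job of the kind Uptime describes: work proceeds only while a grid price
# signal sits below a ceiling. The price feed here is a simulated stand-in for
# a real utility or ISO signal, and the threshold is an assumed figure.
import math
import time

PRICE_CEILING = 120.0    # $/MWh; assumed curtailment trigger
CHECK_INTERVAL_S = 1     # shortened for the demo; minutes in practice

def get_grid_price(hour: float) -> float:
    """Simulated diurnal wholesale price; replace with a real utility feed."""
    return 90.0 + 60.0 * math.sin((hour - 6) / 24 * 2 * math.pi)

def run_training_step(step: int) -> None:
    """Stand-in for one checkpointable unit of training work."""
    print(f"training step {step}")

def train_with_curtailment(total_steps: int = 24) -> None:
    step, hour = 0, 0.0
    while step < total_steps:
        if get_grid_price(hour) > PRICE_CEILING:
            # Grid is stressed: the job is checkpointed after every step,
            # so it can simply idle (or release its nodes) until prices fall.
            print(f"hour {hour:04.1f}: grid stressed, pausing (load shed)")
        else:
            run_training_step(step)
            step += 1
        hour = (hour + 1) % 24
        time.sleep(CHECK_INTERVAL_S)

if __name__ == "__main__":
    train_with_curtailment()
```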
Radical data center electrification
Data centers in 2025 will need to undergo radical electrification, moving toward medium-voltage systems to handle the increasing power demands of AI workloads. According to Uptime, the infrastructure requirements of next-generation AI will force operators to explore new power architectures, driving innovation in data center power delivery. Handling much higher power densities will upend how facilities design and lay out their electrical infrastructure: AI systems are already reaching power levels of 100-120 kW per rack, far exceeding typical data center densities, Uptime reports.
“We think that this is the time when the industry will have another hard look and invest more money in overall electrification,” said Daniel Bizo, research director at Uptime Institute. “We are looking at the possibility that a growing number of facilities will be expected to handle draws that were only around in supercomputing before.”
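A back-of-the-envelope calculation shows why densities of 100-120 kW per rack push operators toward medium-voltage distribution; the rack count, voltage levels, and unity power factor below are assumed for illustration and are not Uptime figures.

```python
# Back-of-the-envelope: current needed to feed a row of AI racks at low vs.
# medium voltage. Rack count, voltages, and power factor are assumed values.
import math

RACK_KW = 120          # per-rack draw at the high end Uptime cites
RACKS_PER_ROW = 10     # assumed row size
POWER_FACTOR = 1.0     # simplification

def three_phase_current_amps(kw_load: float, line_voltage: float) -> float:
    """I = P / (sqrt(3) * V_LL * pf) for a balanced three-phase load."""
    return kw_load * 1000 / (math.sqrt(3) * line_voltage * POWER_FACTOR)

row_kw = RACK_KW * RACKS_PER_ROW    # 1.2 MW per row
for volts in (415, 11_000):         # typical LV distribution vs. one common MV level
    amps = three_phase_current_amps(row_kw, volts)
    print(f"{row_kw / 1000:.1f} MW row at {volts:>6} V -> {amps:,.0f} A per feed")

# At 415 V the row needs roughly 1,700 A of copper; at 11 kV it needs about
# 63 A, which is why pushing medium voltage closer to the rack becomes attractive.
```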