A Gartner report revealed that the increasing use of artificial intelligence (AI) applications and workloads could strain the power supply of data centers. The research firm predicts a 160% increase in data center energy consumption over the next two years, with 40% of existing AI-focused data centers potentially constrained by power shortages by 2027.
Ciena, a networking solutions provider that powers data centers, shared that, based on its survey, an average of 41% of new data center facilities in the Philippines are expected to be dedicated to AI workloads. Globally, data center experts predict a sixfold increase in data center interconnect (DCI) bandwidth demand over the next five years.
“This growing demand is reshaping both the design and operations of data centers,” said Madhusudan Pandya, senior advisor, International Market Development at Ciena, in an email interview with Back End News. “A recent survey from Ciena found that 63% of data center professionals in the Philippines believe AI, along with cloud and hybrid workloads, will place the biggest strain on data center interconnects over the next few years — the highest proportion among surveyed countries.”
Factors driving the surge in energy consumption
“Megatrends like the rise of AI, quantum and cloud computing, blockchain technologies, remote work, and always-on connectivity have led to a surge in data storage and processing requirements, driving demand for data centers and boosting energy consumption,” said Jason Plamondon, senior sustainability manager, Asia-Pacific, Equinix, a data center operator.
According to PLDT’s Vitro Data Center, before the AI boom, the primary contributors to high energy consumption in data centers were IT equipment; cooling systems; electrical distribution systems, such as transformers and uninterruptible power supplies (UPS); and auxiliary loads such as lighting.
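The balance between IT load and these overheads is commonly expressed as Power Usage Effectiveness (PUE): total facility power divided by IT equipment power, where 1.0 would mean every watt goes to computing. A minimal sketch of the calculation, using hypothetical figures rather than numbers from any facility mentioned here:

```python
# Power Usage Effectiveness (PUE) = total facility power / IT equipment power.
# A lower value is better; 1.0 is the theoretical ideal.
# All figures below are hypothetical and for illustration only.

def pue(it_kw: float, cooling_kw: float, distribution_kw: float, auxiliary_kw: float) -> float:
    """Return total facility power divided by IT power."""
    total_kw = it_kw + cooling_kw + distribution_kw + auxiliary_kw
    return total_kw / it_kw

# Example: 1,000 kW of IT load plus cooling, distribution, and lighting overheads.
print(round(pue(it_kw=1000, cooling_kw=380, distribution_kw=80, auxiliary_kw=40), 2))
```

In this hypothetical case the facility draws 1,500 kW in total for 1,000 kW of computing, a PUE of 1.5, which shows why cooling and distribution losses are the usual targets for efficiency work.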
Because AI is expected to perform tasks near-instantaneously, with automation being one of its most common use cases, it often requires moving massive volumes of information with minimal delay. As a result, data centers now process information at a much faster rate than before.
“Complex algorithms and processing large datasets consume considerable energy, thereby increasing overall data center power usage,” explained Grant Gong, head of Solution Architect for the Philippines at Alibaba Cloud Intelligence. “High-performance computing resources, often used for AI tasks, also contribute to higher energy demands.”
Previously, data centers were largely used for simpler activities like cloud computing and data storage. Now, with AI workloads, data must constantly move back and forth between powerful machines during training, usage, and updates.
According to Vitro, AI workloads require high-performance GPUs and TPUs, significantly increasing power consumption. The company also noted that while AI inference is generally more efficient than training, it still demands constant processing at scale. Without breakthroughs in energy efficiency, the rapid expansion of AI will continue driving higher energy use.
Managing energy consumption amid AI growth
With the projected rise in energy use — and a possible power shortage by 2027, according to Gartner — hyperscale data centers may be key to reversing this trend, thanks to their use of smart technologies.
“Their size allows them to leverage smart technology, advanced cooling, and renewable energy more effectively than smaller facilities,” Vitro said. “This scalability helps optimize power use, making hyperscale data centers leaders in sustainable operations.”
Plamondon emphasized that hyperscale data centers’ large footprints enable them to adopt advanced cooling methods, such as direct-to-chip liquid cooling, especially to support high-density workloads like AI training.
“Hyperscale data centers need a reliable local grid to provide the energy to keep all those servers running,” he said. “Companies that use hyperscale data centers may need hundreds of megawatts of power in a single deployment, particularly as AI has changed the landscape.”
Using AI to optimize energy efficiency
While AI is driving up energy consumption, it also offers potential solutions. “AI enhances efficiency by reducing idle power, optimizing cooling, and integrating renewables,” Vitro noted.
Alibaba Cloud, which operates a data center in the Philippines, pointed out that AI systems can analyze operational patterns and predict energy demands, enabling real-time adjustments to cooling and workload distribution.
“Machine learning algorithms can monitor and manage workloads in real time, dynamically adjusting resources based on demand and improving overall performance,” Gong said. “AI-driven automation can streamline maintenance and troubleshooting processes, minimizing downtime and maximizing uptime.”
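The closed loop Gong describes, in which predicted demand drives real-time adjustments, can be illustrated with a toy controller that forecasts the next interval's IT load from a moving average of recent readings and sizes the cooling setpoint to that forecast rather than to peak load. All names, figures, and the headroom margin here are hypothetical:

```python
from collections import deque

# Toy illustration of predictive energy optimization: forecast the next
# interval's IT load from a moving average of recent samples, then
# provision cooling for the forecast (plus a safety margin) instead of
# for worst-case peak load. All names and numbers are hypothetical.

class PredictiveCoolingController:
    def __init__(self, window: int = 4, headroom: float = 1.10):
        self.readings = deque(maxlen=window)  # recent IT load samples (kW)
        self.headroom = headroom              # safety margin above forecast

    def record(self, it_load_kw: float) -> None:
        """Log a new IT load reading, discarding the oldest when full."""
        self.readings.append(it_load_kw)

    def forecast_kw(self) -> float:
        """Predict the next interval's load as the mean of recent samples."""
        return sum(self.readings) / len(self.readings)

    def cooling_setpoint_kw(self) -> float:
        """Size cooling capacity to the forecast plus headroom."""
        return self.forecast_kw() * self.headroom

controller = PredictiveCoolingController()
for load in [800, 820, 790, 810]:  # hypothetical load samples in kW
    controller.record(load)
print(round(controller.cooling_setpoint_kw(), 1))
```

Production systems use far richer models (seasonality, weather, scheduled workloads), but the design choice is the same: tracking demand dynamically lets the facility avoid running cooling at peak capacity around the clock.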
Plamondon added that upgrading to newer, more efficient hardware is a critical part of managing energy use. Modern GPUs, for example, can deliver more computing power in less space and consume less energy compared to traditional CPUs.
“Upgrading to the latest technology can significantly reduce power consumption for the same workload,” he said. “It allows companies to handle more computing tasks while still using less electricity. This not only cuts costs but also helps companies meet their environmental goals.”
The role of efficient cooling systems
Cooling remains a major contributor to data center energy consumption, and innovations in this area are essential for future efficiency.
Plamondon explained that techniques like liquid cooling can “reduce the power needed for server fans,” which frees up more energy for computing tasks. Liquid cooling also improves the use of physical space inside data centers.
Simply put, updating servers and adopting advanced cooling technologies not only helps data centers run faster and smarter but also significantly cuts down the energy needed to keep operations stable.
Vitro’s Sta. Rosa facility, a 50-megawatt site designed to handle significant AI workloads, has already implemented liquid cooling. This method absorbs heat directly from servers, reducing the need for traditional air-cooling systems and helping to lower overall energy consumption sustainably.
Aside from cooling innovations, data centers are increasingly turning to renewable energy sources. Not only is this a smart business decision to manage operational costs, but it also supports environmental, social, and governance (ESG) initiatives and sustainability goals.
Future-proofing data centers for AI demands
“If the potential of AI is to be realized, over the next few years there will be a reshaping of the data center landscape as we know it today,” Pandya said. “How new data centers are interconnected, and the technology they use to do this effectively, and sustainably, must be addressed. Yet, with no ‘one-size-fits-all’ network architecture or template to borrow from, how service providers and data center operators address these challenges will vary.”
Pandya stressed the importance of optimizing existing network architectures to meet the upcoming demands rapidly, sustainably, and cost-effectively. He also highlighted the need for local operators to prioritize the expansion of undersea cable systems to strengthen regional connectivity and ensure seamless cross-border data flows.
The Gartner report serves as a warning that data center operators cannot afford to wait for power shortages to materialize. Although AI is contributing to the problem, it can also be part of the solution. Adopting advanced cooling technologies is one step in the right direction, but tapping into renewable energy sources, where available, could help mitigate future risks.
The prospect of data centers going offline, when much of modern life depends on them, underscores the urgency of taking immediate and decisive action.