Vivek Swaminathan, Director of Products and Solutions, Intelligent Data Center at Unisys, on staying cool in an AI-driven future.
AI advancements are transforming the operational demands of data centers. While these advances bring many benefits – including making the technology more accessible and affordable – they also create new challenges that must be overcome.
Chief among these challenges is that as the demand for AI escalates, traditional cooling methods can no longer address the heightened thermal output of data centers. As a result, data center operators are turning to liquid cooling to keep up with next-gen AI models and the growing demand for AI technology overall. However, making this necessary change is not easy.
Next-generation AI models are pushing the boundaries of computational power. Advanced models require extensive processing capacity, leading to significant heat generation. According to the International Energy Agency (IEA), data centers accounted for approximately 1-1.3% of global electricity consumption in 2023.
As AI adoption grows, this number is expected to surge, with energy use in data centers anticipated to increase by 50% by 2027 and by a staggering 165% by 2030. The resulting strain on current infrastructure presents a serious challenge for data center operators.
AI model training and inferencing are two resource-intensive phases in machine learning that exacerbate data center heating issues. Training AI models requires extensive parallel processing and sustained GPU usage, generating substantial heat. For example, high-end GPUs like NVIDIA’s H100 consume up to 700W per chip. Training a single AI model can take weeks, with continuous GPU operation generating extreme amounts of heat.
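To give a sense of scale, the per-chip figure above can be turned into a back-of-envelope heat estimate. The 700W draw comes from the article; the cluster size and training duration below are purely illustrative assumptions, not figures for any specific deployment:

```python
# Back-of-envelope heat estimate for an AI training cluster.
# 700 W per GPU is the H100 figure cited in the article; the cluster
# size and run length are hypothetical, for illustration only.

GPU_POWER_W = 700      # peak draw per H100-class GPU (from the article)
NUM_GPUS = 1_000       # hypothetical cluster size
TRAINING_DAYS = 14     # hypothetical two-week training run

# Essentially all electrical power drawn by a GPU is dissipated as heat.
heat_load_kw = GPU_POWER_W * NUM_GPUS / 1_000
energy_mwh = heat_load_kw * 24 * TRAINING_DAYS / 1_000

print(f"Continuous heat load: {heat_load_kw:,.0f} kW")
print(f"Energy over the run:  {energy_mwh:,.0f} MWh")
```

Under these assumptions the cooling system must continuously remove roughly 700 kW of heat for the full two weeks – around 235 MWh of thermal energy over the run.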
Meanwhile, AI inferencing – applying trained models to real-world data for tasks like autonomous driving and medical imaging – also relies on GPUs and generates significant thermal output. These high energy demands can push data centers beyond recommended temperature ranges, leading to potential hardware damage, unscheduled downtime and costly repairs.
Traditional air-cooling systems are increasingly proving inadequate to meet the rising thermal demands. Data centers relying on air cooling often devote nearly 40% of their energy to maintaining optimal temperatures. In contrast, liquid cooling technology has emerged as a more efficient and sustainable alternative. Liquid cooling improves Total Usage Effectiveness (TUE) by 15% and reduces overall energy usage by 10%, making it an increasingly attractive choice for data centers worldwide. This technology efficiently dissipates heat while ensuring the performance of high-density computing systems, which is crucial for modern AI operations.
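The percentages above can be combined into a rough annual comparison. The 40% cooling share and 10% overall reduction are the article's figures; the facility size below is a hypothetical assumption:

```python
# Rough annual energy comparison: air-cooled vs. liquid-cooled facility,
# using the percentages quoted in the article. The 2 MW IT load is a
# hypothetical facility size chosen for illustration.

IT_LOAD_MW = 2.0          # hypothetical IT load
HOURS_PER_YEAR = 8_760

# Air-cooled: ~40% of total facility energy goes to cooling,
# so the IT load represents the remaining 60% of the total.
air_total_mwh = IT_LOAD_MW / 0.60 * HOURS_PER_YEAR

# Liquid-cooled: the article cites ~10% lower overall energy use.
liquid_total_mwh = air_total_mwh * 0.90
savings_mwh = air_total_mwh - liquid_total_mwh

print(f"Air-cooled total:    {air_total_mwh:,.0f} MWh/yr")
print(f"Liquid-cooled total: {liquid_total_mwh:,.0f} MWh/yr")
print(f"Annual savings:      {savings_mwh:,.0f} MWh/yr")
```

Even for this modest hypothetical facility, a 10% overall reduction works out to nearly 3,000 MWh saved per year.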
Liquid cooling is also more space-efficient, as it requires less physical real estate than bulky air-cooling systems, which often demand multiple fans for effective heat dissipation. Beyond space efficiency, liquid cooling systems run far quieter, without the constant noise of the fan banks that traditional systems depend on.
However, the success of liquid cooling depends on proper maintenance. Regularly monitoring coolant levels and temperature is essential, and the system requires more upkeep than air cooling. Proper installation and maintenance are crucial to avoid risks like leaks, which could damage hardware or pose safety hazards. For every dollar spent on liquid cooling infrastructure, operators should anticipate an annual upkeep cost of $0.30 to $0.50.
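That upkeep ratio translates directly into a budgeting rule of thumb. The $0.30–$0.50 range is the article's figure; the capital cost below is a hypothetical deployment size used only to illustrate the arithmetic:

```python
# Annual upkeep estimate for a liquid-cooling deployment, applying the
# $0.30-$0.50 per capex dollar ratio quoted in the article.
# The $5M capital cost is a hypothetical deployment size.

CAPEX_USD = 5_000_000              # hypothetical liquid-cooling capex
UPKEEP_LOW, UPKEEP_HIGH = 0.30, 0.50

upkeep_low_usd = CAPEX_USD * UPKEEP_LOW
upkeep_high_usd = CAPEX_USD * UPKEEP_HIGH

print(f"Expected annual upkeep: ${upkeep_low_usd:,.0f} to ${upkeep_high_usd:,.0f}")
```

In other words, a hypothetical $5M liquid-cooling build would carry an expected $1.5M–$2.5M in annual maintenance – a recurring cost that operators must plan for from the outset.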
As AI models continue to evolve and expand, the demand for specialized cooling solutions will only intensify. By 2026, the number of liquid-cooled AI data centers is expected to grow 300% year-over-year as companies begin deploying trillion-parameter models. ABI Research forecasts that liquid cooling installations will quadruple between 2023 and 2030, reaching a market value of $3.7 billion by the end of the decade.
Integrating AI into diverse industries is sparking transformative changes in data center operations, with liquid cooling technology emerging as a critical solution to the thermal challenges AI models pose. With its superior efficiency and sustainability, liquid cooling is increasingly becoming indispensable in data centers. As AI continues to redefine the limits of computational infrastructure, the ability to manage heat effectively is no longer optional – it is a fundamental requirement for success in the AI-powered future.