Data ubiquity is impacting data centre density

David Craig, CEO, Iceotope, discusses the impact of the data explosion and how it is transforming the data centre model as the need to remain relevant to customers and investors grows ever more pressing.

In 2020, every person on Earth was creating, on average, 1.7MB of data per second. We sent 500 million Tweets per day. And by the end of 2021, we will have conducted 2 trillion Google searches. I think it’s safe to say that data ubiquity is upon us, which is exciting.

How we access and interact with data is constantly changing and is going to have a real impact on where it is processed. To quote Henrique Cecci of Gartner, ‘the data centre is no longer the centre of our data’. The sheer volume we are creating will need to be processed and refined away from the centre itself and closer to the end-user.

That is because data on that scale is not easily moved. The effect is known as data gravity: as datasets grow larger, they become harder to move, and smaller applications and other bodies of data gravitate around the larger data masses. As a result, the data ends up staying put and the applications and processing power come to where the data resides. Much of this is being driven by Artificial Intelligence (AI) applications. The closer you are to the customer, the more likely you are to be using AI for better speed of access, operation and performance.

Let’s look at retail as an example. Data gathering is going to be all about the customer journey and how to enhance their experience in-store. Accenture found that 83% of consumers are willing to share their data to enable a personalised experience. This can be a game-changer for everything from tailored advertising and product recommendations to more complex applications like real-time multi-lingual ordering and automation in a fast food restaurant.

The energy industry, like many others, is undergoing a Digital Transformation as well. PwC found that use of digital technologies could result in cumulative savings in capital and operating expenditures of US$100 billion to US$1 trillion by 2025. To fully realise those savings, data processing is going to have to take place on-site. Oil rigs generate about a terabyte of data per day, yet that much data can take as long as 12 days to upload by satellite. There won’t be any cost savings to be had if one day’s worth of data takes 12 days to transmit. Fortunately, today’s CPU- and GPU-based high performance computing (HPC) systems offer the processing capability to crunch that data load on-site, even in hostile environments.
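A rough back-of-the-envelope check of those figures, assuming an effective satellite throughput of around 8 Mbit/s (an illustrative assumption, not a figure from the article):

```python
# Illustrative check of the figures above. The ~8 Mbit/s effective satellite
# throughput is an assumption, chosen to show how a 1 TB daily data load can
# take on the order of 12 days to move off-site.

def upload_days(data_terabytes: float, link_mbps: float) -> float:
    """Days needed to transfer a payload over a sustained link.

    data_terabytes: payload size in terabytes (decimal, 1 TB = 1e12 bytes)
    link_mbps: effective throughput in megabits per second
    """
    bits = data_terabytes * 1e12 * 8        # payload in bits
    seconds = bits / (link_mbps * 1e6)      # transfer time in seconds
    return seconds / 86_400                 # convert seconds to days

if __name__ == "__main__":
    # One day's worth of rig data over a constrained satellite backhaul
    print(f"{upload_days(1.0, 8.0):.1f} days")  # ~11.6 days
```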

With these changing parameters – more data, AI applications and a move to the Edge – are data centre operators prepared? Be it an enterprise operator, a colocation provider or a hyperscaler, the industry as a whole is beginning to evaluate the impact of these trends. Rack and chip densities are increasing and require a new way of thinking.

Historically, enterprise racks were configured around relatively low heat loads in the 3–6 kW range, and some significant operators today run in the 10–15 kW per rack range – fine for most standard business applications. Now, as AI applications drive GPU processing, data centre operators are coming under pressure to deliver 30–60 kW racks, something most data centres are not designed for.

The challenge is that nearly every component of the data centre environment is designed to be air cooled. That is true not only of the highly mechanised, often refrigerant-based cooling systems, which can consume up to 40% of a data centre’s energy load; it also means technology halls and telecom rooms must be organised to drive large volumes of cooling air through the racks.
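To make the air-cooling constraint concrete, here is a rough sensible-heat sketch (illustrative assumptions, not measured data) of the airflow needed at the rack densities mentioned above:

```python
# Airflow required to remove rack heat, using standard sensible-heat figures
# for air (density ~1.2 kg/m^3, specific heat ~1005 J/(kg*K)) and an assumed
# 10 K inlet-to-outlet temperature rise. Figures are illustrative only.

AIR_DENSITY = 1.2           # kg/m^3
AIR_SPECIFIC_HEAT = 1005.0  # J/(kg*K)

def airflow_m3s(rack_kw: float, delta_t_k: float = 10.0) -> float:
    """Volumetric airflow (m^3/s) needed to carry away rack_kw of heat."""
    return (rack_kw * 1000) / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_k)

if __name__ == "__main__":
    for load in (6, 15, 60):  # kW per rack, matching the ranges above
        flow = airflow_m3s(load)
        print(f"{load:>3} kW rack -> {flow:.2f} m^3/s (~{flow * 2119:.0f} CFM)")
```

At 60 kW per rack the required airflow is roughly ten times what a conventional 6 kW rack needs, which is why simply adding fans and chillers stops being practical.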

As data generation increases and processing requirements gravitate towards the users, do we want larger and larger edge-of-town data centres connected to street-side Edge containers, both using air-cooled technology? Our industry should be forward-looking, yet even today we remain locked into a fan-based server/rack environment that generates warm exhaust air and pushes it out into the atmosphere.

July’s record temperatures in the UK – where it reached 40.3°C in Coningsby, Lincolnshire – affected even data centres with the most energy-efficient ambient-cooling systems, demonstrating how challenging extreme weather can be. Maintaining server temperatures within SLA or equipment warranty limits requires more energy at these times, which adds cost and negatively affects the data centre’s PUE (Power Usage Effectiveness). A knock-on effect of the increased cooling requirement is higher water use, which could become a serious problem in areas where water restrictions come into force.
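For readers less familiar with the metric, here is a minimal sketch of PUE, with hypothetical figures chosen only to show how extra cooling energy on a hot day pushes the ratio up:

```python
# Minimal sketch of PUE (Power Usage Effectiveness). The figures are
# hypothetical, chosen only to show how extra cooling energy on a hot day
# pushes the ratio upwards for the same IT workload.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (ideal value: 1.0)."""
    return total_facility_kwh / it_equipment_kwh

if __name__ == "__main__":
    it_load = 1000.0                         # kWh drawn by IT equipment
    print(pue(it_load + 300, it_load))       # mild day:     1.3
    print(pue(it_load + 500, it_load))       # heatwave day: 1.5
```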

These traditional methods are being pushed to their limits, driving the trend towards solutions such as precision immersion liquid cooling. These solutions offer greater efficiency, lower operating costs, higher resilience, increased reliability and near-silent operation, despite the high power density of the GPUs being cooled.

Whether in large data centres or roadside containers, the need to provide the maximum processing density for the floor space reduces the scope to use air as the main equipment cooling solution. Liquid cooling removes heat from the processor and across the motherboard and components. It also all but eliminates server fan use, increasing processing density and the power available per server.
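A rough comparison under the same assumptions as the earlier airflow sketch shows why: per unit volume, water carries heat roughly 3,500 times more effectively than air (actual dielectric immersion coolants differ, but remain far denser heat carriers than air):

```python
# Same sensible-heat calculation as the earlier sketch, comparing air with a
# liquid coolant for a 60 kW rack at a 10 K temperature rise. Water properties
# are used as a stand-in; real dielectric immersion fluids differ, but all
# carry heat far more densely than air.

def flow_m3s(load_kw: float, density: float, specific_heat: float,
             delta_t_k: float = 10.0) -> float:
    """Volumetric coolant flow (m^3/s) needed to remove load_kw of heat."""
    return (load_kw * 1000) / (density * specific_heat * delta_t_k)

if __name__ == "__main__":
    air = flow_m3s(60, density=1.2, specific_heat=1005)    # ~5 m^3/s of air
    water = flow_m3s(60, density=997, specific_heat=4186)  # ~1.4 L/s of water
    print(f"air:    {air:.2f} m^3/s")
    print(f"liquid: {water * 1000:.2f} L/s")
```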

Data gravity suggests that local Edge data centres – whether standalone or in offices, retail sites and the like – will continue to grow in number as latency becomes an increasingly important performance issue, especially with technologies such as intelligent vehicles becoming part of our lives. However, this does not appear to be reducing the requirement for larger data centres either. Moving forward, there needs to be more joined-up planning and resource management to ensure our industry provides a platform for delivering higher performance with greater sustainability.

It’s human nature to embrace change slowly. However, consumers are going to demand change faster. If Company A uses AI to understand its target audience better and deliver products that are more personalised and innovative than Company B’s, the answer becomes simple: consumers will vote with their feet and go to Company A. Data centres need to be prepared to support that change.

At a time when there are many competing priorities with real world consequences, a holistic approach to addressing these issues is needed. Leadership that is willing to be bold and embrace new technologies and approaches to solving problems will be rewarded. There are financial, space and emissions benefits to be had that will prepare data centres for the challenges ahead.
