As data centre leaders discover innovative ways to operate with sustainability in mind, Wendy Torell, Senior Research Analyst, Schneider Electric Data Centre Science Centre, offers some best practice advice on how to choose a more sustainable approach to Edge Computing.
Despite the image of data centres as large, power-hungry facilities (i.e. cloud and colocation spaces), the reality is that much of the anticipated growth in energy consumption will occur at the Edge.
While they are far less obtrusive than their hyperscale counterparts, Edge data centres host mission-critical applications and, as such, must be designed, built and operated to similar, if not the same, standards of resilience, efficiency and sustainability as their cloud counterparts.
According to Gartner, by 2025, 75% of enterprise data is expected to be created and processed at the Edge. IDC also predicts massive growth, with the worldwide Edge Computing market expected to reach a value of US$250.6 billion by 2024, a compound annual growth rate (CAGR) of 12.5% between 2019 and 2024.
There are several factors driving the proliferation of data and its consumption at the Edge. Among them is the demand for low-latency applications, including digital streaming from film, TV and music platforms. The rise in IoT-connected devices, Artificial Intelligence (AI) and Machine Learning is causing a surge in Digital Transformation across almost every industry. Many organisations are designing new experiences, reimagining business processes and creating both new products and digital services that rely on innovative and resilient technologies to underpin them.
This is leading to more data being created and shared across the network, which in turn causes delays in transmission and response times, known as latency. To overcome such network congestion, data must be stored and processed close to where it is generated and consumed; this approach is known as Edge Computing.
One of the challenges that emerges from this prolific growth at the Edge is the energy demand fuelling the transformation. The cost of energy production and the need to shift to more sustainable operations have long required designers of large data centres to embrace sustainability strategies. Now the same attention must be paid to the design of smaller facilities at the Edge.
Energy demands at the Edge
Today, various analyses suggest that data centres represent 1-2% of global electricity consumption, and that by 2030 IT could consume as much as 3,000 TWh of energy, potentially doubling its share of global electricity use. At the Edge, deploying 100,000 data centres, each consuming 10kW of power, would create a demand of 1,000MW for the IT load alone. Assuming a moderate Power Usage Effectiveness (PUE) ratio of 1.5, these systems would also emit the equivalent of roughly 800,000 tons of CO2 annually.
However, if each Edge facility were standardised and designed for a PUE of 1.1, total CO2 emissions could be reduced to roughly 580,000 tons annually. Clearly, there is a need to apply the same due diligence to reducing power consumption at the Edge as has long been applied to larger data centres. Consequently, there is also a clear benefit in producing pre-integrated systems where standardisation, modularity, performance and sustainability form fundamental components.
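As a rough illustration of the arithmetic behind these figures, the short sketch below reproduces the calculation. The carbon-intensity factor is not stated here, so the value used (about 0.061 kg CO2 per kWh, back-calculated from the 800,000-ton figure) is an assumption for illustration only, as are the variable names.

```python
# Back-of-the-envelope reproduction of the Edge energy/CO2 figures above.
# The carbon intensity is NOT given in the article; ~0.061 kg CO2/kWh is
# back-calculated from the quoted 800,000-ton figure and is an assumption.

SITES = 100_000          # number of Edge data centres
IT_LOAD_KW = 10          # IT load per site (kW)
HOURS_PER_YEAR = 8_760
CO2_KG_PER_KWH = 0.061   # assumed grid carbon intensity (illustrative only)

def annual_footprint(pue: float) -> tuple[float, float]:
    """Return (annual energy in TWh, annual CO2 in tons) for a given PUE."""
    it_power_mw = SITES * IT_LOAD_KW / 1_000               # 1,000 MW of IT load
    facility_kwh = it_power_mw * 1_000 * pue * HOURS_PER_YEAR
    energy_twh = facility_kwh / 1e9
    co2_tons = facility_kwh * CO2_KG_PER_KWH / 1_000
    return energy_twh, co2_tons

for pue in (1.5, 1.1):
    twh, tons = annual_footprint(pue)
    print(f"PUE {pue}: ~{twh:.1f} TWh/yr, ~{tons / 1_000:.0f}k tons CO2/yr")
# PUE 1.5: ~13.1 TWh/yr, ~802k tons CO2/yr
# PUE 1.1: ~9.6 TWh/yr, ~588k tons CO2/yr
```

Under these assumptions, the difference between a PUE of 1.5 and 1.1 is worth roughly 3.5 TWh and over 200,000 tons of CO2 every year across the fleet.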
These building blocks offer users the ability to design, build and operate Edge data centres for greater sustainability, while energy-efficient technologies such as lithium-ion UPS and liquid cooling can help to reduce burdens on the system, overcome potential component failures and allow for higher performance without negatively affecting PUE.
Open and vendor-agnostic, next-generation Data Centre Infrastructure Management (DCIM) platforms are also essential, not just from a remote monitoring perspective, but to drive energy efficiency, security and uptime. However, with Edge demands accelerating, how can industry professionals get an understanding of the impact Edge Computing is having on the world’s energy consumption, and how focusing on efficiency and sustainability can influence that?
Forecasting Edge energy consumption
Schneider Electric has recently developed a new TradeOff Tool, the Data Centre & Edge Global Energy Forecast, which helps users model possible global energy consumption scenarios based on a set of pre-input assumptions. These include the design of the physical infrastructure systems and their associated PUE ratings, as well as the anticipated growth of data centre and Edge loads between now and 2040.
Based on these assumptions, the tool generates several forecast charts: total energy (TWh) consumed by Edge and centralised data centres, total IT energy (TWh), the overall energy mix between the Edge and centralised sectors, and the IT energy mix between the two.
In terms of data, the model utilises a capacity analysis created by IDC in 2019, from which Schneider Electric derived the likely split of IT load between centralised data centres and the Edge in 2021: 65% at the centre and 35% at the Edge. For 2040, the default ratios are 44% and 56% respectively.
Based on these assumptions, the growth rates for centralised and Edge data centres are calculated at 6% and 11% per year respectively. The tool allows these values to be adjusted by the user to reflect differing growth rates as conditions and/or assumptions change.
To derive the non-IT energy consumed by activities such as cooling and lighting, PUE values are estimated on the assumption that, as infrastructure technology evolves and becomes more efficient with each generation, PUE ratings will also improve. For example, the tool's default values assume that a centralised data centre's PUE will improve from 1.35 in 2021 to 1.25 in 2040, and that the average PUE of Edge Computing facilities will improve from 2.0 in 2021 to 1.5 in 2040.
PUE ratios are also adjustable, meaning the user can run the tool under different possible scenarios to see the impact that Edge Computing has on energy consumption and carbon emissions.
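The tool itself is interactive, but the sketch below shows how this kind of forecast can be assembled from the stated defaults (a 65%/35% IT-load split in 2021, growth of 6% and 11% per year, and PUE improving from 1.35 to 1.25 centrally and from 2.0 to 1.5 at the Edge). The 2021 baseline IT energy figure, the linear PUE interpolation and the function names are placeholders and assumptions, not the tool's actual method; the structure, rather than the numbers, is the point.

```python
# Simplified, illustrative forecast in the spirit of the TradeOff Tool.
# Only the split, growth rates and PUE defaults come from the article;
# the 2021 baseline of ~350 TWh of total IT energy is a placeholder.

BASELINE_IT_TWH = 350                              # assumed 2021 IT energy (placeholder)
SPLIT_2021 = {"central": 0.65, "edge": 0.35}       # default IT-load split in 2021
GROWTH = {"central": 0.06, "edge": 0.11}           # default annual growth rates
PUE_2021 = {"central": 1.35, "edge": 2.0}
PUE_2040 = {"central": 1.25, "edge": 1.5}

def forecast(year: int) -> dict[str, float]:
    """Total (IT plus facility) energy in TWh per sector for a given year."""
    years = year - 2021
    totals = {}
    for sector in ("central", "edge"):
        it_twh = BASELINE_IT_TWH * SPLIT_2021[sector] * (1 + GROWTH[sector]) ** years
        # Assume PUE improves linearly between the 2021 and 2040 defaults
        pue = PUE_2021[sector] + (PUE_2040[sector] - PUE_2021[sector]) * years / 19
        totals[sector] = it_twh * pue
    return totals

for y in (2021, 2030, 2040):
    totals = forecast(y)
    edge_share = totals["edge"] / sum(totals.values())
    print(f"{y}: central {totals['central']:.0f} TWh, "
          f"edge {totals['edge']:.0f} TWh ({edge_share:.0%} at the Edge)")
```

Notably, compounding the default 6% and 11% growth rates from a 65%/35% baseline lands the 2040 IT-energy split at roughly 44% central and 56% Edge, consistent with the default ratios quoted above.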
Final thoughts
With dependency on mission-critical infrastructure continuing to increase at a dramatic rate, energy efficiency and sustainability must become central factors in the rollout of Edge Computing infrastructure. Greater accuracy, especially in terms of energy use, is essential; operators cannot afford to hit and hope, or to become more efficient as they go.
While energy management software remains critical, it is the design of these systems that offers end-users a truly practical means of ensuring sustainability at the Edge. Greater standardisation, modularity, resilience, performance and efficiency must form the building blocks of Edge environments.
Further, by considering energy efficient deployment methodologies and embracing a culture of continuous innovation, operators can choose a more sustainable approach to Edge Computing.