Artificial Intelligence will continue to propel change across the datasphere

Yakov Danilevskiy – Vice President Strategy, Channels and Marketing, Schneider Electric, argues that with ever more compute-intensive workloads on the horizon for business, including AI itself, AI must be leveraged to ensure efficiency, security and sustainability.

The impact of Artificial Intelligence (AI) is being felt widely across the world of data infrastructure. While AI workloads are changing how we build and operate data centres, AI’s influence is also being seen elsewhere, as it accelerates existing development trends in areas such as industrial IoT, cloud computing, Edge Computing, data centre operations and beyond.

Added to this is the increasing proximity of Quantum Computing, making the next 12-18 months an exciting time for the information and communications technology (ICT) industry.

AI and optimisation

AI will continue its rapid acceleration, providing insights and optimisation, as well as having an impact on how data centres are designed as AI workloads grow and develop. We will build on our expertise to help businesses not just adopt but optimise for AI, such as with the recent investment in specialist liquid cooling technologies. Liquid cooling is becoming an increasingly important aspect of infrastructure that will support tomorrow’s ultra-dense workloads, and will require specialist implementation knowledge and expertise for new designs and retrofitting.

Automation in business generally, and in the data centre, will see more being done without human intervention, freeing people for more innovative and value-driven work.

Analysts also predict increased use of AI-driven digital assistants, such as tutors, coaches, carers, therapists and even lawyers. AI assistants will be leveraged in our jobs too, aggregating search engines, knowledge bases and professional resources. They are also being trained to monitor and improve teamwork, identifying bottlenecks and impediments, and helping people to work more closely and effectively together.

The application of AI has also been recognised as having the potential to mitigate 5 to 10% of global greenhouse gas emissions, according to research from Boston Consulting Group. In this context, data centres have evolved to not only handle AI’s computational intensity but to integrate smarter, more efficient systems that optimise energy use, reduce latency and ensure seamless connectivity.  

Edge Computing
IDC sees the Edge Computing market reaching US$350 billion by 2027. AI will be a huge influence here, with Edge deployments bringing the benefit of the technology closer to where it is needed. It will also serve to optimise Edge deployments generally.

And there are other opportunities for refinement and optimisation too. Accelerated computing is described as the use of specialised hardware to dramatically speed up work, using parallel processing to bundle frequently occurring tasks. It offloads demanding work that would otherwise bog down CPUs, which typically execute tasks in serial fashion.

It is being argued that accelerated computing can help businesses strike the right balance by reducing overall energy consumption and costs for compute-heavy workloads. GPUs may draw more power at peak than CPUs, but they can also complete tasks much faster. When total energy consumption is compared, GPUs can use less energy while delivering faster results, making them a strong option for tasks like developing Large Language Models and running simulations.
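
As a back-of-the-envelope illustration of that point, total energy is simply power multiplied by run time, so a higher-power accelerator that finishes a job far sooner can still draw less energy overall. The sketch below uses assumed, illustrative power draws and run times rather than figures for any particular hardware.

```python
# Illustrative comparison of total energy: energy (kWh) = power (kW) x time (h).
# The power draws and run times below are assumed figures for illustration only.

def total_energy_kwh(power_watts: float, hours: float) -> float:
    """Return total energy in kilowatt-hours for a device running at a steady draw."""
    return (power_watts / 1000.0) * hours

# Hypothetical batch job: the GPU node draws more power but finishes much faster.
cpu_energy = total_energy_kwh(power_watts=300, hours=40)   # CPU-only run
gpu_energy = total_energy_kwh(power_watts=700, hours=6)    # GPU-accelerated run

print(f"CPU-only run:        {cpu_energy:.1f} kWh")
print(f"GPU-accelerated run: {gpu_energy:.1f} kWh")
print(f"Energy saved:        {cpu_energy - gpu_energy:.1f} kWh")
```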

Evolving AI
AI workloads are commonly divided into two main types: training, which involves building models, and inference, which is used for decision-making, content generation and automation. Recently, AI inference has advanced significantly within data centres, particularly due to the emphasis on Edge Computing for real-time data processing.

Contrary to initial expectations that small, efficient inference clusters would be built closer to users, many large companies are repurposing their substantial training clusters for inference tasks because of their availability. This has led to the emergence of data centre inferencing, where large training clusters are used for inference tasks, even though they are overpowered for the purpose.

The trend is expected to gradually shift towards Edge Computing for inference, as Edge devices offer better efficiency, lower latency, higher data security and customisation. However, until then, data centre inferencing will continue to be the primary method, despite its inefficiencies for smaller tasks.
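
The latency side of that trade-off can be sketched in a similarly rough way: the response time a user sees is approximately the network round trip plus the compute time, so a modest Edge device sitting next to the data source can respond sooner than a far more powerful remote cluster. The millisecond values below are assumptions chosen purely for illustration.

```python
# Rough model of end-to-end inference latency: network round trip + compute time.
# All millisecond values are assumed for illustration, not benchmarks.

def end_to_end_latency_ms(network_rtt_ms: float, compute_ms: float) -> float:
    """Total time for one inference request: transit to the compute plus the inference itself."""
    return network_rtt_ms + compute_ms

# Hypothetical small inference task, e.g. a single sensor reading classified in real time.
edge = end_to_end_latency_ms(network_rtt_ms=2, compute_ms=25)   # local Edge device
dc = end_to_end_latency_ms(network_rtt_ms=60, compute_ms=5)     # remote data centre cluster

print(f"Edge device:     {edge:.0f} ms")
print(f"Data centre GPU: {dc:.0f} ms")
```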

New workloads
Compute-intensive technologies such as blockchain will continue to grow, and will have unique requirements as they do. Supply chains, international finance systems and the likes of identity and credentials management will all grow as use cases, with others emerging too.

Another major trend already having an impact is the tokenisation of real-world assets (RWA). This will have a direct impact on data infrastructure as more enterprises seek to hold, move and trade digital assets, with implications for cybersecurity, storage, cryptography and other data disciplines. With benefits such as near-instant transfer, settlements and reconciliation, tokenisation will also be a major element of environmental reporting and in areas such as carbon credits.

New architectures
Decentralisation in certain areas will also affect how applications and services are hosted and consumed. There are already predictions that decentralisation will facilitate new agile business ecosystems.

The needs of modern businesses, argues Accenture, along with advancements in digital technologies, are forcing a more decentralised digital operating model on many companies today. The monolithic applications of the past, it says, are giving way to purpose-built applications and services that can be easily scaled using cloud, APIs and microservices. While centralised models offered greater control and visibility over network data, decentralised digital services are becoming essential for the adoption of new technologies and faster decision-making, while increasing resilience and security at the Edge. Those businesses that have already got a handle on Edge Computing will likely have a head start in understanding and accommodating decentralised architectures.

Cybersecurity
In the same way that AI and ML are being used in data centre operations for predictive and preventative maintenance, those same technologies are being used to similar effect in cybersecurity. AI supports the automation of threat detection and response, as well as monitoring and prevention functions.
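
As a minimal sketch of what AI-assisted threat detection can look like, the example below trains an unsupervised anomaly detector on synthetic network telemetry and flags outliers for review; the features, figures and contamination rate are illustrative assumptions, not a description of any particular security product.

```python
# Minimal anomaly-detection sketch using an Isolation Forest (scikit-learn).
# The synthetic "telemetry" and the contamination rate are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Normal traffic: [requests per minute, mean payload size in KB]
normal = rng.normal(loc=[200, 40], scale=[20, 5], size=(1000, 2))
# A handful of suspicious bursts: very high request rates with tiny payloads
suspicious = rng.normal(loc=[900, 2], scale=[50, 1], size=(10, 2))

telemetry = np.vstack([normal, suspicious])

# Fit an unsupervised detector; 'contamination' is the assumed share of anomalies.
detector = IsolationForest(contamination=0.01, random_state=0)
labels = detector.fit_predict(telemetry)   # -1 = anomaly, 1 = normal

flagged = np.where(labels == -1)[0]
print(f"Flagged {len(flagged)} of {len(telemetry)} samples for review")
```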

AI will be a constant thread through all of these developments, accelerating their effects. Businesses need to prepare for an ‘AI with everything’ future, where AI is embedded, not tacked on.

Even cybersecurity professionals acknowledge that in this area there is both promise and peril. AI-driven cyberattacks are inevitable, as is AI automation. Consequently, AI must be seen as both a tool and a vulnerability, and treated as such.

Quantum Computing: Prepare now
Quantum Computing is developing at a rapid pace too, and even though experts believe practical quantum computers are still some way off, there are practical considerations to address before then, such as post-quantum cryptography.

Quantum supremacy, where quantum computers do what no classical machines can, may still be in the future, but when it arrives, current non-quantum systems need to be ready. Cryptography is a first step here, and the time to prepare is now.

Post-quantum cryptography is an area of activity in which cryptographic algorithms have been developed to resist the power of quantum computers. The US National Institute of Standards and Technology (NIST) has coordinated such work for some time now and has already released viable algorithms, allowing organisations in areas such as government, public services, financial services and national security to secure sensitive data and processes.
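
To make this concrete, the sketch below establishes a shared secret with ML-KEM, one of the algorithms NIST has standardised, using the open-source liboqs Python bindings; the choice of library and the exact algorithm name string are assumptions and may differ in a given environment.

```python
# Post-quantum key encapsulation sketch using the liboqs Python bindings ('oqs').
# Assumptions: liboqs-python is installed and the algorithm name string below is
# supported by the installed version (older releases use "Kyber768" instead).
import oqs

ALG = "ML-KEM-768"  # NIST-standardised KEM (FIPS 203); name may vary by library version

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    # Receiver publishes a public key; the private key stays inside 'receiver'.
    public_key = receiver.generate_keypair()

    # Sender encapsulates a fresh shared secret against that public key.
    ciphertext, sender_secret = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext to recover the same shared secret.
    receiver_secret = receiver.decap_secret(ciphertext)

    assert sender_secret == receiver_secret
    print(f"Established a {len(sender_secret)}-byte quantum-resistant shared secret")
```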

These cryptographic measures are yet another example of the kind of compute-intensive workloads that are likely to be faced by data centre owners and operators in the near future, which must be implemented efficiently, securely and with sustainability front of mind.

Looking forward, AI will be an ever more important workload for businesses, but it will also be a critical tool for enabling and managing the more complex and sophisticated data architectures that will be necessary.

With considerations for cybersecurity, as well as future developments in computing, AI will be more integrated than ever before, requiring new levels of understanding and orchestration. Through careful application and broader awareness of its impact, both positive and negative, businesses can optimise not just their data infrastructure but their whole organisation to realise the benefits.
