As everyone rightly expected, AI has so far featured very prominently at this year’s pro AV and broadcast trade shows. From AI-powered content localisation services with the ability to make media streaming significantly more streamlined, to the use of AI technologies that can harvest, sift and interpret data in meaningful and commercially beneficial ways, there has been an abundance of solutions aimed at making AI accessible to the complete spectrum of end-users.
Of course, some of these solutions have been more ‘AI-ready’ than others. There is undoubtedly a strong – and, arguably, quite healthy – strain of scepticism about AI and ML claims, which are sometimes appended to products as little more than marketing gimmicks. The distance between ‘theoretical’ and ‘achievable’ has, it might be argued, rarely been so great in recent memory.
However, the productisation of AI is far from being the only concern. You do not have to look far to encounter uncertainty and confusion, from the extremely variable regulatory outlook for AI in different territories around the world, to the impact on specific sectors and markets, and – perhaps the most significant issue of all – the quite staggering implications widespread AI deployment will have for power consumption and sustainability. In particular, it seems that the data centre infrastructure of today isn’t capable of supporting some of the more experiential, extended reality (XR) AI applications currently being discussed – more of which anon.

So what follows is a deep-dive into: the current outlook for AI applications; the extent to which an understanding is emerging about the long-term power consumption and data centre requirements of AI, along with the related environmental impact; the technical advances that need to happen, including more efficient cooling and operational practices; and, finally, the not uncomplicated geopolitical context that could throw all manner of ‘spanners in the works’ over the next few years.
A PASSING THOUGHT
Craig Bury is CTO of Three Media Consulting, whose specialisms include cloud innovation and transformation, IP and virtual operations, process efficiency and technology strategy. Just back from the latest NAB Show at the time of interview, Bury indicates that the event encapsulated the presently rather vexed AI landscape.
“There’s a lot of AI out there, and it was certainly everywhere [at NAB],” he confirms. “The problem is that the majority of it, in my professional opinion and without wishing to be derogatory, appears to be almost a passing thought, and is in many cases not really integrated into the product sets. Now, there are vendors out there who have spent the time to integrate AI. Most are using various cloud services to augment their products, mainly to increase workflow and process efficacy. A number of these are using AWS, Azure and GCP as well as other hyper-scalers to host AI stacks and models from other suppliers.”
Bury cites a couple of vendors whose AI solutions impressed him at NAB. Microsoft and UIC Digital showed Content Understanding and ChatGPT on Azure performing automated highlight generation and dynamic voiced commentary for basketball content, while Ad Signal showcased a product called Assure that allows users to track the performance of their campaigns and connect playout to endpoint delivery with near-real-time playout detection.
It is in the more well-defined areas of content production and workflow – where AI can bring greater efficiencies and insights – that Bury evidently feels AI has the greatest potential in the short to mid-term. But as for some of the more vaultingly ambitious, extended reality-type applications being bandied around at present? “I’d say that, in some cases, they are aspirational at best.”

A fundamental reason why the more pioneering AI applications might be difficult to realise is the availability of data centres with the capacity to serve them. Installation suggests to Bury that there appears to be a dearth of detailed research into the infrastructural implications of AI. “And therein lies the crux of the problem,” he responds. “I’ve been seeing calculations of requirements of around 120 kilowatts per rack in the coming year or two. Suppliers [like] Nvidia, with their next generation GPUs, will see power requirements approaching 600 kilowatts per rack in the data centres, which will clearly be liquid cooled; [Nvidia GPU architecture] Blackwell and all of these things present huge power demands. Bearing in mind I go back to the days of building data centres with 4 kilowatts per rack and thinking that seemed a lot…!”
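To put those densities into context, here is a minimal back-of-the-envelope sketch in Python. The per-rack figures are the ones Bury cites; the 200-rack hall size and the 1.2 overhead factor are purely illustrative assumptions of our own, not vendor or operator data.

```python
# Back-of-the-envelope facility power at different rack densities.
# Per-rack figures are those cited above; the hall size and PUE
# overhead are illustrative assumptions, not vendor or operator data.

RACKS_PER_HALL = 200   # hypothetical data hall
PUE = 1.2              # assumed facility overhead (cooling, power losses)

densities_kw = {
    "legacy (~4 kW) racks": 4,
    "current AI (~120 kW) racks": 120,
    "next-gen GPU (~600 kW) racks": 600,
}

for label, kw_per_rack in densities_kw.items():
    it_load_mw = RACKS_PER_HALL * kw_per_rack / 1000
    facility_mw = it_load_mw * PUE
    print(f"{label}: {it_load_mw:.1f} MW IT load, ~{facility_mw:.1f} MW from the grid")
```

On those assumptions, a single 200-rack hall jumps from under 1MW at legacy densities to roughly 144MW at next-generation densities – the scale of a small power station, which is precisely the generating-capacity problem Bury goes on to describe.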
But the fact remains that “the big thing now facing data centres in places like the Ashburn area of northern Virginia, for example, is that they don’t have the power infrastructure or generating capacity [they will need as AI develops]”, he says. “They’re talking about years to get the infrastructure built to support these applications. So at some point a line does have to be drawn and people need to recognise the reality, which is that there is a timeline and some of these things may take decades to deploy.”
In short, there is an acute need for a tone of pragmatism that has sadly been lacking from a lot of the AI discussion so far. “The pragmatic side is you might not have the power generation, or the infrastructure needed to deliver that power. There’s also a whole consumer side to consider, such as media companies having a cogent business plan that allows them to properly use those resources. They might have to utilise X amount of resources to a certain degree in order for it to make sense from a business perspective, and I’m guessing a lot of them won’t be able to do that unless they can scale up properly. So there are all of these other factors that have to be thought about very carefully.”
SUPPLY & DEMAND
A sense of quite how much needs to be done to meet the AI needs of the future is gleaned from a conversation with Luisa Cardani, head of data centres at UK technology trade association techUK. Among its activities, techUK has operated a Data Centres Council since 2009, with 20 individual members – who are elected every two years – representing the full spectrum of business interests and models across the data centre sector, and providing strategic direction related to data centres.
“The demand for data centres has been steadily increasing and is set to increase at a much faster rate than it has before due to the continued advancements of AI and other emerging technologies,” confirms Cardani. “The growth in demand for digital products and services, and the advent of data-driven innovation and leaps forward in technology, such as AI, will need to be matched by the supply of data centre capacity.”

Between 2020 and 2024, it is estimated that the total capacity of UK data centres grew by an average of 10% each year. The same rate in the coming years would satisfy “future demand to some degree, but it would be unlikely to match anticipated demand increases,” says Cardani. “Projections for future demand for data centre capacity have been put at between 10% and 20% a year. This is a huge economic opportunity that will only grow as data centres continue to facilitate innovation advances within the tech sector. While the demand for data centres is there, the supply to meet the demand is a concern.”
Providing further context, a growth rate of 10% a year would put data centres among the fastest-growing industries in the UK – faster, by way of comparison, than life sciences (6%) and the digital and creative industries (8%). But if growth were pushed to 15% a year, data centres would be approaching the expansion rates of gaming (16%) and FinTech (19%). However, notes Cardani, “this supply increase can only be met by a favourable environment to build and operate data centres in the UK”.
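The difference those few percentage points make compounds quickly. A minimal sketch, assuming simple year-on-year compounding from a starting capacity normalised to 1.0:

```python
# Compound growth of data centre capacity under the projected annual
# rates quoted above. Starting capacity is normalised to 1.0 purely
# for illustration; real capacity additions arrive in discrete builds.

for rate in (0.10, 0.15, 0.20):
    capacity = 1.0
    for _ in range(5):          # five years of growth
        capacity *= 1 + rate
    print(f"{rate:.0%} a year -> x{capacity:.2f} capacity after five years")
```

At 10% a year, capacity multiplies by roughly 1.6 over five years; at 20%, it very nearly multiplies by 2.5 – a useful yardstick for the gap between satisfying demand “to some degree” and actually matching it.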
COOL RUNNINGS?
Nonetheless, it is perfectly understandable that as more data centres are greenlit, there are also increased public concerns about sustainability and long-term environmental impact. In particular, there are escalating worries about the considerable cooling requirements of data centres, with the US-based National Renewable Energy Laboratory indicating that cooling typically accounts for 40% of a data centre’s total power consumption.
It’s true that various approaches and design methodologies are being explored to help alleviate the problem. For instance, there is a shift to building data centres in locations with cooler climates and/or nearby sources of cold water. Resource optimisation is also being explored through initiatives such as prefabricated and modular (PFM) data centres – allowing construction to take place off-site – and the exporting of excess heat, “which has the potential to improve their own sustainability and contribute to the UK’s green transition.”

Cardani also points to steps being taken to “mitigate environmental impacts from continuous energy consumption”, such as Renewable Energy Guarantees of Origin (REGO)-backed tariffs and the use of PPAs (power purchase agreements) to bring new renewable capacity to the grid. “Furthermore, many data centres participate in Climate Change Agreements (CCAs), a government scheme to encourage greater uptake of energy efficiency measures amongst companies in energy intensive industries.”
The demands on water resulting from a significant increase in data centre numbers have lately become a topic of mainstream news coverage in countries including the UK, where decades of neglect surrounding the construction of support infrastructure like reservoirs have contributed to fears that supplies of drinking water to homes and businesses could be under threat. However, Cardani says that, in the UK, “most data centres rely on systems that are not water intensive. Many are also looking at innovative liquid-cooling systems that typically involve closed-loop distribution designs and evaporative thermal emitters that minimise water consumption by recirculating it within the same system.”
Nonetheless, there are a number of industry initiatives taking place that aim to address water-related concerns. For instance, the CNDCP (Climate Neutral Data Centre Pact) brings together cloud infrastructure services and data centre operators in Europe with the aim of achieving climate neutrality by the end of 2030. Among other measures, it has helped operators to set and work towards annual targets for water usage effectiveness, and to explore other avenues such as rainwater harvesting or boreholes to reduce reliance on water that is suitable for drinking.
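Water usage effectiveness (WUE), the metric referred to above, is conventionally defined by The Green Grid as annual site water use divided by IT equipment energy. A minimal sketch of the calculation – the input figures below are invented purely for illustration:

```python
# Water usage effectiveness (WUE), as defined by The Green Grid:
#   WUE = annual site water use (litres) / IT equipment energy (kWh)
# Both input figures are hypothetical, chosen only to show the maths.

annual_water_litres = 25_000_000   # hypothetical annual site water use
it_energy_kwh = 50_000_000         # hypothetical annual IT energy draw

wue = annual_water_litres / it_energy_kwh
print(f"WUE = {wue:.2f} L/kWh")    # lower is better; closed-loop and
                                   # waterless designs approach zero
```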
Looking ahead, adds Cardani, “increasing regulations are expected to put more emphasis on water usage disclosure, requiring data centres to monitor and report their water consumption more transparently. Through techUK, the industry is already engaging with the Environment Agency to track and improve water management practices.”
FUTURE DESIGNS
Providing a specific data centre perspective is Simon Anderson, VP construction at VIRTUS Data Centres, which currently has 185MW of IT load operational in the UK, split over 11 sites. The company is presently developing two new sites in Slough and Stockley Park, totalling 43MW and due to go live between June 2025 and February 2026. It is also working on a campus site in Buckinghamshire and constructing two “megacampus” projects in Berlin.
Anderson says that VIRTUS’ data centres are all “designed to be as efficient as possible not only when operating, but also in terms of equipment deployed. By closely matching the capacity of the equipment to the design loads, we are deploying the smallest possible number of items of plant – reducing physical assets needed on site and therefore the overall impact of manufacture (fewer units, less impact).”

In addition, he says, “there is a far greater focus on how our contractors manage their own processes to minimise the environmental impact. Examples of this are: water for flushing the cooling system is filtered and recycled (as opposed to disposing to drains); site plant is, wherever possible, electrically powered (reducing emissions from traditional diesel-powered machinery); and separation and recycling of all site waste.”
Anderson also alludes to PUE (Power Usage Effectiveness), a metric employed to measure the energy efficiency of a data centre. All new VIRTUS facilities are designed to offer a PUE of less than 1.2, but “with the introduction of liquid cooled chipsets, VIRTUS sees further opportunities to improve on the PUE; [indeed] in an existing deployment of this technology at one of our sites, a PUE of 1.1 was achieved. However, with the improved efficiency of the chipsets, it may be possible to improve on this still further.”
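For readers unfamiliar with the metric, PUE is simply total facility power divided by the power delivered to IT equipment, so a figure of 1.0 would mean zero overhead. A minimal sketch, with hypothetical inputs chosen to land near the values VIRTUS quotes:

```python
# Power usage effectiveness (PUE):
#   PUE = total facility power / IT equipment power
# Input figures are hypothetical, picked to land close to the ~1.1
# liquid-cooled deployment figure quoted above.

it_power_mw = 10.0     # hypothetical IT load
overhead_mw = 1.1      # hypothetical cooling, lighting and power losses

pue = (it_power_mw + overhead_mw) / it_power_mw
print(f"PUE = {pue:.2f}")   # ~1.11: close to the 1.1 VIRTUS achieved
```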
CHAOS THEORY
Ah yes – chipsets. While the development of a new generation of chips able to support AI isn’t in much doubt, the same cannot be said of the surrounding geopolitical context. The Trump administration’s imposition of trade tariffs has made chip production a particular source of scrutiny, not least as Asia is currently responsible for producing more than 80% of the world’s semiconductors. If it’s the US government’s intention to dramatically boost domestic chip production, it doesn’t seem to have taken account of the significant design and build times involved in creating new foundries – typically around four years.
Factor in the many other troubling facets of the current geopolitical malaise, along with the deeply variable legislative outlook – for instance, some parts of the EU AI Act have now come into force, while consultations are still ongoing in many other countries – and it’s clear that this is a particular area of development where nothing should be taken for granted. That said, it appears that some of the more outlandish predictions for AI applications are likely to take far longer to realise than might have been expected, ensuring that industry focus remains more firmly on those solutions that improve the efficiency of existing activities.
Most positively, this could also mean that some of the more troubling, and potentially damaging, AI applications never quite manage to leave the R&D labs.