
Diverse opportunities amid AI ‘generational paradigm shift’

03 September 2025

In our view, AI represents a generational paradigm shift in how consumers and enterprises interact with and use computing services.

By Bobby Edemeka, Jennison Associates

The rapid deployment of generative artificial intelligence (AI) has triggered a global race to build high-density data centres – facilities that consume significantly more power than those running traditional workloads.

The International Energy Agency projects that global data centre electricity consumption will double by 2030, with hyperscalers, data centre operators and asset managers committing significant capital to construct larger, high-capacity, next-generation data centres.

Ten years ago, a 30 megawatt (MW) data centre was considered large; today, a 200 MW data centre is considered normal, and several hyperscalers are currently planning AI data centre campuses with power demands of 1 gigawatt (GW) or more.

McKinsey estimates that US data centres alone will need around 18 GW of additional power capacity by 2030. For comparison, New York City's total power demand is currently around 6 GW.

In other words, to meet the growing power demands of AI, the US is expected to add the equivalent of three New Yorks to its power grid by 2030.
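
That equivalence is simply the ratio of the figures cited above (a back-of-the-envelope check, taking the 18 GW and roughly 6 GW estimates at face value):

\[
\frac{18\ \text{GW (additional US data centre capacity by 2030)}}{6\ \text{GW (current New York City demand)}} = 3\ \text{New York equivalents}
\]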

 

Power and infrastructure demand

In our view, AI represents a generational paradigm shift in how consumers and enterprises interact with and use computing services. For enterprises, AI offers enhanced efficiency, superior execution, strategic differentiation and deeper insights.

For consumers, AI provides instantaneous access to information, personalised content experiences and advanced problem-solving capabilities.

The latest AI models – known as inference-time scaling models, or reasoning models – have the potential to deliver these capabilities at new levels of efficiency and effectiveness. These models can reflect, reassess, and revise their answers, making them far more sophisticated and capable of handling complex, real-world tasks.

From an energy perspective, these reasoning models require significantly more compute power, as they engage in longer, more resource-intensive inference cycles.

As these models become the standard for AI interactions, they are expected to meaningfully accelerate demand for power and infrastructure.

The launch of DeepSeek R1, a generative AI model from a Chinese startup, challenged assumptions about China’s competitiveness in AI by matching top-tier US models in performance while operating on less powerful – and less expensive – hardware.

While DeepSeek’s performance relative to its cost is impressive, the company’s claimed training cost advantages can be misleading, as they are not directly comparable to those of models developed by leading US companies.

Nonetheless, as efficiency improves, we believe AI will become more affordable and accessible, accelerating adoption across consumers, enterprises, and the broader tech ecosystem.

This dynamic also illustrates the Jevons Paradox – the idea that as technological efficiency increases, total consumption can actually rise rather than fall – suggesting that lower AI costs may ultimately drive greater demand for compute and power, not less.
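
To see how the paradox can play out in this context, consider a purely hypothetical illustration (the figures below are illustrative, not forecasts): if efficiency gains cut the compute needed per AI query to one tenth of today's level, but cheaper access lifts query volumes twentyfold, total compute – and therefore power – demand doubles rather than falls:

\[
\text{total compute} = \underbrace{\tfrac{1}{10}}_{\text{compute per query}} \times \underbrace{20}_{\text{number of queries}} = 2 \times \text{today's level}
\]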

 

Expanding prospects for investors

The power demands of AI are creating a wide and expanding opportunity set for investors. While nuclear energy often grabs headlines, the infrastructure required to support AI extends far beyond nuclear power generation alone.

Utilities are already aligning capital investments with tech-driven demand. New solar, wind and natural gas-powered generation – some of it built alongside data centres – is expected, on a combined basis, to play an even larger role than nuclear in meeting the growing power demands of AI.

Utilities are also investing in modernising their transmission and distribution grids to leverage underutilised generation capacity and ensure grid stability, especially for AI data centres, which typically require very high levels of reliability.

But meeting AI’s electricity needs also requires a broader ecosystem: data centres depend heavily on advanced systems to manage heat from high-density compute workloads, creating opportunities for companies specialising in cooling technologies.

Additionally, natural gas is expected to play a key role in bridging near-term energy needs, given the scalability and reliability of gas-fired power plants.

As AI's growth accelerates, a diverse set of energy and infrastructure providers – large and small – stands to benefit from this structural shift. Rising adoption brings rising electricity demand, reshaping the landscape of global infrastructure and opening a broad array of investment opportunities.

This shift is not cyclical but structural, driven by a new generation of AI models that require more compute, more power, and more infrastructure. For long-term investors, the rise of AI represents not just a technological revolution – but a foundational transformation of the global energy economy.

Bobby Edemeka is an income and infrastructure portfolio manager at Jennison Associates. The views expressed above should not be taken as investment advice.

