NVIDIA, Emerald AI and the new electricity frontier of artificial intelligence

NVIDIA and Emerald AI are redefining the AI race through electricity, distributed compute and flexible AI factories. The next trillion-dollar infrastructure layer may not be chips alone, but the orchestration of power, grids, micro data centres and the emerging inference economy.

Image: Futuristic suburban communities powered by renewable energy and local AI data centres, illustrating the rise of the distributed inference economy.

The next AI race will not be won only inside a model lab. It will be won at the edge of the electricity grid, inside substations, behind batteries, across fibre networks and in the software layer that decides when intelligence should compute, pause, shift or sell flexibility back to the power system.

That is why NVIDIA’s partnership with Emerald AI is more than another infrastructure announcement. It is a signal that the artificial intelligence economy is becoming an energy economy. NVIDIA and Emerald AI are working with AES, Constellation, Invenergy, NextEra Energy, Nscale Energy & Power and Vistra to create “flexible AI factories” that can connect to power grids faster, generate AI tokens and operate as grid-supporting assets rather than fixed electricity loads.

The company behind the shift

Emerald AI is led by Dr Varun Sivaram, its founder and CEO. He is not a typical software founder: his background spans energy technology, policy and capital markets. He is a senior fellow for energy and climate at the Council on Foreign Relations and has held senior roles at or alongside Ørsted, ReNew Power and McKinsey, as well as in US climate diplomacy.

That matters because Emerald’s thesis is not simply “AI needs more power”. Its deeper claim is that AI workloads can become flexible. In simple terms, not every AI computation has to happen at the same second, in the same building, at the same intensity. Some workloads can be delayed, shifted, throttled or coordinated with batteries and local generation.
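The idea of workload flexibility can be made concrete with a toy scheduler. The sketch below is purely illustrative; the job classes, the price threshold and the `grid_price` signal are assumptions for the example, not Emerald's actual platform or API. Deferrable jobs wait out expensive or stressed grid hours, while latency-sensitive jobs always run.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Job:
    name: str
    deferrable: bool  # can this workload wait for cheaper or cleaner power?

@dataclass
class ToyScheduler:
    """Illustrative only: run firm jobs always; hold deferrable jobs
    until the grid signal (e.g. a price or stress index) drops."""
    price_threshold: float
    queue: List[Job] = field(default_factory=list)

    def submit(self, job: Job) -> None:
        self.queue.append(job)

    def tick(self, grid_price: float) -> List[str]:
        ran, held = [], []
        for job in self.queue:
            if job.deferrable and grid_price > self.price_threshold:
                held.append(job)      # shift this work to a cheaper hour
            else:
                ran.append(job.name)  # firm work computes regardless
        self.queue = held
        return ran

sched = ToyScheduler(price_threshold=50.0)
sched.submit(Job("chatbot-inference", deferrable=False))
sched.submit(Job("model-fine-tune", deferrable=True))
print(sched.tick(grid_price=80.0))  # only the firm inference job runs
print(sched.tick(grid_price=30.0))  # the deferred training job now runs
```

Real orchestration layers would also weigh batteries, on-site generation and service-level guarantees, but the core trade is the same: distinguish work that must run now from work that can move in time.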

Emerald’s Conductor platform is designed to orchestrate this flexibility. NVIDIA’s DSX Flex software and Vera Rubin DSX architecture bring the compute layer. Emerald brings the energy orchestration layer. Together, they are trying to turn AI factories from grid liabilities into grid participants.

Video: Varun Sivaram discusses the critical power constraints facing the AI industry, the shift toward "power-flexible" data centres, and how Emerald AI is enabling massive compute clusters to operate in harmony with the electric grid. (Source: SAIL Media)

Why capital is moving now

Emerald AI raised US$25 million in a strategic expansion round in March 2026, with backing from NVIDIA’s NVentures, Eaton, GE Vernova, Radical Ventures and others. The company says the capital will help scale its platform for data centres that align energy use with grid capacity.

This funding is arriving because the bottleneck has changed. For years, the market assumed AI scarcity would be chips. Then it became cloud capacity. Now the binding constraint is power.

The International Energy Agency projects global data-centre electricity consumption will more than double to about 945 TWh by 2030, roughly the electricity consumption of Japan today. In the United States, data centres are expected to account for nearly half of electricity demand growth between now and 2030.

McKinsey has also warned that data-centre load could represent 30 to 40 per cent of net new US electricity demand to 2030.

That is the capital-market reason Emerald matters. The company is, in effect, selling "time to power". If its software can help data centres connect sooner, reduce peak demand, preserve AI performance and avoid waiting years for grid upgrades, it becomes a strategic infrastructure company.

The early cases and demonstrations

Emerald is not only selling theory. In the United Kingdom, a first-of-its-kind live trial with National Grid showed its software could adjust data-centre power usage in real time, flexing demand up or down to support the power system. National Grid said this kind of flexibility could ease peak pressure, unlock capacity and accelerate grid connections.

In Hillsboro, Oregon, Emerald and NVIDIA demonstrated AI factories reducing power usage by up to 25 per cent during simulated grid stress events.
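The headline number from the Hillsboro demonstration, a cut of up to 25 per cent during stress events, can be illustrated with a simple power-cap calculation. This is a hypothetical sketch, not NVIDIA's or Emerald's implementation: it assumes a facility baseline in megawatts and a normalised stress signal between 0 and 1, with a linear response between them.

```python
def curtailed_power_mw(baseline_mw: float, stress: float,
                       max_curtailment: float = 0.25) -> float:
    """Scale a facility's power cap down as grid stress rises.

    stress: 0.0 (normal grid) .. 1.0 (severe stress event).
    max_curtailment: deepest allowed cut (0.25 matches the demo figure).
    Hypothetical linear response; a real platform would also weigh
    workload priorities, batteries and local generation.
    """
    stress = min(max(stress, 0.0), 1.0)  # clamp the signal to [0, 1]
    return baseline_mw * (1.0 - max_curtailment * stress)

print(curtailed_power_mw(100.0, stress=0.0))  # 100.0 MW, no event
print(curtailed_power_mw(100.0, stress=1.0))  # 75.0 MW, the full 25% cut
```

The point of even this crude model is that curtailment is a dial, not a switch: a grid operator can request partial relief without taking the facility offline.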

This is where the story becomes exciting. The conventional data centre model says: build a giant facility, demand a giant grid connection, consume power continuously. The flexible model says: build compute as an intelligent load that can interact with the grid, local batteries, local generation and market signals. That may sound subtle. It is not. It is the difference between AI as a strain on society and AI as an active participant in the energy system.

Jensen Huang’s industrial AI doctrine

Jensen Huang has been arguing that AI is not merely a software platform, but a new industrial system. His language around “AI factories” is deliberate. Factories consume raw material and produce goods. AI factories consume electricity and data, then produce tokens, decisions, predictions and intelligence.

Video: NVIDIA CEO Jensen Huang tells Bloomberg's Ed Ludlow at the Consumer Electronics Show that the artificial intelligence industry always needs more energy. (Source: Bloomberg)

NVIDIA’s own commentary says flexible AI factories could use co-located generation and hybrid power strategies to accelerate “time to power” while supporting the broader grid.

This fits NVIDIA’s broader pattern. It is investing not only in GPUs, but in fibre, networking, cooling, data-centre architecture and energy orchestration. Reuters reported that NVIDIA has made a large prepayment to Corning to expand US optical fibre production, a sign the company is pushing deeper into the physical supply chain of AI infrastructure.

The strategic logic is clear. Whoever controls the operating layer between compute, power, networking and workload orchestration will not just sell chips. They may define the economics of the inference economy.

Community data centres and the distributed intelligence transition

The more profound implication is not simply the emergence of hyperscale AI campuses. It is the transition toward geographically distributed inference infrastructure. The market is beginning to recognise that future AI capacity cannot rely exclusively on a handful of concentrated mega-facilities tethered to already constrained metropolitan grids. The economics, regulatory pressures and energy realities increasingly favour a hybrid architecture consisting of sovereign hyperscale nodes complemented by regional inference clusters, industrial micro data centres and energy-aware compute infrastructure embedded closer to enterprise, industry and community demand.

This is where Emerald AI’s strategic positioning becomes materially significant. The company is not merely attempting to improve data-centre efficiency. It is attempting to create a new operating layer between compute demand and energy availability. If successful, that orchestration layer could fundamentally alter how AI infrastructure is financed, approved and deployed globally.

For capital markets, this creates an entirely new investment category sitting at the intersection of infrastructure, energy, utilities and artificial intelligence. Historically, data centres were assessed predominantly through real estate, occupancy and connectivity metrics. The next generation may instead be valued through dynamic energy responsiveness, grid participation capability, co-located generation, storage integration and inference throughput efficiency.

For governments, particularly in Australia, the implications are equally strategic. Nations with abundant renewable resources, stable regulatory environments and sovereign land availability may increasingly compete not only for cloud infrastructure, but for distributed AI industrial ecosystems. The conversation is rapidly shifting from “where can data centres be built?” toward “which jurisdictions can sustain intelligent compute economies at scale without destabilising national energy systems?”

This introduces a new geopolitical layer to the AI race. Sovereign compute capacity, electricity resilience, transmission infrastructure, energy storage and localised inference capability are becoming intertwined elements of national competitiveness. In this emerging framework, AI infrastructure begins to resemble critical industrial infrastructure rather than merely digital infrastructure.

The result may ultimately be the decentralisation of intelligence itself. Smaller, strategically located inference hubs operating in coordination with regional grids, battery systems and renewable generation could become economically preferable to endlessly concentrating compute inside a limited number of hyperscale corridors. That possibility carries enormous implications for regional development, private capital formation, energy investment and the future structure of the AI economy itself.

Why this represents a structural turning point

What makes this moment consequential is that the industry is finally confronting the physical economics of artificial intelligence. For nearly a decade, the dominant narrative surrounding AI centred on models, software capability and computational scale. Yet the next phase of expansion is increasingly constrained not by algorithms, but by the availability, cost and orchestration of power.

This changes the investment thesis surrounding the entire sector.

The strategic value chain is broadening beyond semiconductors into the underlying systems that enable inference at industrial scale. Electricity access, transmission capacity, flexible load management, cooling systems, optical infrastructure, battery integration and energy-aware orchestration are now becoming foundational components of the AI stack itself.

NVIDIA appears acutely aware of this transition. Jensen Huang’s repeated references to “AI factories” are not rhetorical branding exercises. They reflect a deliberate reframing of artificial intelligence as industrial production infrastructure. In this framework, compute becomes analogous to manufacturing output. Electricity becomes feedstock. Inference becomes economic production.

That distinction is critical because it explains why NVIDIA is now extending influence upstream and downstream across the broader infrastructure ecosystem. Its involvement with fibre manufacturing, networking architecture, sovereign AI deployment models and energy orchestration suggests the company is positioning itself not simply as a chip supplier, but as a systems architect for the emerging inference economy.

Emerald AI’s importance therefore lies less in its current scale and more in the direction it signals. The company represents one of the earliest serious attempts to synchronise AI compute behaviour with real-world energy market dynamics. If that orchestration model matures, it could materially reduce deployment friction, accelerate time-to-power, improve infrastructure economics and unlock new classes of distributed AI deployment previously considered commercially impractical.

This is why the market response has been so significant. Investors increasingly understand that the next trillion-dollar layer of AI may not reside solely inside frontier models, but within the infrastructure systems that govern how intelligence is generated, distributed and sustained across economies.

The inference economy is beginning to move beyond software abstraction and into the realm of national industrial capability. Those who can successfully integrate compute, energy and infrastructure into a coherent operating system may ultimately define the next era of technological power.


Get the stories that matter to you.
Subscribe to Cyber News Centre and update your preferences to follow our Daily 4min Cyber Update, Innovative AI Startups, The AI Diplomat series, or the main Cyber News Centre newsletter — featuring in-depth analysis on major cyber incidents, tech breakthroughs, global policy, and AI developments.
