The $700 Billion Reckoning: Why Tech Titans Are Betting the Future on AI—And Borrowing Like Sovereigns to Do It
The AI race in 2026 has shifted from "who has the smartest model" to "who can afford the power and capital to run them at scale." When Google issues century bonds and Musk eyes orbital data centres, the $700 billion question is whether anyone can sustain this pace.
When even the world's richest companies need century bonds to fuel their AI ambitions, the race has entered uncharted territory.
The AI race in February 2026 has decisively shifted from "who has the smartest model" to "who can afford the power, chips and capital to run them at planetary scale." And the answer is becoming uncomfortably clear: perhaps no one can, at least not from operating cash flow alone.
Across the top four hyperscalers (Microsoft, Google, Amazon and Meta) the numbers are staggering. These tech titans have collectively pledged roughly US$700 billion in AI-related capital expenditure for fiscal 2026, a 60 per cent jump on last year. To put this in perspective, that's larger than the entire federal budget of most mid-tier countries. This isn't just a Big Tech story; it's an allocation of capital on the scale of national industrial policy, executed by four corporate boards that move faster than parliaments.
The investment breakdown tells a story of escalating commitment. Amazon is deploying roughly $200 billion, a 56 per cent year-on-year increase, much of it earmarked for AWS infrastructure and custom silicon.
On 1,200 acres in Indiana, Amazon's biggest AI data centre is now operational, with half a million AWS Trainium2 chips devoted entirely to powering OpenAI rival Anthropic. Just over a year ago, the whole site was nothing but dirt and cornfields. Seven buildings are operating now, and once complete, the site will have around 30 buildings and consume some 2.2 gigawatts of power. Source: YouTube / CNBC
Alphabet is channelling $175 billion to $185 billion into servers, data centres and frontier model development at DeepMind, with roughly 60 per cent going into servers and 40 per cent into data centres and networking.
Microsoft is accelerating Azure AI outlays, including a 1-gigawatt "super factory" campus and its Maia 200 accelerators. Meta continues its aggressive ramp-up tied to the Llama roadmap and bespoke compute projects, driving 24 per cent year-on-year ad revenue growth with AI-driven ranking and creative tools doing visible work in the background.
This spending is front-loaded into short-lived assets: GPUs, custom accelerators, liquid-cooled racks and high-bandwidth networking whose depreciation clocks start ticking the moment they're powered on. The financial bet hinges on a two-phase cycle. Today's capital-intensive "training build-out" must eventually give way to profitable "inference scale-out" when models serve billions of queries at sustainable margins.
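A toy model makes this tension concrete. In the sketch below, the $700 billion aggregate is taken from the figures above; the four-year useful life, the starting inference revenue and the growth rate are purely illustrative assumptions, not company disclosures:

```python
# Toy model: does ramping inference revenue outrun accelerator depreciation?
# CAPEX_BN comes from the article; every other number is an assumption.

CAPEX_BN = 700.0                 # aggregate fiscal-2026 AI capex
USEFUL_LIFE_YEARS = 4            # assumed accelerator depreciation schedule
ANNUAL_DEPRECIATION_BN = CAPEX_BN / USEFUL_LIFE_YEARS  # straight-line charge

def inference_revenue_bn(year: int, start_bn: float = 50.0, growth: float = 0.6) -> float:
    """Assumed inference revenue ramp: starts at start_bn, compounds at `growth` per year."""
    return start_bn * (1 + growth) ** year

for year in range(6):
    rev = inference_revenue_bn(year)
    status = "covers depreciation" if rev >= ANNUAL_DEPRECIATION_BN else "shortfall"
    print(f"Year {year}: ${rev:,.0f}bn revenue vs ${ANNUAL_DEPRECIATION_BN:,.0f}bn depreciation ({status})")
```

Under these assumptions, inference revenue only overtakes the $175 billion annual depreciation charge in year three, by which point the first cohort of accelerators is almost fully written off. That is the shape of the bet, whatever the true inputs turn out to be.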
When Cash Flow Meets Century Bonds
The scale of investment is pushing even cash-rich giants towards unconventional financing. Google-parent Alphabet recently issued a rare 100-year bond as part of a multi-billion-dollar borrowing spree, raising almost $32 billion in debt in less than 24 hours.
Yahoo Markets and Data Editor Jared Blikre explains what the bond is and what investors need to know about it. Source: YouTube / Yahoo Finance
While pension funds eagerly snapped up the century paper to match their long-term liabilities, market strategists voiced concern. Bill Blain of Wind Shift Capital called the "AI hyperscaler debt-fest" a sign of late-cycle frothiness, warning that while brilliantly executed, it could signal a market top.
You don't term out your capital structure for a century unless you believe the current phase is both unavoidable and uncomfortably expensive. The turn to long-term debt instruments highlights a core tension: not even the immense operating cash flows of the world's most profitable companies can keep pace with the velocity and scale of AI investment required to remain competitive.
The market debate is not about whether AI creates value, but about timing, visibility and unit economics. All four companies are front-loading multi-year investments in training and inference capacity. Payback hinges on converting model innovation into high-margin software subscriptions, cloud consumption, advertising uplift and productivity gains faster than depreciation and operating cost headwinds erode margins.
Yet the revenue lines, while impressive, lag the rhetoric. Amazon's Bedrock platform is a multi-billion-dollar run-rate business, but still a modest annuity against deployed capital. Google touts a $625 billion cloud backlog and processes over 10 billion tokens per minute through Gemini. Microsoft points to 50 per cent throughput gains in OpenAI inference workloads thanks to Maia 200 and insists short-lived GPU investments are already largely contracted.
None of this answers the central question: can AI services grow fast enough, at high enough margins, to keep pace with depreciation schedules measured in quarters rather than decades?
From Model Benchmarks to Inference Economics
What investors are really voting on now is not which lab has the cleverest language model, but whose inference economics scale. Tokens-per-watt and cost-per-query have become board-level metrics. The engineering challenge remains vast, but the financial one is now equally daunting, and it will shape not only who wins the AI race but which balance sheets, and which nations' credit markets, are ultimately drafted into the fight.
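To see why cost-per-query has become a board-level metric, here is a back-of-envelope cost-per-token calculation. Every input below (rack power draw, electricity price, rack capex, amortisation period, throughput) is an assumed illustrative figure, not a disclosed number from any of these companies:

```python
# Back-of-envelope cost per million tokens for an inference fleet.
# All inputs are illustrative assumptions.

RACK_POWER_KW = 120                 # assumed draw of one liquid-cooled rack
ELECTRICITY_USD_PER_KWH = 0.08      # assumed industrial power price
RACK_CAPEX_USD = 3_000_000          # assumed hardware cost per rack
AMORTISATION_HOURS = 4 * 365 * 24   # assumed four-year straight-line life
TOKENS_PER_SECOND = 400_000         # assumed rack-level serving throughput

def usd_per_million_tokens() -> float:
    power_cost_per_hour = RACK_POWER_KW * ELECTRICITY_USD_PER_KWH
    capex_cost_per_hour = RACK_CAPEX_USD / AMORTISATION_HOURS
    tokens_per_hour = TOKENS_PER_SECOND * 3600
    return (power_cost_per_hour + capex_cost_per_hour) / tokens_per_hour * 1_000_000

print(f"${usd_per_million_tokens():.3f} per million tokens")
```

Under these assumptions, amortised hardware, not electricity, accounts for roughly ninety per cent of the cost per token, which is why depreciation schedules rather than power bills dominate the margin debate.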
In 2025, the AI narrative was about breakthrough demos and benchmark scores. In 2026, it's about who can carry a $700 billion capex burden without tripping covenants or exhausting investor patience. The race now resembles a contest between quasi-sovereigns, consuming power and capital at rates that stress local grids and global bond markets alike.
Supply of top-end accelerators remains constrained, pushing hyperscalers to build their own chips and fight for every marginal gigawatt. Amazon says every unit of AI capacity is "immediately useful" and is scrambling to add gigawatts of power by 2027. Meanwhile, a new generation of startups (frontier labs like Anthropic and "frontier-adjacent" players building agents, orchestration layers and domain-specific models) must compete for the same GPUs, the same power and the same wary pool of investors.
The Final Frontier: When Orbit Becomes Economics
Yet even as terrestrial data centres strain power grids and credit markets, an entirely different proposition is emerging. This month, Elon Musk announced the merger of xAI and SpaceX, a combination with trillion-dollar prospects that sent the internet into overdrive. The move drew widespread, if somewhat contrarian, approval.
In this 46-minute internal presentation, xAI marks 30 months of progress since its founding in July 2023. The team shares detailed updates on AI model training, large-scale infrastructure, API development, data systems, forecasting tools, and the rapid expansion of Grok. Source: YouTube
The narrative was clear: if anyone must pursue space-based AI infrastructure, it's Musk. Even Google's Sundar Pichai nodded towards orbital ambitions in a CNBC interview late last year, just as markets peaked before the Northern Hemisphere winter correction.
The logic is compelling, if audacious. Orbit offers effectively unlimited solar power, passive radiative cooling with no water or chillers (though large radiator panels are needed, since a vacuum carries no heat away by convection) and the ability to deploy compute at scales that would be impossible on Earth without building dedicated power plants.
Musk's vision of processing AI workloads in orbit and beaming down results could theoretically slash the cost per token by orders of magnitude, fundamentally disrupting the economics that currently justify $700 billion in terrestrial capex.
The implications are staggering. If orbital data centres prove viable, they could undercut the entire premise of ground-based AI infrastructure investment, turning today's multi-billion-dollar campus builds into stranded assets.
The hyperscalers are betting that inference demand will materialise fast enough to justify their Earth-bound investments.
In a sector where today’s breakthrough is tomorrow’s baseline, Elon Musk’s latest comments underline a defining reality of the AI race: speed is strategy. As xAI pushes through scaling constraints and deepens technical alignment with SpaceX, the company is signalling that competitive advantage will be built less on current capability and more on execution tempo.
The long-term framing — from advanced AI integration to conceptual off-world infrastructure — reflects a mindset that treats AI not as a product cycle, but as a civilisational platform shift.
As Musk put it during the xAI 30-month progress meeting:
“So it’s really an incredible amount of work in a very short period of time and it's important to consider for competitiveness of any technology company what matters is not the position at any point in time but what is your velocity and acceleration — and if you're moving faster than anyone else in any given technology arena you will be the leader.”
Musk is betting that the laws of physics and orbital mechanics offer a better path to profitable AI at scale. But space deployment faces its own brutal economics: launch costs, radiation hardening, latency for real-time applications and the challenge of maintaining and upgrading hardware in orbit. The capital required to prove the concept may rival the hyperscalers' own spending, and the timeline extends beyond even Google's century bonds.
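A rough break-even sketch shows why the launch bill dominates those economics. Every number below (launch cost per kilogram, satellite mass per kilowatt of compute, grid power price, duty cycle, lifetime) is an illustrative assumption, not a figure from xAI, SpaceX or this article:

```python
# Rough break-even: launch premium vs avoided terrestrial power cost, per kW of compute.
# All inputs are illustrative assumptions.

LAUNCH_USD_PER_KG = 1_500        # assumed Starship-class launch cost per kg
SAT_MASS_KG_PER_KW = 10          # assumed satellite mass per kW of compute
TERRESTRIAL_USD_PER_KWH = 0.08   # assumed grid power price the satellite avoids
SOLAR_DUTY_CYCLE = 0.99          # assumed near-continuous sunlight in orbit
LIFETIME_YEARS = 5               # assumed satellite service life

def launch_premium_per_kw() -> float:
    """One-off cost of lifting a kilowatt of compute into orbit."""
    return LAUNCH_USD_PER_KG * SAT_MASS_KG_PER_KW

def avoided_power_cost_per_kw() -> float:
    """Grid electricity a kilowatt of orbital compute avoids over its lifetime."""
    hours = LIFETIME_YEARS * 365 * 24 * SOLAR_DUTY_CYCLE
    return hours * TERRESTRIAL_USD_PER_KWH

print(f"Launch premium ${launch_premium_per_kw():,.0f}/kW vs "
      f"avoided power ${avoided_power_cost_per_kw():,.0f}/kW over {LIFETIME_YEARS} years")
```

Under these assumptions the launch premium is roughly four times the power saving, so the orbital case rests on launch costs and satellite mass per kilowatt falling far faster than terrestrial power prices rise.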
What's clear is that 2026 has delivered another leap in the AI race, not just in model capabilities but in the sheer audacity of the infrastructure bets being placed. Energy costs continue to surge as demand for AI compute grows exponentially. Power grids are being strained to breaking point. And now, the prospect of moving the entire stack into orbit adds another layer of complexity to an already dizzying financial equation.
The $700 billion question isn't just when terrestrial AI investments will pay off but whether the ground beneath them will shift entirely. Financial endurance, technological superiority and now orbital ambition are converging into a race that transcends conventional business strategy.
We're watching something closer to a new space race, funded by corporate treasuries and bond markets, with stakes that dwarf anything seen in the technology sector's history.
The only certainty? The costs will keep climbing, the bets will keep escalating and the winners will be those who can endure the burn rate long enough to reach escape velocity, whether that's measured in margins or in miles above sea level.
Get the stories that matter to you. Subscribe to Cyber News Centre and update your preferences to follow our Daily 4min Cyber Update, Innovative AI Startups, The AI Diplomat series, or the main Cyber News Centre newsletter — featuring in-depth analysis on major cyber incidents, tech breakthroughs, global policy, and AI developments.