OpenAI's Disappointing Earnings Trigger Global Reassessment of AI Infrastructure Investment Logic

TubeX Research
4/28/2026, 11:01:08 AM

OpenAI’s Underwhelming Financial Performance and User Growth Spark Reassessment of AI Infrastructure Investment Logic

Recently, OpenAI officially confirmed that it missed both its FY2025 revenue target and its milestone of “1 billion weekly active users.” In an internal memo, CFO Sarah Friar stated candidly: “The company is not yet ready to go public,” and warned that large-scale compute procurement contracts may not be fulfilled on schedule. This rare transparency has, for the first time, systematically stripped away the “glamour filter” that long obscured the commercialization of large language models (LLMs), spotlighting contradictions the industry has long avoided: weak user stickiness, unclear monetization pathways, and unsustainable capital expenditure. The spillover effects are rapidly reshaping valuation frameworks across the global AI supply chain, putting significant downward pressure on AI-related stocks in both Hong Kong and U.S. markets and dampening investor sentiment toward domestic AI infrastructure themes.

Commercialization Bottleneck: The Chasm Between “Technical Leadership” and “User Retention”

OpenAI had targeted over $5 billion in annual revenue for FY2025, underpinned by three monetization pillars: ChatGPT Plus subscriptions, enterprise API usage, and B2B integrations. Reality, however, tells a different story:

  • Average daily usage time among free users has declined from a peak of 12.7 minutes in 2023 to under 6 minutes;
  • Paid conversion rate for ChatGPT Plus remains stuck at ~4.2%, far below the initial projection of 12%;
  • Quarterly growth in enterprise API call volume has contracted for two consecutive quarters, with major clients—including Salesforce and Shopify—accelerating development of in-house alternatives.

More critically, behavioral data reveals that over 68% of active users engage fewer than three times per month, and 72% of all user sessions are concentrated in low-value use cases—information retrieval and basic text generation. This exposes a stark reality: today’s LLMs remain trapped beneath the ceiling of “tool-like applications,” lacking high-frequency, mission-critical, or emotionally resonant “killer use cases.” As the initial wave of technological awe recedes, users revert to rationality—opting for free tiers when paid offerings deliver no irreplaceable value.

The Capital Expenditure Paradox: The Arms Race Hits an ROI Cliff

To support model iteration and service scaling, OpenAI has signed GPU procurement contracts totaling over $12 billion in the past 18 months—primarily for NVIDIA H100/B100 clusters and custom liquid-cooled data centers. Yet the CFO’s warning about “fulfillment risk” cuts to the heart of the matter: annual per-GPU training cost reductions average only 8%, while model parameter counts surge 220% year-on-year—causing commercial value per unit of compute (revenue per PFLOPS) to collapse steadily. Third-party estimates indicate that a single GPT-5 training run now consumes over $28 million in electricity alone—whereas equivalent compute investment in cloud computing or game rendering would generate more than triple the cash flow. This “the more we invest, the deeper we lose” negative feedback loop is forcing markets to fundamentally re-examine the logic underpinning AI infrastructure investment: When model improvements no longer linearly drive revenue growth, are trillion-dollar compute investments merely building a new “digital dam”—a massive, underutilized reservoir of expensive capacity?
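The “revenue per unit of compute” squeeze described above can be illustrated with a toy trajectory. This is a minimal sketch, not OpenAI’s actual economics: the 8% annual cost decline and 220% parameter growth are figures cited in the text, while the 30% revenue growth rate and the scaling assumption (compute demand tracking parameter count) are hypothetical placeholders.

```python
# Toy model of the compute-unit economics squeeze: revenue earned per
# normalized unit of training-compute spend, year over year. The 8% cost
# decline and 220% parameter growth come from the article; the revenue
# figures and growth rate are hypothetical.

def roi_trajectory(revenue, rev_growth, params_growth, cost_decline, years):
    """Yearly revenue per dollar of training-compute spend, assuming
    compute demand scales with parameter count and
    spend = compute consumed * cost per unit of compute."""
    compute, unit_cost = 1.0, 1.0  # normalized to year 0
    out = []
    for _ in range(years):
        out.append(revenue / (compute * unit_cost))
        revenue *= 1 + rev_growth      # hypothetical 30% revenue growth
        compute *= 1 + params_growth   # 220% YoY parameter surge
        unit_cost *= 1 - cost_decline  # 8% annual per-unit cost reduction
    return out

if __name__ == "__main__":
    for year, roi in enumerate(roi_trajectory(5e9, 0.30, 2.20, 0.08, 4)):
        print(f"year {year}: {roi:,.2f} revenue per unit of compute spend")
```

Even with an optimistic 30% revenue growth assumption, the ratio falls by roughly 2.3x per year under these inputs, which is the “the more we invest, the deeper we lose” loop expressed in numbers.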

Ripple Effects Across the Supply Chain: Valuation Recalibration from Chips to Applications

OpenAI’s signal has triggered cross-market chain reactions:

  • AI chip sector hit hardest: NVIDIA’s stock fell 8.3% in one week; analysts lowered their forecast for its 2025 data-center business growth to 25% (from an earlier 42%). Domestic GPU makers such as Cambricon saw their Hong Kong-listed shares slump 15%, as investors fretted over slowing procurement momentum from LLM vendors.
  • Cloud service providers under pressure: Microsoft Azure AI services’ gross margin reportedly dropped to 31%—well below the overall cloud division’s 48%. Alibaba Cloud and Huawei Cloud have likewise suspended tenders for certain large-model–dedicated compute clusters.
  • AIGC application layer faces a crisis of credibility: the average price-to-sales (P/S) ratio of Hong Kong–listed SaaS stocks has contracted sharply, from 22x at the start of the year to just 14x, as investors question whether the “AI+” narrative is backed by real paying customers: education-focused apps report monthly active user (MAU) retention rates below 19%, while marketing tools see enterprise renewal rates fall below 65%.
  • Policy-side cooling: three provincial governments have paused Phase II construction of intelligent computing centers under China’s “East Data, West Computing” initiative, citing fiscal constraints and persistently low utilization rates of subsidized compute resources.

Investment Logic Reforged: From “Hoarding Compute” to “Cultivating Use Cases”

This correction does not negate AI’s long-term value; rather, it forces the industry into a phase of pragmatic maturity. Future investment themes will shift along three dimensions:
First, valuation anchors will pivot from “parameter count” to “compute-unit economics.” Markets will increasingly focus on hard metrics: model compression efficiency (e.g., QLoRA fine-tuning throughput), inference energy efficiency (watts per token), and enterprise customer lifetime value relative to acquisition cost (LTV/CAC).
Second, infrastructure investment priorities will shift from “general-purpose compute” to “vertical-specific optimization.” Demand for domain-optimized chips—such as medical imaging analysis or industrial defect detection—will accelerate. Energy-efficiency advantages of chips like Cambricon’s MLU590 or Biren’s BR100 in specific workloads may regain pricing power.
Third, application-layer value discovery will refocus on “verifiable ROI.” Policy-backed, compliance-heavy, high-willingness-to-pay verticals—including government AI, financial risk control, and legal document automation—will emerge as safe havens for capital.
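The screening logic implied by these three dimensions can be sketched as a simple filter. The metric names (LTV/CAC, inference energy efficiency) come from the text; the example vendors, the 3x LTV/CAC hurdle, and the energy cap are all hypothetical values chosen for illustration.

```python
# Hedged sketch of a "compute-unit economics" screen. LTV/CAC and
# inference energy per token are the metrics named in the text; both
# thresholds and all vendor figures below are hypothetical.
from dataclasses import dataclass

@dataclass
class VendorMetrics:
    name: str
    ltv: float               # expected customer lifetime value, USD
    cac: float               # customer acquisition cost, USD
    joules_per_token: float  # inference energy per generated token

    @property
    def ltv_cac(self) -> float:
        return self.ltv / self.cac

def screen(vendors, min_ltv_cac=3.0, max_joules_per_token=0.5):
    """Keep only vendors whose unit economics clear both hurdles."""
    return [v.name for v in vendors
            if v.ltv_cac >= min_ltv_cac
            and v.joules_per_token <= max_joules_per_token]

# Hypothetical example data, for illustration only.
candidates = [
    VendorMetrics("vertical-legal-ai", ltv=90_000, cac=20_000, joules_per_token=0.3),
    VendorMetrics("general-chatbot",   ltv=40,     cac=25,     joules_per_token=0.9),
]
print(screen(candidates))  # only the vertical vendor clears both hurdles
```

The design choice mirrors the argument above: a compliance-heavy vertical vendor with a high LTV/CAC ratio passes the screen, while a low-retention general-purpose tool fails on both unit economics and energy efficiency.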

OpenAI’s current setback is, in fact, the AI industry’s coming-of-age ceremony. When the noise fades, true competitive moats have never resided in how high one can stack chips, but in the depth and precision with which real-world problems are solved. For Chinese investors, chasing conceptual hype is less rewarding than rigorously mapping opportunities across a three-dimensional coordinate system: “certainty of domestic substitution + penetration rate in vertical scenarios + health of cash flow.” After all, in an era where technological faith is being rigorously tested by commercial reality, a company that stays alive will always be worth more than a perfect model that cannot pay its bills.

