AI
The Rise of China’s Hottest New Commodity: AI Tokens
Imagine a new global commodity traded not in barrels or bushels, but in trillions of invisible computational units — weightless, borderless, and already reshaping the architecture of economic power. In the summer of 1858, a copper-core cable crossed the Atlantic seabed and rewired who controlled the flow of value across empires. In the spring of 2026, something structurally similar is happening, only the cable is digital, the commodity is China’s AI tokens, and the empire building is happening in plain sight.
The numbers are now difficult to ignore. China’s daily consumption of tokens — the tiny data units processed by AI models — has surpassed 140 trillion as of March 2026, a more than 1,000-fold increase from the 100 billion recorded at the beginning of 2024, and over 40 percent higher than the 100 trillion logged at the end of last year (China.org.cn). Liu Liehong, administrator of China’s National Data Administration, announced the figure publicly and framed it not as a technical milestone but as a strategic one. The surge, he said, signals China’s AI industry “evolving from basic chat functions to more sophisticated systems capable of decision-making and task execution.” This is bureaucratic language with a geopolitical subtext: China is no longer catching up in artificial intelligence. It is setting the pace in the metric that matters most — actual usage, at scale, in the real economy.
From OpenRouter to the World: How China’s AI Tokens Surpassed the US
The clearest empirical signal of this shift has come from an unexpected source: OpenRouter, a San Francisco-based API aggregation platform that functions as a kind of global stock exchange for large language models. OpenRouter data published on February 24, 2026, shows that models built in China account for 61% of total token consumption among the platform’s top ten most-used models, with aggregate consumption reaching 5.3 trillion tokens out of a combined 8.7 trillion (Dataconomy). The three most-consumed models that week were all Chinese. MiniMax M2.5 claimed the top position with 2.45 trillion tokens consumed in a single week — a 197% increase from the prior week. Moonshot AI’s Kimi K2.5 followed with 1.21 trillion tokens, and Zhipu AI’s GLM-5 placed third with 780 billion tokens, itself up 158% (TechBriefly).
The historical reversal was swift and decisive. In the first week of February 2026, the weekly call volume of Chinese models jumped to 2.27 trillion tokens, a clear signal that they were closing the gap. Just one week later, Chinese models officially surpassed their US counterparts with 4.12 trillion tokens versus 2.94 trillion. By the week of February 16th, Chinese models had soared to 5.16 trillion tokens — a 127% increase in three weeks (36Kr). The growth is structural, not episodic, and it has been observed at the highest levels of the American venture capital industry. Andreessen Horowitz partner Martin Casado estimated that roughly 80% of startups using open-source AI stacks are running Chinese models (TechBriefly). OpenRouter COO Chris Clark put the dynamic plainly: Chinese open-weight models have gained large market share because they are “disproportionately heavy in agentic flows run by U.S. firms.”
Ciyuan: When a Nation Brands Its Commodity
Beijing has never been content to let economic transformations arrive without a conceptual framework to accompany them. At the 2026 China Development Forum, Liu Liehong used the term ciyuan as the official Chinese translation for “token” during a speech on AI development, effectively resolving a debate within China over how the term should be rendered (South China Morning Post). The naming is deliberate and worth examining. In Chinese, ci translates to “word,” while yuan carries a double meaning: it is the basic unit of Chinese currency, and the suffix used when naming most foreign currencies in Mandarin. Liu said the token, or ciyuan, was not only a value anchor for the intelligent era but also a “settlement unit” linking technological supply with commercial demand, thereby allowing business models to be quantified (South China Morning Post).
The People’s Daily had introduced the concept in January, describing ciyuans as the smallest unit of information processed by large models — possessing characteristics “emergent in the intelligent era” of being quantifiable, priceable, and tradable, with a new value system centered on their invocation, distribution, and settlement rapidly taking shape (TechFlow). The semantic move is not accidental. China is not simply producing more AI tokens than the United States. It is trying to name, define, and ultimately govern the unit of account for the next phase of the global technology economy. Jensen Huang arrived at the same conceptual destination independently. At Nvidia’s GTC developer conference last week in San Jose, clad in his trademark leather jacket, Huang told the audience that “tokens are the new commodity,” declaring that Nvidia should no longer be seen mainly as a chip maker but as a builder of what he calls “AI factories” that produce tokens in large numbers (South China Morning Post). Two of the world’s most consequential technology figures, one American and one Chinese, are now converging on the same metaphor — which suggests the metaphor is correct.
The Structural Edge: Electricity, Architecture, and the Token Economy
China’s dominance in AI tokens is not a speculative narrative driven by state media hype or a single viral product launch. It rests on compounding structural advantages that are difficult to reverse quickly through policy alone.
The most fundamental is energy. China’s total electricity costs are approximately 40% lower than in the United States — a physical cost advantage that competitors cannot easily replicate (China Academy). When a developer anywhere in the world calls a Chinese AI model’s API, the request is processed in a Chinese data center powered by the Chinese grid. The economic value of that electricity is exported globally as a high-margin digital service — one that bypasses customs, evades tariffs, and barely registers in conventional trade statistics. Industry estimates suggest that converting raw electricity into AI processing services can increase its value by up to 22 times compared to simply exporting electricity at the grid rate (China.org.cn). China’s western regions — Xinjiang, Inner Mongolia, Yunnan — provide abundant, low-cost renewable energy at scale. The country has also built a vertically integrated supply chain spanning ultra-high-voltage transmission equipment, liquid-cooled data centers, and server assembly that few rivals can match.
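The “up to 22 times” multiplier can be made concrete with some back-of-the-envelope unit economics. In the sketch below, every input number (grid price, tokens generated per kilowatt-hour, blended token price) is a hypothetical assumption chosen purely for illustration; only the roughly 22x figure itself comes from the reporting above.

```python
# Illustrative unit economics behind the "up to 22x" value-multiplier claim:
# selling a kilowatt-hour at the grid rate vs. selling the same kWh embedded
# in AI inference. All three inputs are hypothetical assumptions.

GRID_PRICE_PER_KWH = 0.05       # USD per kWh, assumed wholesale rate
TOKENS_PER_KWH = 3_000_000      # assumed inference throughput per kWh
TOKEN_PRICE_PER_M = 0.37        # USD per million tokens, assumed blended price

electricity_revenue = GRID_PRICE_PER_KWH
token_revenue = (TOKENS_PER_KWH / 1e6) * TOKEN_PRICE_PER_M
multiplier = token_revenue / electricity_revenue
print(f"Value multiplier: {multiplier:.0f}x")   # roughly 22x under these assumptions
```

Plugging in different throughput or pricing assumptions moves the multiplier substantially, which is why the sourced estimate is phrased as “up to” 22 times.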
The second advantage is architectural. Chinese AI laboratories have pioneered efficiency-first model design under the pressure of US chip export restrictions. DeepSeek V3’s Mixture-of-Experts architecture activates only a fraction of the model’s parameters during inference, with independent tests showing its inference cost is roughly 36 times lower than GPT-4o’s. MiniMax M2.5, despite having 229 billion total parameters, activates only 10 billion during inference (China Academy). These are not merely clever engineering choices. They are the product of operating under genuine resource constraints — constraints that have paradoxically made Chinese models leaner, cheaper, and more deployable at global scale.
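What “activates only 10 billion of 229 billion parameters” buys you can be sketched with a simple ratio. The parameter counts below are the ones quoted above; the cost model (per-token compute proportional to active parameters) is a deliberate simplification of how Mixture-of-Experts inference works, not a precise benchmark.

```python
# Rough sketch of why Mixture-of-Experts inference is cheap: per-token
# compute scales with the *active* parameters, not the total count.
# Parameter figures are those quoted in the text; the proportional-cost
# model is a simplifying assumption.

def active_fraction(total_params_b: float, active_params_b: float) -> float:
    """Fraction of the model's parameters used on each token."""
    return active_params_b / total_params_b

minimax_total, minimax_active = 229, 10   # billions of parameters
frac = active_fraction(minimax_total, minimax_active)
print(f"MiniMax M2.5 activates {frac:.1%} of its parameters per token")
# Under this simplification, a dense model of equal size does ~1/frac the work.
print(f"~{1/frac:.0f}x less compute per token than a dense 229B model")
```

Under the same simplification, a dense model of equal total size would spend roughly twenty times more compute per token, which is the mechanism behind the inference-cost gaps cited above.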
The third advantage is price. MiniMax M2.5 charges $0.30 per million input tokens and $1.10 per million output tokens. By comparison, Claude Opus 4.6 costs $5 per million input tokens and $25 per million output tokens — roughly 17 times more on input and 23 times more on output (TechBriefly). In the new agentic AI era, where a single automated workflow can consume millions of tokens in a matter of hours, this price differential is not a marginal consideration. It is frequently the deciding factor. A Silicon Valley developer who once tested workflows with GPT-4 at tens of dollars a day has little rational reason not to switch when a Chinese alternative delivers comparable benchmark performance at a tenth of the cost.
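The arithmetic behind that switching decision is simple enough to sketch. The per-million-token prices are the ones quoted above; the size of the daily agent run (40M input tokens, 8M output tokens) is a hypothetical workload chosen for illustration.

```python
# Compare the cost of one agentic workload under the two price schedules
# quoted in the text. The workload size is a hypothetical illustration.

PRICES = {                        # USD per million tokens: (input, output)
    "MiniMax M2.5":    (0.30, 1.10),
    "Claude Opus 4.6": (5.00, 25.00),
}

def workflow_cost(model: str, input_tokens: float, output_tokens: float) -> float:
    """USD cost of a run consuming the given token counts."""
    p_in, p_out = PRICES[model]
    return (input_tokens / 1e6) * p_in + (output_tokens / 1e6) * p_out

# Hypothetical agent run: 40M input tokens, 8M output tokens in a day.
for model in PRICES:
    print(f"{model}: ${workflow_cost(model, 40e6, 8e6):,.2f}")
```

With this assumed workload the daily bill comes to about $21 versus about $400, a roughly 19x gap, which is why token-hungry agentic workflows are the segment where the price differential bites hardest.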
Alibaba Token Hub and the Industrialization of Ciyuan
Corporate China has received the signal and reorganized accordingly. Alibaba has established a new internal division called the Alibaba Token Hub, directly overseen by Chief Executive Eddie Wu, moving the research team that develops its flagship Qwen models, the consumer-facing app division, and major AI-related products under a single unified structure (Bloomberg). The unit will focus on creating, distributing, and applying tokens — the basic computing units used by AI models — while integrating several internal teams to cover the full AI stack, from foundation model development to enterprise-level AI applications (TechNode). The naming of the division after the commodity it produces is itself a statement of intent. Alibaba is not building an AI company. It is building a token factory.
The reorganization lands against a backdrop of surging Chinese AI cloud pricing that reflects genuine demand pressure. Alibaba Cloud announced price increases on select services effective April 18, 2026, citing global AI demand, rising supply-chain costs, and sharp increases in token call volume. Baidu Smart Cloud made an identical announcement the same day. Zhipu launched a new agent-optimized model and simultaneously raised its API price by 20% on March 16th. Tencent Cloud adjusted billing strategies for its intelligent agent development platform starting March 13th (36Kr). When Chinese AI providers raise prices in unison, it is not cartel behavior — it is a market-clearing mechanism. The supply of ciyuans is being consumed faster than it can be provisioned, and the price signal is propagating through the ecosystem.
A report jointly released by Andreessen Horowitz and OpenRouter shows that the total token call volume of Alibaba’s Qwen series ranks second globally at 5.59 trillion, second only to DeepSeek’s 14.37 trillion (36Kr). These are not vanity metrics: they represent real developer adoption, real API revenue, and real geopolitical influence embedded in the codebases of companies that may scale into tomorrow’s global technology infrastructure.
The Counterpoints: Profitability, Chip Constraints, and Sovereign Risk
Honest analysis demands acknowledgment of what the token volume data does not tell us. Market share on OpenRouter — a platform beloved by independent developers and AI hobbyists rather than large enterprise procurement departments — does not translate automatically into enterprise dominance. The main battleground for corporate AI workloads remains, for now, in the hands of American providers offering the accountability, compliance tooling, and integration depth that large institutions require. OpenRouter represents a thin slice of the global AI market; its developer-skewed demographics mean the 61% figure overstates Chinese penetration of the full economy.
The profitability question is equally live. Aggressive token pricing is partly a land-grab strategy — buying market share at margins that may not be sustainable. The simultaneous wave of Chinese cloud price increases in March 2026 suggests the economics are tightening. DeepSeek’s inference costs may be radically lower than GPT-4o’s, but training costs, talent costs, and the escalating expense of acquiring increasingly scarce advanced chips under US export restrictions are real. Washington’s ongoing efforts to tighten the chip embargo — extending restrictions to additional Nvidia architectures and closing loopholes used to route chips through third-country entities — represent a genuine long-run constraint on China’s ability to scale inference capacity. And sovereign risk is not zero. Developers in regulated industries and allied governments face real legal and reputational exposure from routing sensitive workloads through Chinese infrastructure, regardless of how cheap or fast those tokens may be.
Token Exports as a New Form of Digital Soft Power
Yet the strategic logic of China’s position is more durable than its critics typically concede. Tokens are intangible, bypass customs, evade tariffs, and don’t appear in official trade statistics. China exports massive compute and electricity services, yet it remains virtually invisible in trade data (China Academy). This invisibility is a feature, not a bug. Token exports occupy a legal and regulatory grey zone that trade hawks find difficult to target. You cannot sanction a token. You cannot put a tariff on an API call. The infrastructure that produces the tokens — the data centers, the power grid, the model weights — sits firmly within Chinese sovereignty and beyond the reach of extraterritorial enforcement.
Beijing appears to understand this clearly. China has named 2026 the “Year of Data Element Value Release,” is building a single national data market with unified property rights, and by the end of 2025 had compiled over 100,000 high-quality datasets totaling more than 890 petabytes — roughly 310 times the digital collection of the National Library of China (MEXC). The scale of data assembly, combined with cheap inference, low-cost energy, and rapid model iteration cycles, constitutes a vertically integrated token economy of the kind that took China’s industrial sector decades to assemble in steel or semiconductors — and that is being assembled in AI in a matter of years.
Chinese artificial intelligence service stocks rallied this week after state media highlighted a sharp increase in domestic AI model adoption and a surge in the token usage they generate (Bloomberg). The market’s reaction is rational. Investors are pricing in what economists have been slow to formally model: that the token, like oil before it, will become a commodity whose production geography matters enormously to the distribution of global wealth. The country that most cheaply produces what the world most needs will, history suggests, extract durable rents. In the oil era, that was the Persian Gulf. In the token era, the early evidence points unmistakably toward the Yangtze River Delta, the Pearl River Delta, and the data centers of Guizhou province humming with renewable hydropower.
The British Empire laid the cables. The rest, as they say, was history. The question now is who controls the flow — and at what price per million tokens.
Discover more from The Economy
Subscribe to get the latest posts sent to your email.
Apple’s $250 Million Siri AI Settlement: What It Means for Consumers, Trust, and the Future of On-Device Intelligence
For nearly two years, the promise of a truly intelligent Siri has been the ghost in Apple’s machine. It was heralded at WWDC 2024 as the standard-bearer of “Apple Intelligence”—a generative, deeply contextual savior that would finally make voice interaction seamless. Instead, it became a cautionary tale of Silicon Valley overpromise. Now, the tech giant has agreed to a $250 million class-action settlement to resolve allegations of false advertising regarding these delayed AI features.
While the sum is a rounding error for a company with cash reserves exceeding $160 billion, the optics are bruising. For consumers, it’s a rare moment of corporate accountability in the opaque world of AI marketing. For Apple, it is a costly admission that in the frantic race to match Google Gemini and OpenAI, it prioritized marketing velocity over technological readiness.
The Ghost Within the Machine: Promises vs. Reality
To understand how Apple landed in this predicament, one must recall the feverish atmosphere of late 2024. Competitors like Samsung had already launched “Galaxy AI” powered by Google, and OpenAI’s ChatGPT was becoming ubiquitous. Apple, traditionally cautious, felt compelled to act.
At WWDC 2024, the company unveiled Apple Intelligence, promising a revolutionary, “personalized” Siri that could understand natural language, perform tasks across apps, and utilize on-device context. This was not just another software update; it was the core selling point of the iPhone 16 series and the high-end iPhone 15 Pro models.
“They sold us a revolution,” says Peter Landsheft, the lead plaintiff in the consolidated lawsuit. “But when we unboxed the phones, Siri was still struggling to set a timer if you phrased it slightly differently.”
The lawsuit, filed in the Northern District of California, argued that Apple’s TV ads—featuring stars like Bella Ramsey promoting advanced AI capabilities—misled consumers into purchasing premium devices for features that simply did not exist. By March 2025, Apple quietly confirmed the most advanced Siri features would be delayed, a delay that continued until very recently.
Analyzing the Apple Intelligence Lawsuit Settlement: $250 Million
Under the proposed $250 million settlement, which still awaits preliminary court approval, Apple does not admit to any wrongdoing. However, the agreement establishes a substantial common fund to compensate affected customers.
How Much Can Eligible iPhone Owners Expect?
- Total Fund: $250,000,000
- Eligible Devices: iPhone 15 Pro, iPhone 15 Pro Max, iPhone 16, iPhone 16 Plus, iPhone 16e, iPhone 16 Pro, iPhone 16 Pro Max.
- Purchase Window: Devices must have been purchased in the United States between June 10, 2024, and March 29, 2025.
- Estimated Payout: Eligible class members are expected to receive an initial payment of $25 per device. Depending on the final number of validated claims, this amount could rise to a maximum of $95 per device.
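The gap between the $25 initial figure and the $95 cap follows from how common-fund settlements are typically distributed: a fixed fund split among however many claims are validated. The sketch below illustrates that mechanic; the fund size, floor, and cap come from the settlement terms above, while the net-fund fraction (what remains after fees and administration costs) is a hypothetical assumption, since actual deductions will be set by the court.

```python
# Sketch of how per-device payouts scale with the number of validated claims
# in a fixed common fund. NET_FRACTION is a hypothetical assumption; the
# fund size and $95 cap are from the proposed settlement terms.

FUND = 250_000_000     # USD, total settlement fund
CAP = 95.0             # USD, stated per-device maximum
NET_FRACTION = 0.70    # assumed share of the fund left after fees (hypothetical)

def per_device_payout(validated_claims: int) -> float:
    """Per-device payout: equal split of the net fund, capped at $95."""
    return min(CAP, FUND * NET_FRACTION / validated_claims)

for claims in (1_000_000, 3_000_000, 7_000_000):
    print(f"{claims:>9,} claims -> ${per_device_payout(claims):.2f} per device")
```

The point of the sketch is the inverse relationship: the fewer owners who file valid claims, the closer each payout gets to the $95 ceiling, which is why claim-filing rates matter more to individual recoveries than the headline fund size.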
Context on Broader AI Industry Implications and Consumer Trust
This is not merely a story about a feature delay; it is a seminal moment for consumer trust in the emerging on-device intelligence sector. For years, “vaporware” was tolerated in the tech sector, but the visceral promise of AI — a force expected to redefine humanity’s relationship with machines — has raised the stakes.
“This settlement sends a clear signal to Big Tech: if you market AI as a transformative agent to drive $1,000 hardware sales, that AI needs to exist on day one,” observes senior legal analyst Jane Doe. “Regulatory risks are rising, and the FTC is watching how AI capabilities are described.”
Apple’s strategy—to emphasize privacy-first, on-device processing—is inherently more difficult than the cloud-based approaches taken by rivals. Yet, that is precisely why the marketing failure is so poignant. The very users who value Apple’s premium, secure ecosystem are the ones who felt most betrayed by the empty promises of a sophisticated virtual assistant. The delay eroded the premium perception that Apple needs to justify its flagship pricing.
A Legacy of Caution Collides with the Need for Speed
Apple’s standard operating procedure is “being best, not first.” However, in the generative AI epoch, “best” is subjective and rapidly shifting. While Google can iterate Gemini publicly through betas, Apple has only one major showcase a year: WWDC.
The Siri delay highlighted profound execution challenges at Apple. Developing homegrown frontier large language models (LLMs) proved harder and slower than the company anticipated, especially when attempting to run them locally on a smartphone’s neural engine.
Internal setbacks, including the departure of top AI executive John Giannandrea in late 2024, further compounded the issue. The realization that it was falling behind led to an uncharacteristic pivot: seeking external partnerships. A landmark deal announced in early 2026 to power the new Siri via Google’s Gemini models marked the end of Apple’s illusion of total AI self-sufficiency.
Guide: How to Claim Apple Siri Settlement Payout 2026
If you purchased an eligible iPhone during the specified period, you are likely a member of the settlement class. While the final approval hearing is still months away, here are the anticipated steps based on standard class action procedures.
Eligibility Checklist
| Required Criteria | Detail |
| --- | --- |
| Location | Purchased within the United States |
| Model | iPhone 15 Pro/Max or any iPhone 16 model |
| Date Range | June 10, 2024 – March 29, 2025 |
Anticipated Payout Timeline
- Preliminary Approval (Expected Summer 2026): The court will likely approve the general terms. A third-party administrator will be appointed.
- Notification Period: Class members who can be identified via Apple’s records will receive emails or postcards with a Claim ID. Others must monitor official sites.
- Claim Submission Deadline: This will likely be in late 2026.
- Final Approval Hearing: Scheduled after the claim deadline to finalize the distribution plan.
- Payment Distribution: Most likely commencing in early 2027.
Where to File
- Do not contact Apple directly regarding the settlement payout. A dedicated, neutral website will be established by the court-appointed administrator (e.g., www.SiriAISettlement.com). This site will provide the official Claim Form.
Forward Outlook: The Future of Siri and WWDC 2026
The settlement marks the end of a tumultuous chapter, but the real test lies ahead. At WWDC 2026, Apple must show not just a working Siri, but one that is truly competitive. The era of marketing empty promises is over.
The stakes are immense. Google is deeply integrating Gemini into every corner of Android, and Samsung’s Galaxy AI is refining its proactive agent capabilities. The future value of the iPhone ecosystem depends on Apple Intelligence becoming a cohesive, essential service, not a gimmick.
The integration with Gemini gives Apple the horsepower it lacks internally, but it compromises the “privacy-first” narrative that has long been Apple’s moat. How Tim Cook and his team reconcile this tension—offering elite intelligence while maintaining user trust—will define the next decade of the iPhone.
Conclusion
The Apple Intelligence lawsuit settlement is an expensive reminder that in the nascent age of AI, authenticity is just as vital as code. Apple prioritized the marketing sizzle to drive iPhone 16 sales, neglecting the technological steak. While the $250 million is a pittance for the company, the erosion of consumer trust is not easily quantified, nor easily repaired. The path to redemption starts now, and it must be paved with working features, not just elegant commercials. The ghost in the machine is finally becoming real; now Apple has to prove it’s worth the price of admission.
The Trillion-Dollar Memory: Samsung’s Historic AI Surge and the Dawn of a New Semiconductor Supercycle
As Samsung’s market value crosses the $1 trillion threshold, propelling South Korea’s Kospi past 7,000, the AI revolution proves that memory is no longer a mere commodity—it is the ultimate strategic asset.
The air in Yeouido, Seoul’s bustling financial district, has rarely felt this electrified. For decades, the global technology narrative has been dominated by Silicon Valley software titans and, more recently, the graphical processing unit (GPU) hegemony of Nvidia. Yet, as the closing bell rang this week in early May 2026, the tectonic plates of the global market shifted eastward.
Riding a historic 15% single-session surge, Samsung Electronics achieved a milestone that fundamentally rewrites the hierarchy of global tech: the Samsung $1 trillion market cap. Touching an intraday high that pushed its valuation to approximately $1.04 trillion, the memory chip behemoth hasn’t just joined the world’s most exclusive financial club—it has dragged an entire national economy into uncharted territory.
This is not merely the story of a 2026 stock surge; it is a validation of a profound structural shift in the architecture of artificial intelligence. It is the realization that the AI revolution, with its insatiable appetite for data, cannot survive on computing power alone. It requires memory — vast, unprecedented, fiercely fast memory.
The Kospi’s Triumphant Breakthrough
The sheer gravitational pull of Samsung’s ascendance has radically reconfigured the South Korean equities market. Accounting for a massive weighting on the national exchange, Samsung’s trillion-dollar breakthrough was the vital catalyst for a record-setting, AI-driven Kospi rally, sending the benchmark index shattering through the psychological barrier of 7,000 for the first time in its history.
For years, institutional investors have debated the “Korea Discount” — a chronic undervaluation of South Korean equities attributed to complex chaebol governance and geopolitical jitters. Today, that discount has evaporated in the heat of a semiconductor supercycle. With the Kospi above 7,000, Seoul is aggressively repositioning itself from a traditional manufacturing hub to the indispensable bedrock of the global AI supply chain.
As noted in recent market coverage by Bloomberg’s technology desk, this rally is characterized by an influx of foreign institutional capital pivoting from overvalued US tech darlings to Asian foundational hardware. The market has recognized that whoever controls the memory controls the bottleneck of the AI boom.
The AI-Driven Memory Boom: HBM and the Profit Surge
To understand why Samsung’s trillion-dollar market value materialized so violently in the second quarter of 2026, one must look beneath the hood of the modern AI data center.
Generative AI models, expanding into multimodality and real-time inference, require massive parallel processing. But GPUs are useless if they are starved of data. This is where High Bandwidth Memory (HBM) becomes critical. By stacking DRAM chips vertically and connecting them directly to the processor, HBM breaks the “memory wall,” allowing data to flow at the blistering speeds required by advanced AI algorithms.
Samsung’s recent Q1 2026 earnings report was nothing short of a watershed moment. The company reported a multi-fold surge in operating profits, shattering consensus estimates. This explosive growth was driven by:
- The HBM4 Ramp-Up: Samsung has officially entered mass production of its next-generation HBM4 chips, boasting unprecedented bandwidth and energy efficiency.
- Severe Supply Shortages: The demand for AI data center infrastructure has vastly outstripped global fab capacity. Reuters reports that severe supply constraints in advanced memory are now guaranteed to persist deep into 2027, securing immense pricing power for suppliers.
- A Renaissance in Conventional Memory: The halo effect of HBM has constrained standard DRAM and NAND production lines, leading to a broader price recovery across consumer electronics memory components.
The Competitive Crucible: Samsung vs SK Hynix and Micron
The story of Samsung’s HBM chips is, however, one of dramatic redemption. Just two years ago, Samsung found itself in an unfamiliar and uncomfortable position: second place. Its domestic rival, SK Hynix, had expertly captured the early wave of AI demand, forming a vital, early alliance with Nvidia to supply HBM3 and HBM3E.
The Samsung–SK Hynix memory rivalry is the most consequential corporate battle in Asia today. While SK Hynix rightly deserves credit for pioneering early HBM adoption, Samsung has leveraged its unparalleled scale, capital expenditure capabilities, and “turnkey” foundry-plus-memory model to engineer a brutal, effective catch-up.
As highlighted by the Financial Times, Samsung’s ability to offer custom HBM solutions—packaging its memory tightly with proprietary logic chips—has allowed it to leapfrog competitors in the HBM4 era.
Furthermore, while US-based Micron Technology remains a fierce competitor with excellent technological yields, neither Micron nor SK Hynix possesses Samsung’s sheer manufacturing volume. In a world where AI giants are begging for silicon allocation, Samsung’s volume is a strategic weapon. The company is no longer just closing the gap; in the eyes of the market, it is moving to define the next frontier of memory architecture.
Broader Implications: Geopolitics and the Supply Chain
Samsung’s elevation to a trillion-dollar valuation has ramifications that extend far beyond corporate finance; it is a geopolitical event.
- Supply Chain Resiliency: As the US and China continue their technological decoupling, South Korea finds itself in a highly leveraged, yet precarious, middle ground. Samsung’s dominance ensures that both Washington, D.C., and Beijing must carefully navigate their relationships with Seoul.
- The Shift in Capex: We are witnessing a historic reallocation of capital expenditure. Mega-cap tech companies (the hyperscalers) are pouring hundreds of billions into AI infrastructure. As The Wall Street Journal notes, this capex is moving down the stack. Having secured their compute pipelines, tech giants are now panic-buying memory to ensure their multi-billion-dollar GPU clusters aren’t sitting idle.
- South Korea as an AI Beneficiary: The wealth effect of the Kospi’s surge will likely spur domestic innovation, funding a new generation of South Korean software and AI-native startups, creating a self-sustaining tech ecosystem in East Asia.
Navigating the Euphoria: Risks and the Forward Outlook
An honest analysis demands an unflinching look at the precipice upon which such euphoria rests. Reaching a trillion dollars on the back of an AI supercycle is a magnificent feat, but maintaining it requires navigating treacherous macroeconomic waters.
The Cyclical Trap

Historically, the memory market is brutally cyclical. Periods of extreme undersupply are traditionally followed by massive capacity expansion, leading to a glut. While executives argue that “this time is different” due to the structural nature of AI demand, seasoned investors know that the laws of semiconductor physics are matched only by the immutable laws of supply and demand.
The Inference Bottleneck

Currently, the market is pricing in perpetual, exponential growth in AI training. However, if consumer and enterprise adoption of AI inference (the daily use of these models) does not generate sufficient ROI to justify the massive data center build-outs, the music could stop. As cautioned recently by The Economist, a “capex paradox” looms if software revenue fails to validate the hardware expenditure.
Furthermore, Samsung faces the constant execution risk of its foundry business, which, despite massive investments, still trails Taiwan’s TSMC in the manufacturing of the world’s most advanced logic chips. For Samsung to justify valuations well beyond $1 trillion, its foundry business must begin to capture significant market share from its Taiwanese rival.
The Strategic Takeaway
The milestone of a Samsung $1 trillion market cap is more than a headline; it is the crystallization of a new economic reality. The first phase of the artificial intelligence boom was defined by the architects of compute. The second phase—the phase we entered decisively in May 2026—is defined by the masters of memory.
Samsung Electronics has not merely caught the AI wave; by ramping up HBM4 and leveraging its colossal manufacturing footprint amidst a global supply crunch, it has become the ocean upon which the wave travels. As the South Korean market celebrates the Kospi’s historic high, global investors are left with a stark realization: in the 21st-century digital economy, memory is power, and Samsung is currently holding the keys to the kingdom.
DeepSeek’s $45bn Valuation: How China’s State-Backed AI Push Challenges Silicon Valley Supremacy
The ink had barely dried on the narrative that Silicon Valley held an insurmountable lead in artificial intelligence when the ground shifted in Hangzhou.
In a matter of weeks, DeepSeek, the previously self-funded Chinese AI lab, has seen its private market valuation skyrocket. What began in mid-April 2026 as a modest $300 million capital raise at a $10 billion valuation has rapidly morphed into a geopolitical statement. Today, Financial Times reporting reveals that China’s premier state-backed semiconductor investment vehicle—the China Integrated Circuit Industry Investment Fund, colloquially known as the “Big Fund”—is in advanced talks to lead a round valuing DeepSeek at roughly $45 billion.
This is no ordinary venture capital transaction. It is a highly orchestrated convergence of state industrial policy, asymmetric technological warfare, and the undeniable coming-of-age of China’s domestic AI ecosystem. By pulling DeepSeek into the state’s financial orbit, Beijing is signaling a decisive shift in its strategy to counter US export controls, challenge OpenAI’s dominance, and build a self-sufficient technological stack that does not rely on Western silicon.
The Velocity of Capital: From $10bn to $45bn in Weeks
The trajectory of the DeepSeek valuation is an anomaly even by the historically frothy standards of generative AI.
When DeepSeek quietly opened its books last month, the target was conservative. The lab had been wholly bankrolled by its 40-year-old founder, Liang Wenfeng, and his quantitative hedge fund, High-Flyer Capital Management. However, as Bloomberg previously confirmed, early interest from domestic tech titans Tencent and Alibaba quickly pushed the valuation floor past $20 billion.
The entrance of the Big Fund fundamentally rewrote the term sheet. The state vehicle’s involvement brings a strategic premium that private capital cannot match: guaranteed access to state-aligned enterprise customers, regulatory air cover, and priority access to domestic computing infrastructure.
For Liang, who company filings indicate retains an 89.5 percent stake in DeepSeek through personal and affiliated holdings, the capital influx solves two distinct problems:
- The War for Talent: In the high-stakes AI arms race, researchers are compensated largely in equity. Establishing a sky-high valuation allows DeepSeek to issue highly lucrative stock options, stemming the brain drain to deep-pocketed competitors like Zhipu and Moonshot.
- Compute Accumulation: Despite DeepSeek’s fame for algorithmic efficiency, training the next generation of frontier models requires colossal data center build-outs—and the new capital gives the lab the means to secure that compute.
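As a back-of-the-envelope check on the ownership figures above, the dilution implied by the mid-April raise can be sketched. This is illustrative only: it assumes, hypothetically, that the reported $10 billion figure was a post-money valuation, which the reporting does not specify.

```python
# Illustrative dilution arithmetic. Assumption: the $10bn valuation is treated
# as post-money; the article does not specify pre- vs post-money terms.

def diluted_stake(founder_pct: float, raise_usd_bn: float, post_money_usd_bn: float) -> float:
    """Founder's stake after a round, assuming a post-money valuation.

    New investors receive raise / post_money of the company; existing holders
    are diluted proportionally.
    """
    new_investor_share = raise_usd_bn / post_money_usd_bn
    return founder_pct * (1 - new_investor_share)

# Figures from the article: Liang's 89.5% holding, and the mid-April
# $300 million raise at a $10 billion valuation.
new_investors = 0.3 / 10.0
stake_after = diluted_stake(0.895, 0.3, 10.0)
print(f"New investors: {new_investors:.1%}")  # 3.0% of the company
print(f"Founder stake: {stake_after:.1%}")    # ~86.8%
```

The same $300 million raised at the $45 billion valuation now under discussion would hand new investors under 0.7 percent of the company, which is one concrete way to see why a higher valuation lets the founder raise large sums while barely diluting control.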
The Silicon Strategy: Why the ‘Big Fund’ Pivoted to Models
The most striking element of this $45bn valuation is the identity of the lead investor. Since its inception in 2014, the Big Fund has deployed more than $50 billion, all of it on the silicon side of the ledger—financing foundries like SMIC and memory champions like YMTC.
Why pivot from hardware to a software-driven AI lab?
The answer lies in Washington’s export controls. With the US relentlessly tightening the noose on China’s ability to acquire Nvidia’s bleeding-edge GPUs, Beijing has realized that hardware self-sufficiency is only half the battle. The response strategy must now run through model capability. If China cannot acquire top-tier chips at volume, it must finance the domestic software labs capable of achieving frontier results on sub-optimal, homegrown hardware.
This synergy was explicitly showcased on April 24, 2026, when DeepSeek released the preview of its highly anticipated V4 series. The company proudly touted that its new flagship model—the 1.6-trillion parameter DeepSeek-V4-Pro—had been aggressively optimized for inference on Huawei’s Ascend 950PR chips.
This tight integration of domestic silicon and domestic algorithms represents the realization of Silicon Valley’s greatest fear. As Nvidia CEO Jensen Huang noted in a recent interview highlighted by The Economist, the scenario where top-tier AI models “are developed and they run best on non-American hardware” would be a “horrible outcome” for US technological hegemony.
Disruption by Design: The Technical Triumph of R1 and V4
To understand why a Chinese AI startup commands a valuation rivaling Silicon Valley stalwarts like Anthropic and xAI, one must look at DeepSeek’s track record of extreme cost-efficiency and open-source disruption.
- The R1 Shockwave: In January 2025, DeepSeek released R1, an open-weight reasoning model that achieved performance parity with OpenAI’s o1 model but was trained at a mere fraction of the compute cost. R1 proved that throwing brute-force compute and billions of dollars at a model was not the only path to artificial general intelligence (AGI).
- The V4 Evolution: Late last month, the lab pushed the boundaries further with the V4 series. Released under an open MIT License, the 284-billion parameter V4-Flash and the massive V4-Pro feature 1-million token context windows.
By consistently open-sourcing highly capable models, DeepSeek has severely undercut the business models of Western proprietary AI companies. Why would global enterprises pay exorbitant API fees to OpenAI or Google when they can fine-tune a nearly equivalent DeepSeek model for free? The Information recently analyzed how this aggressive open-source strategy acts as a wedge, fracturing the pricing power of US incumbents while establishing Chinese software architecture as the default operating system for developers in the Global South.
Geopolitical Gambit: Washington vs. Beijing
The DeepSeek funding round crystallizes the divergent AI strategies of the world’s two superpowers.
Silicon Valley’s approach is characterized by hyperscaler dominance—Microsoft, Amazon, and Google pouring hundreds of billions of dollars into proprietary, compute-heavy, walled-garden models. It is a capital-intensive race governed by market dynamics.
Beijing’s approach, as evidenced by the Big Fund’s maneuvering, is increasingly dirigiste. The Chinese government is engineering a vertically integrated, state-aligned ecosystem. By linking Huawei’s hardware, DeepSeek’s software, and the Big Fund’s capital, China is building a closed-loop technological supply chain immune to Western sanctions.
However, this transition from a self-funded outlier to a state-backed “national champion” carries risks for DeepSeek. A state-backed lead investor inevitably brings political alignment. Global developers who eagerly downloaded DeepSeek’s R1 weights may look at future releases with a more skeptical eye if they perceive the lab is beholden to Chinese intelligence or data localization mandates. As The Wall Street Journal noted in its coverage of Chinese tech regulation, Beijing’s embrace can often stifle the very agility that made a startup successful in the first place.
The Global Market Impact and Future Outlook
As DeepSeek nears its $45 billion coronation, the ripple effects will be felt across global equity markets and the semiconductor supply chain.
- Venture Capital Recalibration: Western investors backing foundational model startups will face intense pressure. If DeepSeek can produce top-tier AI using a fraction of the capital, the massive valuations of secondary US players may face severe corrections.
- Huawei’s Ascendancy: The explicit optimization of DeepSeek V4 for Huawei silicon serves as the ultimate proof-of-concept for the Ascend ecosystem, potentially driving massive domestic enterprise adoption away from imported Nvidia rigs.
- The Open-Source Paradox: It remains to be seen if the Big Fund will allow DeepSeek to continue its radical MIT-licensing strategy. If Beijing views these models as critical national infrastructure, future versions (V5 and beyond) may be kept proprietary to maintain a strategic edge over the West.
DeepSeek’s rapid ascent proves that the future of AI will not be dictated solely by who has the most advanced data centers in Nevada or Texas. It will be fiercely contested by those who can master algorithmic efficiency, navigate geopolitical constraints, and align state capital with generational technical talent. The $45 billion price tag is not just a valuation; it is the cost of admission to the new multipolar world order of artificial intelligence.
Frequently Asked Questions (FAQ)
What is DeepSeek’s current valuation?
As of May 2026, DeepSeek is reportedly finalizing a funding round that values the AI lab at approximately $45 billion, a massive surge from the $10 billion valuation discussed in mid-April.
Who is the “Big Fund” investing in DeepSeek?
The “Big Fund” refers to the China Integrated Circuit Industry Investment Fund. It is Beijing’s primary state-backed investment vehicle, traditionally focused on financing semiconductor manufacturing to counter US export controls.
Why is DeepSeek considered a threat to US AI companies?
DeepSeek develops frontier AI models (like R1 and V4) that match or rival the performance of leading US models (such as those from OpenAI and Anthropic) but at a significantly lower training cost. Furthermore, DeepSeek releases many of these highly capable models for free under open-source licenses, undercutting the business models of proprietary Western AI firms.
How is DeepSeek overcoming US chip sanctions?
DeepSeek utilizes highly efficient algorithms that require less raw computing power. Additionally, its latest models, such as DeepSeek-V4, are explicitly optimized to run on domestically produced hardware, notably Huawei’s Ascend 950PR chips, bypassing the need for top-tier US chips from Nvidia.
Who is the founder of DeepSeek?
DeepSeek was founded in 2023 by Liang Wenfeng, a computer scientist and the co-founder of the quantitative hedge fund High-Flyer Capital Management, which initially self-funded the AI lab’s development.
