The artificial intelligence market is entering a phase of critical revaluation. After a period of unrestrained euphoria — during which investors reflexively bought any asset associated with the “AI” label — Wall Street is beginning to experience more pragmatic anxieties. Prices for the shares of the main beneficiaries of this technological boom show signs of exhaustion. Simultaneously, massive capital expenditures (capex) of Big Tech companies have forced analysts to model scenarios in which the current business model may misfire.
The fundamental mistake market participants can make right now is treating AI as a monolith. In reality, AI is a complex, multi-layered ecosystem. The risks currently being priced into shares do not signal “the end of neural networks”; rather, they reflect concern about a classic crisis of overproduction in one specific segment: the cloud computing market.
To understand the mechanics of these fears, and how they could reshape the hierarchy of the technology giants, we need to analyze the logic that governs supply, demand, and the monetization of computing power.
Anatomy of the Boom and the Trap of the ‘Middleman’
The AI boom began when breakthrough large language models (LLMs) triggered a “Cambrian explosion” among developers. Thousands of startups and corporations allocated budgets to build their own applications, AI agents, and specialized tools. Training and running (inference) these models required a colossal volume of compute.
This created unprecedented demand for data center infrastructure. Cloud giants found themselves in the position of shovel sellers, and server rentals became the main driver of revenue forecasts. In essence, a classic B2B market formed: businesses sell computing power to other businesses, often financed by venture capital or corporate innovation budgets.
But here is the nuance: investors are beginning to identify a potential bottleneck at the level of the final consumer (B2C). Developer companies, the so-called “middlemen,” spend huge sums on cloud resources in the hope that users will buy AI subscriptions en masse. Consumer budgets, however, have their limits. If average revenue per user (ARPU) for AI services does not show explosive growth comparable to the investment in infrastructure, the unit economics for these startups will not be viable.
In a scenario where thousands of B2B applications face rising cloud bills while consumer returns lag behind, the natural reaction would be aggressive optimization. Developers may be forced to cut costs sharply and reduce their consumption of compute, which means demand for raw computing power could hit the hard ceiling of consumer solvency.
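The squeeze described above can be illustrated with a toy unit-economics calculation. The Python sketch below uses purely hypothetical numbers (a $20 monthly subscription and assumed per-user compute and overhead costs); it illustrates the mechanism, not the economics of any real company.

```python
# Hypothetical unit-economics sketch. Every figure here is an
# illustrative assumption, not data from any real AI company.

def monthly_margin_per_user(arpu, compute_cost, other_cost):
    """Gross margin per subscriber: revenue minus inference and overhead costs."""
    return arpu - compute_cost - other_cost

# Assumed: a $20/month subscription whose inference bill grows
# faster than the price the consumer is willing to pay.
arpu = 20.0
for compute_cost in (8.0, 14.0, 22.0):
    margin = monthly_margin_per_user(arpu, compute_cost, other_cost=4.0)
    print(f"compute ${compute_cost:.0f}/user -> margin ${margin:.0f}/user")
```

With these assumed figures, the margin flips negative once per-user compute costs exceed what the subscription price can cover, at which point cutting compute consumption becomes the only lever left.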
Infrastructural Overhang and the Threat of Surplus
While the market prices in demand risks at the application level, the massive flywheel of infrastructure investment keeps spinning. Building data centers is an inertial process that takes years. Capacity that cloud providers planned at the beginning of 2025, based on an extrapolation of “endless demand,” is only now coming online.
The market fears a classic scissors effect: what happens if a massive volume of new computing power comes onto the market just as cloud tenants begin to tighten their belts and optimize consumption? An era of scarcity can give way to an era of surplus very quickly. As in any commodity market, once supply exceeds demand, the seller loses pricing power, and a seller’s market turns into a buyer’s market.
This absolutely does not mean AI technology has hit a dead end. Rather, it describes a mechanism under which selling raw computing power to third-party developers ceases to be a guarantee of eternal margin growth. Understanding this mechanism allows for a more refined valuation of the main players.
The Great Divergence: Who Is in the Best Position?
If we accept a scenario in which the cloud-rental market faces a crisis of overproduction, it becomes obvious that Big Tech companies are exposed to this risk to varying degrees. Success may depend on how heavily a company relies on external demand versus its ability to redirect compute into its own ecosystem.
Amazon: The Vulnerability of the ‘Digital Landlord’
Among the “Big Three,” Amazon’s (AMZN) business model looks the most sensitive to this scenario. The success of AWS has historically rested on renting out capacity; AWS operates as a digital rentier. If startups go insolvent or corporations cut AI budgets, Amazon may struggle to keep its new server capacity utilized. Unlike its competitors, Amazon does not have an equally large internal ecosystem of consumer software to seamlessly absorb surplus teraflops. Moreover, if external B2B clients refuse to pay a premium for infrastructure, the massive capex could weigh heavily on profitability.
Microsoft: The Hybrid Cushion
Microsoft (MSFT) possesses a more resilient architecture. Although Azure may also suffer from a slowdown in demand from third-party developers, Microsoft has a buffer in the form of vertical integration into corporate software. The company can utilize its infrastructure by integrating AI tools (for example, Copilot) into key products such as Windows, Office 365, and GitHub. This hybrid approach allows Microsoft to sell productivity tools rather than simply raw computing power, smoothing out potential volatility in the rental model.
Alphabet: The Advantage of the ‘Closed Cycle’
In a hypothetical compute surplus, Alphabet’s (GOOG) (GOOGL) model looks the least vulnerable. Google’s infrastructure was originally built with a strong focus on internal needs. It is a closed-cycle company with direct access to billions of users through Search and YouTube. Even if external demand for Google Cloud sags, its computing power will not stand idle. Google can redirect these resources to internal tasks: improving ad targeting, optimizing search results, or developing consumer services. For Google, server capacity is a raw material it can convert into high-margin products itself, reducing its dependence on the B2B developer as a middleman.
Why Computing Power Is Not the Whole ‘AI’
Given the risks of surplus capacity, investors should avoid one more mistake: equating the data center market with the AI market as a whole. Cloud computing is just one channel for monetizing AI, although it has been the most discussed in recent years.
The fact that the infrastructure segment may run into ROI problems does not mean the technology itself is stalling. AI monetization is multi-faceted. For example, while the infrastructure market worries about a glut of servers, the direct-to-consumer segment (such as ChatGPT from OpenAI or Gemini from Google) remains very strong, reflecting stable demand from users ready to pay for direct access to advanced models.
Moreover, we are still only at the early stages of adoption. The near future may bring entirely new markets and monetization strategies, from new formats of personalized advertising to autonomous e-commerce agents. Artificial intelligence is not one market; it is a multitude of parallel and intersecting directions.
Conclusion for the Investor: In Search of Stable Monetization
The period of blind faith in the endless growth of every asset associated with AI is giving way to a phase of harsh business-model audits. The market for raw computing power, which served as the main engine of the initial hype, may face structural saturation and rental-price deflation.
In this environment, the focus for investors must shift. The critical question is no longer who can buy the most chips, but which monetization strategies appear the most viable. Winners will likely be companies capable of stepping beyond the “landlord” framework to create products for the end user, as well as those whose diversification allows them to be pioneers in new AI applications.
Investors should attentively follow the evolution of monetization technologies, understanding that in the coming years, the capability to effectively sell AI solutions — and not simply the capacity for their creation — will determine the new leaders of the technology sector.
On the date of publication, Mikhail Fedorov did not have (either directly or indirectly) positions in any of the securities mentioned in this article. All information and data in this article is solely for informational purposes. For more information please view the Barchart Disclosure Policy here.