Claude's explosive popularity has run into a computing power bottleneck. The rise of AI agents is driving up resource consumption, prompting Anthropic to tighten usage restrictions during peak hours. The change is expected to affect roughly 7% of users, who will hit conversation limits sooner, with Pro subscribers the most affected. It reflects an industry-wide challenge: AI demand far outstrips infrastructure capacity. Meanwhile, Google plans to provide financing support for the Texas data center that Anthropic intends to lease.
The explosive growth of Claude has run up against a computing power bottleneck. The artificial intelligence company quietly tightened usage restrictions on its Claude product this week, the latest sign of the computing shortage confronting large-model makers across the industry.
Anthropic reportedly adjusted the usage rules for Claude's free, Pro, and Max subscription tiers this week. While the total weekly limit remains unchanged, users will hit their usage cap more quickly during peak hours, specifically from 5:00 AM to 11:00 AM Pacific Time.
An Anthropic representative confirmed the change on X, acknowledging that approximately 7% of users would encounter conversation restrictions they had not hit before, with Pro users particularly affected.
According to the Financial Times, Google plans to provide financing support for the data center in Texas leased by Anthropic. The deal is expected to be finalized within the coming weeks and will include construction loans extended to the data center operator, Nexus Data Centers.
Peak hour restrictions impact 7% of users
Thariq Shihipar, an employee at Anthropic responsible for the Claude product, posted on X this week to officially explain the background and impact of this adjustment. Shihipar wrote:
"We have implemented a series of efficiency optimizations to alleviate some of the pressure, but about 7% of users will still encounter session restrictions not previously experienced, especially Pro users."
He also suggested that for backend tasks requiring significant token consumption, users should schedule them during off-peak hours to maximize the efficiency of their session quotas.
Shihipar also apologized to users. "I understand this is frustrating," he wrote. "We are continuously investing to achieve efficient scaling, and I will keep you updated on our progress."
This adjustment does not alter the total weekly usage limit; however, the peak-hour rate limiting effectively reduces the computing power available within specific time windows, directly affecting heavy users and enterprise workflows.
The AI agent wave intensifies computing power consumption
Anthropic's recent computing power shortage is closely tied to the rapid spread of AI agent applications.
The rise of open-source AI agent tools, exemplified by OpenClaw, has let users harness the full potential of large models in unprecedented ways. But the explosive growth of agent applications also means a sharp rise in computing consumption: behind every automated task chain lies a continuous, intensive stream of token calls to the underlying model.
Meanwhile, mainstream attention on Anthropic has surged. After CEO Dario Amodei publicly refused to give the Pentagon unrestricted access to the company's AI models, Anthropic's brand exposure rose sharply. Amodei has previously said that enterprise business is the company's core strategic direction.
A rapidly expanding user base and a comparatively constrained supply of computing power are the two sides of Anthropic's current bind.
Industry-wide challenge: computing power expansion struggles to keep up with demand
Computing power shortages are not unique to Anthropic; they are a common challenge across the frontier AI industry.
OpenAI announced this week that it is discontinuing its once-popular AI video generation application Sora, reallocating limited computing resources to core services. That decision, together with Anthropic's peak-hour throttling, points to an often-overlooked facet of the current AI arms race: demand is growing far faster than infrastructure can expand.
Notably, Dario Amodei expressed reservations last December at The New York Times DealBook Summit regarding competitors' heavy investment in super data centers. "I believe there is a certain level of irreducible risk," he said. "I think some players have not adequately managed this risk."
However, the reality of insufficient computing power is now pressuring Anthropic itself. Tech giants such as Microsoft and Google have poured money into AI computing infrastructure. How to balance prudent expansion against meeting demand will be the core question Anthropic must answer going forward.
Google plans to provide financing support for the Texas data center leased by Anthropic
Google intends to provide construction loan support for a data center project valued at over 5 billion US dollars, further deepening its strategic partnership with the artificial intelligence startup Anthropic.
According to the Financial Times, citing sources familiar with the matter, Google’s financial support for the project is expected to be finalized in the coming weeks, including providing construction loans to the data center operator Nexus Data Centers.
Meanwhile, multiple banks are competing to finance the project's first phase before mid-year, a phase that could exceed 5 billion US dollars. Sources indicated that parent company Alphabet, with its strong credit rating, could help the project secure financing at a lower cost.
The project, located in Texas and spanning 2,800 acres, is a key component of the collaboration framework between Google and Anthropic; Anthropic signed a lease agreement with Nexus earlier this month.
Editor/KOKO