Hugging Face CEO Clem Delangue Says AI Isn’t in a Bubble — But LLMs Might Be
Clem Delangue argues the industry’s fixation on LLMs is unsustainable and may be heading toward a correction.
At a moment when AI dominates corporate strategy and investor focus, open-source platform Hugging Face co-founder and CEO Clem Delangue is offering a contrarian view: the sector’s exuberance isn’t an AI bubble — it’s an LLM bubble. And it may be heading toward correction.
Speaking at an Axios event, Delangue called the current debate around speculation “the trillion-dollar question,” but pushed back against the idea that the wider AI ecosystem is at risk. The problem, he argued, is not the technology as a whole, but the heavy concentration of capital, attention, and expectations around LLMs.
“I think we’re in an LLM bubble, and I think the LLM bubble might be bursting next year,” he said. “But ‘LLM’ is just a subset of AI. When it comes to biology, chemistry, image, audio, or video, we’re still at the beginning of what AI can do.”
The open-source advocate believes the industry will soon move beyond the assumption that one general-purpose model can solve all problems. Instead, he expects adoption to shift toward specialized, domain-specific models that are smaller, cheaper, and easier to deploy.
Certain applications, like a bank’s customer-support bot, do not require a system capable of explaining the meaning of life; a model optimized for the task can deliver the result without massive LLMs or billion-dollar training cycles.
This shift toward “model multiplicity” could reshape enterprise AI strategies, favoring organizations that can build, fine-tune, or self-host tailored models. For Hugging Face, whose platform hosts thousands of open-source models across modalities, such a transition could be beneficial — though Delangue acknowledges the company isn’t entirely insulated from volatility in the LLM space.
Still, he emphasized that the AI sector is broad enough to absorb a correction in one segment. “Even if LLMs are overvalued, it won’t derail the AI field itself,” he said, pointing to the company’s own financial posture: Hugging Face still has half of the $400 million it has raised in reserve. In today’s capital environment — where LLM developers burn through billions — Delangue calls that level of prudence “AI-standards profitability.”
Having worked in AI for 15 years, he views the current surge in investment and deployment as cyclical. The winners, he suggested, will be the companies that prioritize resilience and long-term impact over speed.
