Windsurf, the popular vibe coding startup that's reportedly being acquired by OpenAI, said Anthropic significantly reduced its first-party access to the highly popular AI models Claude 3.7 Sonnet and Claude 3.5 Sonnet.
Windsurf CEO Varun Mohan said in a post on X that Anthropic gave Windsurf little notice of this change, and the startup now has to find other third-party compute providers to run Claude AI models on its platform.
"We have been very clear to Anthropic that this is not our desire – we wanted to pay them for the full capacity," said Mohan on X. "We are disappointed by this decision and short notice."
In a blog post, Windsurf said it has some capacity with third-party inference providers, but not enough, so this change may create short-term availability issues for Windsurf users trying to access Claude.
The decision comes just a few weeks after Anthropic appeared to pass over Windsurf during the launch of Claude 4, the company's new family of models, which offer industry-leading performance on software engineering tasks. Anthropic gave several popular vibe coding apps, including Anysphere's Cursor, Cognition's Devin, and Microsoft's GitHub Copilot, immediate access to run Claude Sonnet 4 and Claude Opus 4. These apps started supporting the new Claude 4 models on launch day.
Windsurf said at the time that it didn't receive direct access from Anthropic to run Claude 4 on its platform, and it still hasn't, forcing the company to rely on a workaround that's more expensive and complicated for developers who want to access Claude 4. Anthropic's AI models have become a favorite among developers, and in the past, Anthropic has worked with Windsurf to power its vibe coding tools.
The AI-assisted coding sector, also known as vibe coding, has heated up in recent months. OpenAI reportedly closed a deal to acquire Windsurf in April. At the same time, Anthropic has invested more in its own AI coding applications. In February, Anthropic launched its own AI coding tool, Claude Code, and in May, the startup held its first Code with Claude developer conference.
"We're prioritizing capacity for sustainable partnerships that allow us to effectively serve the broader developer community," said Anthropic spokesperson Steve Mnich in an email to TechCrunch on Tuesday, noting that it's still possible to access Claude 4 on Windsurf via an API key. "Developers can also access Claude through our direct API integration, our partner ecosystem, and other development tools."
Windsurf has grown quickly this year, reaching $100 million in ARR in April, as it attempts to catch up with more popular AI coding tools such as Cursor and GitHub Copilot. However, Windsurf's limited access to Anthropic's models may be stunting its growth.
Several Windsurf users who spoke with TechCrunch were frustrated by the lack of direct access to Anthropic's best AI coding models.
Ronald Mannak, a startup founder who focuses on Apple's programming language, Swift, told TechCrunch that Claude 4 represented a significant leap in capabilities for his workloads. While Mannak has been a Windsurf customer since late 2024, he's switched to using Cursor in recent weeks so that he can vibe code more easily with Claude 4.
As a short-term solution to support Claude 4, Windsurf lets users connect their own Anthropic API keys to their Windsurf accounts. However, developers have noted that this "bring your own key" approach is more expensive and complicated than if Windsurf provided the models itself.
For vibe coders, optionality is the name of the game. Every few months, OpenAI, Google, and Anthropic release new AI models that seem to outperform the industry on coding tasks. Because of that, it benefits vibe coding startups to support AI models from all the leading developers.
Windsurf spokesperson Payal Patel tells TechCrunch via email that the company has always believed in providing optionality for its users. In this case, it seems Anthropic has made that a bit more challenging.