Will OpenAI's Competitors Capitalize on OpenAI's Turmoil?
Enterprise AI implications of the crazy OpenAI saga
The news of OpenAI’s turmoil broke at a super weird and unfortunate time for Microsoft:
Microsoft had just wrapped up its Ignite event (along with GitHub Octoverse), which went extremely well and further solidified the narrative of MSFT’s market leadership.
Meanwhile, Google and AWS were taking L after L.
It was announced that 92% of the Fortune 500 were already OpenAI customers, indicating the POC “war” had been “won” by MSFT.
Google’s Gemini model was reported to be delayed, suggesting technical difficulties ahead of GA.
AWS had just launched a pretty underwhelming “make your own GPT” app called PartyRock, indicating continued struggles to catch up in the GenAI space. The news barely got any traction in the media, and it’s unclear whether AWS itself took PartyRock seriously.
In short, just 10 days before AWS re:Invent 2023, OpenAI’s board somehow managed to inflict FUD on its own employees, investors, and partners, as well as the broader AI community.
This post is NOT about speculating on what happened and why, but about thinking through how OpenAI’s competitors will react. After all, OpenAI made plenty of frenemies across the LLM stack at its Dev Day by signaling its desire to expand beyond just LLMs and ChatGPT.
The question is: can GCP and AWS, two of Anthropic’s largest investors and partners, capitalize on this fiasco and play catch-up? Both GCP and AWS have plenty of ways to tweak their messaging and prep their sales meetings to exploit the uncertainties around OpenAI’s future.
In this post, I’ll quickly share some of my predictions for:
how GCP, Anthropic, and AWS will craft their messaging after the OpenAI fiasco of Nov 17th, and
whether GCP and AWS can actually convert the FUD into sales, and
whether any of this chaos is good or bad for AI startups.
How GCP and AWS’s Messaging May Change
As market laggards, GCP and AWS have centered their messaging for the Vertex AI and Bedrock platforms on being a more “private and secure” alternative to Azure OpenAI. In essence, GCP and AWS couldn’t compete with GPT-4 or even GPT-3.5 on performance, so they decided to compete on security and privacy.
Note that this messaging never made much sense, and was even a bit disingenuous, given that Azure OpenAI APIs never used customer data or completions as first-party (1P) training data. Despite this, the campaign worked somewhat, since it played to the risk aversion of enterprise customers.
With the OpenAI fiasco, both GCP and AWS will now shift to emphasizing a “control and stability” narrative: basically, urging customers to keep second and third options just in case the turmoil actually impacts the future GPT roadmap, in the form of:
key engineers leaving
more turmoil and distractions at the board and C-suite level
OpenAI’s culture of innovation souring, which would curtail OpenAI’s edge over Google and Anthropic.
This turmoil was the first event to really shed light on OpenAI’s org structure and the flimsiness of its governance. Before, enterprise customers took OpenAI’s trustworthiness for granted, since it was vouched for by Microsoft. Not anymore!
If there’s one thing enterprise customers are truly allergic to, it’s instability and a “bad” image.
Will the new messaging convert into sales?
Unlike the data-privacy FUD narrative, this “control and stability” FUD narrative could actually work well, given that large enterprises would rather over-engineer and maintain complexity (e.g. multi-cloud, K8s, etc.) than go “all-in” with a single vendor, especially one with a track record of being unpredictable.
That’s just the reality of enterprise sales: performance, price, and the like often matter less than perception.
Of course, I’m in the camp that believes OpenAI will be fine and will maintain its technological edge over much larger competitors like Google. Heck, it’s even possible that OpenAI has already “achieved” AGI and is now more than a year ahead of the next-best SOTA. If that’s really the case, down-to-earth considerations like enterprise sales discussions may be beside the point.
But assuming that OpenAI isn’t willing to commercialize its AGI (and OpenAI’s by-laws indicate that only the board can determine what is and isn’t AGI, and that it has no obligation to let Microsoft use it), then the rest of the industry should expect:
the gap between OpenAI and the rest of the field to narrow
after all, Gemini should ship in the next 3-4 months (it’s hard to imagine it being delayed again)
enterprises to discard any plans of going “all-in” on OpenAI.
Impact on AI Startups, and Startups in General
Going against the grain, I think OpenAI’s turmoil is probably BAD for AI startups, especially if it hurts OpenAI’s ability to innovate and pass cost savings on to APIs that anyone can use, including startups. With other cloud providers, the best and most optimized models tend to be previewed with the largest customers first, with startups getting a look only after GA.
It’s popular to knock OpenAI for things like 1) its lack of transparency and 2) its overly competitive (read: productive) posture. But given how many AI startups have some dependency on GPT models, OpenAI is basically the linchpin of the AI startup ecosystem. Without its cost-effective APIs, it’s hard to imagine many AI startups even being viable, especially given that:
running Llama 2 on your own GPU instances on GCP or AWS can cost 10x-50x more, especially at low utilization (a rough back-of-the-envelope sketch follows this list), and
GCP and AWS have yet to prove that they care about the startup market when it comes to making the best AI models available at low cost. Most of their sales motions have focused on winning over enterprises, which have the budget to reserve their own GPU instances as well as the ML talent to host, fine-tune, and operate LLMs.
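To make the 10x-50x claim concrete, here’s a minimal back-of-the-envelope sketch. Every number in it (the per-token API price, the GPU instance hourly rate, the monthly token volume) is an illustrative assumption rather than a quote from any provider’s price sheet; the point is only that an always-on dedicated GPU at low utilization costs far more than pay-per-token APIs.

```python
# Back-of-the-envelope cost comparison.
# All figures below are illustrative assumptions, not actual provider pricing.

# Assumption: a managed API priced around $0.001 per 1K tokens (blended).
api_cost_per_1k_tokens = 0.001

# Assumption: one dedicated A100-class GPU VM at ~$4/hour, running 24/7
# regardless of how much traffic it actually serves.
gpu_cost_per_hour = 4.00
hours_per_month = 730

# Assumption: an early-stage startup serving ~100M tokens/month,
# i.e. a small fraction of what a dedicated GPU could handle (low utilization).
tokens_per_month = 100_000_000

api_monthly = (tokens_per_month / 1_000) * api_cost_per_1k_tokens
gpu_monthly = gpu_cost_per_hour * hours_per_month

print(f"API (pay-per-token):       ~${api_monthly:,.0f}/month")
print(f"Dedicated GPU instance:    ~${gpu_monthly:,.0f}/month")
print(f"Ratio at this utilization: ~{gpu_monthly / api_monthly:.0f}x")
```

At higher, steadier utilization the math flips, which is exactly why enterprises with predictable workloads can justify reserved GPU capacity while most startups can’t.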
AI startups don’t have such resources. OpenAI’s easy-to-use APIs have been a great boon for them, and it’s imperative that OpenAI maintains its high bar for API and chat-based products so the AI ecosystem can keep growing. After all, it’s still a very fledgling ecosystem.