After being fired by the company he co-founded, briefly hired by Microsoft and then returning as OpenAI chief, Sam Altman is again facing a crisis. His ongoing stand-off with the world's richest man, Elon Musk, has taken another turn after Musk and a group of like-minded investors announced a $97.4bn bid to acquire OpenAI.
Responding to the bid, Altman tweeted: “No thank you, but we will buy Twitter for $9.74bn if you want.”
Last year, Musk filed a complaint in San Francisco Superior Court alleging that OpenAI chief executive Sam Altman and president Greg Brockman had breached the founding agreement behind the creation of OpenAI.
The company was originally founded as a not-for-profit organisation, but reorganised to create a for-profit business so it could purchase the compute capacity it needed.
The company received $10bn of support from Microsoft, which included access to the Microsoft Azure cloud for running OpenAI's large language models (LLMs), such as those that power ChatGPT.
In January, the Trump administration announced Project Stargate. Supported by tech giants including Oracle, it’s run as a new company that intends to invest $500bn over the next four years in building AI infrastructure for OpenAI in the US.
In a blog post published on Monday, covering how humanity will need to adapt to the era of artificial general intelligence, in which machines are able to tackle cognitive tasks at a level equivalent to humans, Altman affirmed his commitment to the Microsoft partnership, writing: “We do not intend to alter or interpret the definitions and processes that define our relationship with Microsoft. We fully expect to be partnered with Microsoft for the long term.”
However, OpenAI’s position as the leader in LLM development has been put in jeopardy by the release of China’s DeepSeek AI model, whose pricing massively undercuts what OpenAI charges.
While lawmakers are trying to curb DeepSeek over concerns that user data could be shared with China, the DeepSeek models are open source, which means they can be run privately or on any public cloud infrastructure. In fact, Amazon Web Services, Azure and Google Cloud Platform all offer the DeepSeek R1 model.
Costs vary depending on the graphics processing unit (GPU) required, but the cheapest way to use the LLM is via application programming interface (API) calls that connect directly to DeepSeek’s own cloud-hosted version of the model.
While OpenAI charges $2.50 per million input tokens for its GPT-4o model, connecting directly to DeepSeek through its API is priced at $0.14 per million input tokens when the AI engine can draw on previously cached information, and $0.55 per million tokens for non-cached inputs.
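As a rough illustration of that gap, the sketch below works through the arithmetic using the per-token rates quoted above. The workload volume and cache-hit rate are hypothetical assumptions chosen purely for illustration, not figures from either provider.

```python
# Illustrative input-cost comparison using the published per-million-token
# rates quoted above. Workload figures are hypothetical assumptions.

GPT4O_INPUT = 2.50          # $ per million input tokens (OpenAI GPT-4o)
DEEPSEEK_CACHE_HIT = 0.14   # $ per million input tokens (cached inputs)
DEEPSEEK_CACHE_MISS = 0.55  # $ per million input tokens (non-cached inputs)

def input_cost(tokens: float, rate_per_million: float) -> float:
    """Cost in dollars for a given volume of input tokens at a flat rate."""
    return tokens / 1_000_000 * rate_per_million

# Hypothetical workload: 500 million input tokens a month, 60% served from cache.
tokens = 500_000_000
cache_hit_rate = 0.6

openai_cost = input_cost(tokens, GPT4O_INPUT)
deepseek_cost = (input_cost(tokens * cache_hit_rate, DEEPSEEK_CACHE_HIT)
                 + input_cost(tokens * (1 - cache_hit_rate), DEEPSEEK_CACHE_MISS))

print(f"GPT-4o input cost:   ${openai_cost:,.0f}")    # $1,250
print(f"DeepSeek input cost: ${deepseek_cost:,.0f}")  # $152
```

On those assumed numbers, the DeepSeek route works out at roughly an eighth of the cost, which is the kind of differential behind the claim that it massively undercuts OpenAI's pricing.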
Arguably, the existence of a model that can be run far more cheaply than OpenAI’s may have some investors spooked. Certainly, Nvidia’s share price, along with the wider stock market, fell sharply after DeepSeek’s announcement. The fact that Musk has put in a bid for OpenAI just weeks after the new DeepSeek LLM became available may well be the tech billionaire’s attempt to capitalise on the market confusion and potential readjustment as people begin to understand there is more than one way to do AI.
There are also questions over Musk’s xAI business and its Grok LLM. The company recently closed a $6bn Series C funding round, yet Musk and a group of investors are now looking to acquire OpenAI. Clearly, should Musk be successful in his bid, he would be at the centre of US AI strategy and the $500bn Stargate initiative.