Creating ChatGPT cost so much that OpenAI posted huge losses in 2022: according to The Information, the company lost $540 million that year. OpenAI's expenses skyrocketed in the months leading up to the AI chatbot's launch, as the company hired key people away from Google.
The figure was disclosed by three people familiar with the startup's finances. It illustrates the enormous investment OpenAI had to make to train its machine learning models before it could begin selling access to ChatGPT.
OpenAI's revenue is growing, but costs are following
As more customers use AI, costs rise
The company's revenue has grown, reaching an annualized rate of several hundred million dollars. Costs, however, will keep climbing as more customers use the AI technology the company has developed: the more users there are, the more computing infrastructure is needed to keep the platform running.
OpenAI CEO considers raising $100 billion to counter losses
The most capital-intensive bet in Silicon Valley history
OpenAI CEO Sam Altman has privately suggested that the company will try to raise as much as $100 billion over the next few years to offset its losses. The goal is to develop artificial general intelligence advanced enough to improve its own capabilities, the sources told The Information. They said it would be the most capital-intensive bet in Silicon Valley history.
The cost of keeping ChatGPT operational could exceed $700,000 a day
ChatGPT inference costs exceed training costs
OpenAI's massive investment and the losses it produced last year are just the beginning of the story. Keeping ChatGPT running could cost the company at least $700,000 a day, most of it spent on the servers that run the model, according to semiconductor research firm SemiAnalysis.
Training large language models like GPT-4, which powers ChatGPT, can require an investment of tens of millions of dollars. In the long run, however, the operating costs far exceed that amount.
OpenAI's chatbot relies on what AI experts call "inference": the process of computing a response to each user's prompt using the trained model. And according to another report published by SemiAnalysis, ChatGPT's weekly inference costs already exceed its total training cost. That estimate was based on the GPT-3 model, so the operating cost is likely even higher now, after the launch of GPT-4.
OpenAI looks to cut costs with paid version of ChatGPT and a partnership with Microsoft
Microsoft deal to speed development, cut costs
OpenAI launched a paid version of ChatGPT last February as one of several strategies to cut costs and stem its losses. But its financial position has been shored up primarily by the multibillion-dollar deal it struck with Microsoft earlier this year.
As part of the alliance, Microsoft began developing an AI chip called Athena, with support from AMD. The move is expected to accelerate the development of projects such as OpenAI's ChatGPT and GPT-4 and to significantly reduce the cost of running them.
OpenAI's revenue is expected to grow significantly this year, reaching $200 million according to Reuters, and to climb to $1 billion by 2024.