OpenAI, the AI giant behind ChatGPT and DALL-E, is looking to develop its own AI chips as the company’s portfolio of AI products continues to grow and evolve. According to reports, OpenAI might even consider acquiring another company exclusively for this purpose.
As the AI race intensifies with tech giants going toe to toe, companies like OpenAI are facing a serious shortage of the chips needed to run powerful AI models.
OpenAI currently relies on GPU-based hardware, largely from Nvidia, to develop AI models like ChatGPT, GPT-4, and DALL-E 3. However, with CEO Sam Altman making the acquisition of more AI chips a top priority for OpenAI, this might change in the future.
OpenAI Exploring Its Chip Options
While the AI giant has yet to move ahead with manufacturing its own chips, it is actively evaluating its options. The best-performing chips from Nvidia, OpenAI’s primary chip supplier, are already sold out until 2024. This shortage has led the company to consider diversifying its suppliers by working more closely with other chipmakers.
Altman, who is one of the main driving forces behind the decision, has expressed concerns over the rising costs and limited availability of chips.
Hence, it’s no surprise that OpenAI is looking to develop its own chips and might also go the acquisition route to handle chip manufacturing. According to people familiar with OpenAI’s future plans, the company has gone as far as evaluating a potential acquisition target.
Having raised over $11 billion in venture capital and earned close to $1 billion in annual revenue, OpenAI is undoubtedly in an excellent position to invest heavily in research and development.
According to a report by the Wall Street Journal, the company is also planning a share sale that could cause its secondary-market valuation to rise to $90 billion, which is “roughly triple its level earlier this year”.
Can OpenAI Develop Its Own Chips?
As of now, Nvidia is one of the very few companies that develop specialized AI chips. Also known as AI accelerators, these chips are crucial for training and running the latest generative AI models.
If OpenAI does manage to develop its own AI chips, it would put the company in an elite group of tech giants like Google and Amazon.
OpenAI has reportedly been testing an AI chip named Athena, developed in-house by Microsoft in collaboration with AMD.
Google trains its large AI systems like PaLM-2 and Imagen on its own TPU (Tensor Processing Unit) chips. Amazon, too, offers chips to AWS customers for training and inference.
Amazon’s acquisition of Annapurna Labs in 2015 gave it a huge boost, significantly speeding up the process of developing its own AI chips.
However, serious challenges remain in OpenAI’s plan to develop its own AI chips. Last year, AI chipmaker Graphcore saw its valuation fall by $1 billion after a deal with Microsoft fell through.
Over the last few months, Graphcore has been plagued by falling revenue and mounting losses, while Habana Labs, the AI chip company owned by Intel, has laid off 10% of its workforce.
Meta has been struggling to develop its own AI chips, too, to the point that it had to scrap some of its experimental hardware.
With each ChatGPT query costing roughly 4 cents, running the AI chatbot is already an expensive business for OpenAI. An effort to develop its own AI chips may cost the company hundreds of millions of dollars a year.
It remains to be seen whether OpenAI’s investors, including Microsoft, which holds a 49% stake in the company, would be willing to take such a risk.