OpenAI Shifts to Google's AI Chips to Enhance Product Offerings
OpenAI is adopting Google's AI chips to power its products, marking a significant strategic shift in hardware sourcing.
Key Points
- OpenAI is now renting Google's AI chips to support ChatGPT and other products.
- This is OpenAI's first major use of non-Nvidia chips, indicating a shift in hardware strategy.
- The collaboration between OpenAI and Google highlights a competitive alliance in the AI field.
- Google is making its TPUs, previously restricted to internal use, available externally.
OpenAI has begun renting Google's tensor processing units (TPUs) to power its AI products, including ChatGPT. This marks OpenAI's first substantial use of hardware beyond Nvidia's graphics processing units (GPUs), a strategic shift in its computing infrastructure driven by growing demand for processing power. The arrangement is particularly notable because it pairs two direct competitors in the AI sector, a partnership that would have seemed unlikely before this announcement.
According to a source speaking to Reuters, OpenAI hopes the arrangement will lower its inference costs by using Google Cloud services. Google historically reserved its TPUs for internal use, but it is now offering them to prominent external customers, including OpenAI, Apple, and AI startups founded by former OpenAI leaders. The diversification also suggests OpenAI is working to reduce its dependence on Microsoft for cloud infrastructure.