The move marks the first time OpenAI has used non-Nvidia chips in a meaningful way and signals the Sam Altman-led company's shift away from relying on backer Microsoft's data centers; it could also boost Google's tensor processing units (TPUs) as a cheaper alternative to Nvidia's graphics processing units (GPUs), the report said. OpenAI hopes the TPUs, which it rents through Google Cloud, will help lower the cost of inference, the report added. However, Google, an OpenAI competitor in the AI race, is not renting its most powerful TPUs to its rival, The Information said, citing a Google Cloud employee.