Apple has taken an unusual approach to building its artificial intelligence (AI) infrastructure, opting for Google’s tensor processing units (TPUs) rather than Nvidia’s GPUs, as revealed in a recent research paper published by Apple. The decision is a notable move in the tech industry, given Nvidia’s dominant position in the AI chip market.
In the paper, Apple detailed how it used two types of Google TPUs, deployed in large chip clusters, to train the AI models that power iPhones and other devices. For its on-device models, Apple employed 2,048 TPUv5p chips; for its server-based AI models, it used 8,192 TPUv4 processors.
Unlike Nvidia, which sells its GPUs and systems as standalone products, Google offers access to its TPUs through the Google Cloud Platform. This means that customers need to develop software within Google’s ecosystem to utilize these powerful processors effectively.
The paper also hinted at the potential for further collaboration between Apple and Google, suggesting that a longer-term partnership could yield even more sophisticated AI models trained on TPUs.
Coinciding with the paper’s release, Apple rolled out new AI features to beta users this week, including the integration of OpenAI’s ChatGPT technology into Apple’s software.
The disclosure follows initial Reuters reports in June that hinted at Apple’s use of Google TPUs but did not detail its full extent. By publishing this information in a research paper, Apple is shedding light on its strategic decisions and partnerships in the field of artificial intelligence.