
Photo/Zhang Jian (NBD)
At Apple's annual Worldwide Developers Conference (WWDC) held on June 10th, the tech giant unveiled its long-anticipated AI strategy, marking a significant step in its growth narrative. Despite widespread speculation about a potential collaboration with OpenAI, Apple's approach to the AI powerhouse was notably understated.
Sam Altman, CEO of OpenAI, made a discreet appearance at Apple's headquarters but never took the stage, a sharp contrast with his high-profile turn at Microsoft's 2024 Build conference. Meanwhile, "Apple Intelligence" took center stage, mentioned roughly 60 times throughout the keynote as Apple emphasized its intelligent assistant and on-device AI models; the term "artificial intelligence" itself came up only about three times.
This juxtaposition raised questions about what exactly "Apple Intelligence" is, and why a star of the AI field seemed content to play a supporting role. Apple Intelligence is designed as a highly personalized, privacy-focused assistant deeply integrated into iOS, macOS, and Apple's other operating systems, offering a wide range of functionality from a major Siri upgrade to writing and communication tools.
The underlying technology is structured in three layers. The first is Apple's on-device models, including a roughly 3-billion-parameter language model optimized to run locally on devices such as the iPhone. The second is a larger server-side language model reached through Private Cloud Compute, Apple's private cloud infrastructure. The third integrates third-party large language models, starting with OpenAI's ChatGPT, which will be added later this year and can be used without an OpenAI account.
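To make the three-layer split concrete, here is a minimal Swift sketch of how a request might be routed across the tiers. The type names, the complexity categories, and the routing rule are illustrative assumptions for this article, not Apple's published API.

```swift
import Foundation

// Hypothetical sketch only: these tiers and the routing rule are illustrative
// assumptions, not Apple's actual framework.
enum ModelTier {
    case onDevice        // ~3B-parameter local model running on the device
    case privateCloud    // larger Apple model behind Private Cloud Compute
    case thirdParty      // external model such as ChatGPT, opt-in per request
}

// Invented classification of how demanding a request is.
enum RequestComplexity {
    case simple, moderate, openEnded
}

// Illustrative routing: simple tasks stay on device, heavier ones go to
// Apple's private cloud, and open-ended "world knowledge" questions are
// handed to a third-party model only if the user consents.
func routeRequest(_ complexity: RequestComplexity,
                  userConsentsToThirdParty: Bool) -> ModelTier {
    switch complexity {
    case .simple:
        return .onDevice
    case .moderate:
        return .privateCloud
    case .openEnded:
        return userConsentsToThirdParty ? .thirdParty : .privateCloud
    }
}

// Example: a short rewrite request stays local; a broad question may escalate.
print(routeRequest(.simple, userConsentsToThirdParty: false))    // onDevice
print(routeRequest(.openEnded, userConsentsToThirdParty: true))  // thirdParty
```

The per-request consent check mirrors what Apple described on stage: ChatGPT is only consulted after the user explicitly agrees, keeping the local and private-cloud tiers as the defaults.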
Apple's strategy points to a broader ambition to build out its own AI ecosystem, anchored in on-device models and in Private Cloud Compute servers running on Apple's own chips. The move has sparked discussion among tech enthusiasts and hints at a potential shift in the AI hardware landscape.