Apple's Worldwide Developers Conference (WWDC) is drawing closer, and iOS 18 is expected to be a focal point of the event. Much of the speculation centers on the advanced AI features Apple may build into the new release as it works to catch up with rivals in artificial intelligence. According to Bloomberg's Mark Gurman, the company is developing its own Large Language Model (LLM) to power on-device AI capabilities on iPhones.
Apple's In-House LLM Development
A key advantage of Apple's approach is that the LLM would run locally on the device rather than relying on cloud services. Data stays on the phone, responses arrive faster, and privacy is stronger than with cloud-based alternatives. By drawing on the Neural Engine, the dedicated neural processing unit (NPU) in recent iPhone chips, Apple's upcoming on-device AI features should work even with limited or no network connectivity.
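To give a sense of what on-device inference looks like in practice, here is a minimal Swift sketch using Apple's existing Core ML framework. The model name (AjaxSummarizer) and its input/output feature names are hypothetical placeholders, not anything Apple has shipped or confirmed.

```swift
import CoreML

// Minimal sketch of on-device inference with Core ML.
// "AjaxSummarizer.mlmodelc" and the feature names below are hypothetical.
let config = MLModelConfiguration()
config.computeUnits = .all   // allow CPU, GPU, and the Neural Engine

do {
    guard let modelURL = Bundle.main.url(forResource: "AjaxSummarizer",
                                         withExtension: "mlmodelc") else {
        fatalError("Model not bundled")
    }
    let model = try MLModel(contentsOf: modelURL, configuration: config)

    // Illustrative input: a single text feature named "prompt".
    let input = try MLDictionaryFeatureProvider(
        dictionary: ["prompt": "Summarize my last three messages."]
    )
    let output = try model.prediction(from: input)
    print(output.featureValue(for: "completion") ?? "no output")
} catch {
    print("On-device inference failed: \(error)")
}
```

Setting computeUnits to .all lets Core ML schedule work on the Neural Engine when it is available, which is presumably the same hardware path any on-device LLM feature would take.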
Integration with Key Applications
Reports suggest that Apple's proprietary LLM, known internally by the codename Ajax, will be deeply integrated into core applications such as Health, Messages, and Shortcuts, with the goal of improving the user experience and streamlining functionality across Apple's services.
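One plausible way such integration could surface in Shortcuts is through the existing App Intents framework, which lets apps expose actions to the Shortcuts app. The sketch below is purely illustrative: the intent, its parameter, and the placeholder "summary" logic are assumptions, not confirmed Apple APIs for Ajax.

```swift
import AppIntents

// Hypothetical App Intent showing how an on-device model could be
// exposed as a Shortcuts action. The summarization step is a placeholder.
struct SummarizeTextIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Text"

    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // A real implementation would hand the text to the on-device model;
        // here we simply truncate it as a stand-in.
        let summary = String(text.prefix(80))
        return .result(value: summary)
    }
}
```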
Potential Licensing Agreements
While Apple's in-house LLM is expected to provide robust on-device AI capabilities, there are indications that the company may also pursue partnerships or licensing agreements with third-party providers. Rumors suggest Apple could license technology such as Google's Gemini to supplement its own models. Such a deal could broaden the scope and capabilities of Apple's AI ecosystem, ensuring a comprehensive and competitive set of features for iPhone users.