iPhone 16 to Reportedly Feature Generative AI Capabilities

In a quiet yet strategic move, Apple is intensifying efforts to integrate generative artificial intelligence (AI) capabilities into its iPhone 16 series, set for release this year, the Financial Times reports.


Apple’s focus on overcoming the technical hurdles of running AI on iPhones is evident in its acquisition spree: the company has acquired 21 AI startups since 2017. The most recent addition, WaveOne, specializes in AI-powered video compression.

Industry experts predict further significant mergers and acquisitions as part of Apple’s AI push, underscoring the company’s commitment to staying competitive in the ongoing AI arms race.

Apple has been discreet about its AI plans, but recent job postings reveal a shift toward deep learning – the class of algorithms behind generative AI. The company has hired top AI executive John Giannandrea from Google and is actively working on large language models similar to the one behind OpenAI’s ChatGPT.

The goal is to run generative AI directly on mobile devices, enabling AI applications and chatbots to work on iPhone hardware and software without relying on cloud services.

To tackle this challenge, Apple is working on reducing the size of large language models and enhancing processors for higher performance.
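
One common way to shrink a large language model’s memory footprint is weight quantization, which stores parameters as low-precision integers instead of 32-bit floats. The sketch below is a generic PyTorch illustration of that idea, not Apple’s actual pipeline; the toy two-layer model is a stand-in for a real transformer.

```python
import os
import torch
import torch.nn as nn

# Toy stand-in for a transformer layer; real LLMs have billions of parameters.
model = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 4096))

# Dynamic quantization rewrites Linear weights as 8-bit integers,
# roughly quartering the serialized size of those layers.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def size_mb(m: nn.Module) -> float:
    """Serialize the module and report its size in megabytes."""
    torch.save(m.state_dict(), "tmp.pt")
    mb = os.path.getsize("tmp.pt") / 1e6
    os.remove("tmp.pt")
    return mb

print(f"fp32 model: {size_mb(model):.1f} MB, int8 model: {size_mb(quantized):.1f} MB")
```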

The company’s commitment to AI is further evident in chips like the S9 for the Apple Watch and the A17 Pro for the iPhone 15, both of which include upgraded neural engines capable of handling generative AI workloads.


While competitors like Samsung and Google have already released devices with generative AI features, Apple is gearing up for a significant reveal at its Worldwide Developers Conference in June.

Expected to debut as iOS 18, the new operating system should enable generative AI features, with speculation that Siri will be powered by a large language model. Apple has also introduced new chips, such as the M3 Max processor for the MacBook Pro, designed to handle AI workflows previously impossible on laptops.

Apple researchers recently achieved a breakthrough by running large language models on-device from flash memory, enabling faster query processing even when offline.
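
The general idea behind the flash-memory approach, in rough strokes, is to keep the full weight file in storage and pull only the needed pieces into DRAM at inference time. The following is a minimal, generic NumPy sketch of memory-mapping weights from disk; the file name, matrix size, and sparse row pattern are all hypothetical, and this is not the implementation described in Apple’s research.

```python
import numpy as np

WEIGHTS_PATH = "layer_weights.bin"  # hypothetical checkpoint file kept in flash storage

# One-time setup for the demo: write a dummy fp16 weight matrix to disk.
np.random.rand(4096, 4096).astype(np.float16).tofile(WEIGHTS_PATH)

# Memory-map the file: weights stay in storage, and the OS pages rows
# into DRAM only when they are actually read.
weights = np.memmap(WEIGHTS_PATH, dtype=np.float16, mode="r", shape=(4096, 4096))

# Suppose only a few neurons matter for the current token (hypothetical
# pattern); reading just those rows avoids loading the whole matrix.
active_rows = [0, 17, 512]
partial = np.asarray(weights[active_rows, :])  # pages in only the touched rows
print(partial.shape)  # (3, 4096)
```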

The company has also collaborated with Columbia University on an open-source multimodal large language model called “Ferret,” which connects language to specific regions of an image.
