Here Are the Devices That Support Apple Intelligence


Apple Intelligence is the company’s new AI push for Mac, iPad, and iPhone, announced today at WWDC.

However, Apple Intelligence will not be available on all devices. Apple says the feature will be free to use and will launch first in U.S. English. A beta will arrive this fall as part of iOS 18, iPadOS 18, and macOS Sequoia.

Here is a list of devices that will support Apple Intelligence, according to Apple’s website:

iPhone

- iPhone 15 Pro Max (A17 Pro)
- iPhone 15 Pro (A17 Pro)

iPad

- iPad Pro (M1 and later)
- iPad Air (M1 and later)

Mac

- MacBook Air (M1 and later)
- MacBook Pro (M1 and later)
- iMac (M1 and later)
- Mac mini (M1 and later)
- Mac Studio (M1 Max and later)
- Mac Pro (M2 Ultra)

This really sucks for owners of an iPhone 14 Pro or iPhone 14 Pro Max, as they are left out (time to upgrade?). We’ll have to wait and see which AI features these older devices will get, so stay tuned.


12 Comments
Steve
1 year ago

This is a big push for Apple to reduce upgrade times.

g-man
1 year ago

NPU equivalencies:

A14 = M1
A15 = M2
A16 = M3

so I can’t wait to hear the rationale for leaving out the above A chips from Artificial Apple Intelligence.

escargot
Reply to g-man
1 year ago

You really don't know? Those AI features require a large amount of RAM (at least 8GB), and only the iPhone 15 Pro has 8GB of RAM, whereas all M-series devices ship with at least 8GB. Nice try creating a scandal, though.

Commentz123
Reply to escargot
1 year ago

why are you so eager to defend a company that has been caught several times with their pants down?

escargot
Reply to Commentz123
1 year ago

Why are you so eager to spread misinformation and lies? If you’re going to come for them, at least come correct SMH.

Commentz123
Reply to escargot
1 year ago

You guzzled down what Apple says without hesitation, then assumed that has to be the truth. These are the same people who previously tried to limit Stage Manager on iPad to M1 only due to “limitations”, only to backtrack when they got busted. I don’t know if I should laugh at your childlike naivety or feel sad.

g-man
Reply to escargot
1 year ago

If the model used 8GB of RAM there wouldn’t be anything left for the system and apps, so clearly the models are smaller than that. They could also make even smaller quantized versions available for devices with less RAM, or for devices (like 8GB Macs) that are already perilously low on memory.

Also, they clearly knew they would be rolling this out soon, so why TF didn’t they put more RAM in the 14 Pro?
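The quantization argument above can be sketched with back-of-the-envelope math. A model's weight footprint is roughly parameter count times bytes per weight, so halving the bit width halves the RAM needed. The 3-billion-parameter figure below is a hypothetical example for illustration, not a confirmed size for Apple's on-device model:

```python
# Rough RAM footprint of a language model's weights under different
# quantization levels. Ignores KV cache, activations, and runtime overhead,
# so real usage is somewhat higher.

def model_ram_gb(num_params: float, bits_per_weight: int) -> float:
    """Approximate weight memory in gigabytes."""
    return num_params * bits_per_weight / 8 / 1e9

params = 3e9  # hypothetical 3B-parameter on-device model

for bits in (16, 8, 4):
    print(f"{bits:>2}-bit weights: ~{model_ram_gb(params, bits):.1f} GB")
# 16-bit weights need ~6 GB, while 4-bit weights fit in ~1.5 GB,
# which is the gap quantization is meant to close on low-RAM devices.
```

This is why a 4-bit quantized model could plausibly fit alongside the system and apps on a 6GB device, at some cost in output quality.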

escargot
Reply to g-man
1 year ago

The LLM AI features require a device with 8GB of RAM. Because yes, a device with 4 or 6 GB (like every iPhone except the 15 Pro) wouldn’t have enough for the LLM plus the system and apps.

The A14, A15, and A16 devices simply don’t have enough RAM, unlike their M-series cousins. There are also questions like power draw, memory bandwidth, etc. The M chips are a lot more souped up than the A-series ones.

g-man
Reply to escargot
1 year ago

I already spoke to these points. The 14 Pro should have shipped with 8GB since they knew this was coming, but even 6GB devices could use smaller quantized models.

And the A15 and 16 have faster NPUs than the M1.

Blargh
Reply to g-man
1 year ago

Here you go: there seems to be a clear performance and resource requirement for their on-device processing. An A14 is nowhere near an M1 in multicore performance. An A16 is closer but, unlike the A17 Pro and M1, never shipped with 8GB of RAM. The Neural Engine is not doing all the work.

I imagine you missed all the accusations of planned obsolescence back when they shoved full-fat upgrades of iOS onto older devices and tanked their performance. Ars ran a few articles on that. Could they put the on-device AI stuff on a 14 Pro? Sure, maybe? If they’re cool with how it performs and whatever effort’s required to make it work, maybe those devices will see some of it.

As for the next iPad mini, I’m confident it’ll support all of it because if it didn’t, that would be dumb.

g-man
Reply to Blargh
1 year ago

Citation that inference will be running on the CPU or GPU?

More likely it’s the RAM that’s the issue but various quantizations could be made available.

SOB
1 year ago

Hoping the next iPad mini will make this list.
