Apple’s new artificial intelligence leaves out more than 90% of current iPhone users


After almost two years of waiting, Apple has finally gotten on the generative artificial intelligence (AI) train, with the news announced by the company’s CEO, Tim Cook, at the opening presentation of its Worldwide Developers Conference (WWDC). In Cook’s own words at the start of his speech, “I am excited about the new and profound capabilities that we are going to present, which we hope will make our devices more intelligent and useful than ever.” However, the details that Apple later gave in the keynote and on its website limit and qualify the scope of this technological leap, which will only be available to current iPhone users who own one of the two most powerful models (the 15 Pro and the 15 Pro Max) and which, in 2024, will only reach those who speak and write in US English.

The new Apple Intelligence system and the long-awaited overhaul of Siri, which will arrive in the fall, in a testing phase, with the new iOS 18 operating system, will leave out well over 90% of current iPhone users unless they buy a new smartphone. In the absence of official figures from the company itself, estimates indicate that there are about 1.5 billion active iPhone users worldwide, while various analysts estimate that the iPhone 15 Pro and Pro Max (the best-selling phones in the world) could already be close to 100 million units. That amounts, therefore, to less than 7% of all Apple phone users; and from that percentage we would have to exclude, for now, the non-American public.

Only these two iPhones, out of the 24 models compatible with the new iOS 18, will be able to run Apple Intelligence functions, such as summarizing all types of documents and notes, suggesting responses to messages, correcting the style of texts, transcribing phone calls (features that other technology giants are already offering), or even creating new emoji and images from scratch with a few simple instructions. The rest of the iPhones will also lack access to the renewed Siri, which will be able to understand requests much better, keep the thread of a conversation with the user, or ask ChatGPT to resolve any question beyond its reach. The explanation Apple gave in its presentation on Monday is that, in order to add this “personal intelligence” (as the company calls it, rather than artificial) while maintaining privacy, immediacy, and simplicity of use, the computing must be done on the phone itself; and only models with A17 Pro processors (or higher) are capable of running the large language models (LLMs) that power generative AI.

Beyond phones, Apple Intelligence and the new Siri will also reach tablets and computers that have Apple’s own processors (M1 or higher). In the case of iPads, this will benefit five of the 15 models compatible with the iPadOS 18 operating system, while 13 of the 18 computer models compatible with macOS 15 Sequoia will be able to fulfill Cook’s promise of “being smarter and more useful than ever.” Apple began selling Macs with these processors in 2020, iPad Pros in 2021, and iPad Airs in 2022. All iPad mini models and the standard iPad are left out of the new smart functions.

Nor will this leap to generative AI reach the Vision Pro mixed reality headset, despite the fact that it is one of the company’s most modern and sophisticated devices and is equipped with an M2 processor, with more than enough capacity.

Promising functions, but in the medium term

Although it was also highly anticipated on the Apple Watch (where the assistant works worse than on the iPhone) and on the HomePod (which is controlled only by voice with “Hey, Siri,” just like the speakers that respond to “Alexa” or “OK, Google”), the new Siri will not arrive for the moment on watches, smart speakers, or Apple’s video player.

The company has not provided any information on whether the current models of the various platforms left outside the new AI system will be able to access any of its functions in the future, either by processing on the device itself or by accessing the same private cloud computing network that Apple has just presented and that it will use to handle the most sophisticated generative AI requests. There is also no detail about when these functions will reach other languages such as Spanish; the technology giant warns only the following: “Apple Intelligence will be available in beta version this fall in US English. Some additional features, software platforms and languages will be added next year.”

The absence of the new Siri from smart watches and speakers has disappointed some of the most experienced analysts in the Apple world. Jason Snell, of Six Colors and a former editor of Macworld magazine, had written hours before the keynote: “When I go out for a walk I usually go with just my Apple Watch and my AirPods. In theory I’m covered by Siri, but in general I never talk to it because it’s not reliable (on the watch).” Mark Gurman, of Bloomberg, who leaked all of the keynote’s key announcements in detail days in advance and who had predicted that Apple’s bet on AI would have the advantage of eventually reaching all its devices, now warns that the process will take years to produce results and be completed and that, for now, “it won’t do much more than give a big boost to sales of this year’s new iPhones.”
