Why you shouldn't buy the iPhone 16 for Apple Intelligence
In August, my friend and colleague David Gewirtz explained why he considers the upcoming iPhone 16, with its focus on iOS 18 and Apple Intelligence, an essential upgrade. While I value David’s perspective, I have a different take.
Also: 6 AI features iPhone users can expect first on iOS 18 (and what’s coming later)
David argues that the incorporation of artificial intelligence (AI) in iOS 18 makes the iPhone 16 a necessary upgrade, emphasizing the potential of Apple Intelligence to revolutionize our interaction with devices. While I agree that Apple Intelligence has long-term potential, I’m not convinced that its first iteration will deliver the game-changing usability that many anticipate.
If anything, the iOS 18.1 betas with Apple Intelligence features were underwhelming at best (not to mention the many iOS 18 bugs that have now been reported).
The annual upgrade ritual
Every year, my wife and I eagerly await the release of the new iPhones. Being part of Apple’s Upgrade Program, we return our devices, reset our loan with Citizens Bank, and acquire the latest model. Over the past few years, I have opted for the Pro Max, and my wife has chosen the base model. The expected annual improvements have been incremental but appreciated.
Also: The best phones to buy in 2024
Despite the buzz around the iPhone 16’s new features and the integration of Apple Intelligence, however, several concerns dampen my enthusiasm for upgrading this year.
Apple Intelligence: A significant, yet incomplete, leap forward
Apple Intelligence represents a significant leap in on-device AI capabilities, bringing advanced machine learning and natural language processing directly to our phones. Unlike typical iOS or MacOS feature upgrades, Apple Intelligence loads a downsized member of Apple's Foundation Models family: a home-grown large language model (LLM) with approximately 3 billion parameters.
While impressive, this is tiny compared to models like GPT-3.5 and GPT-4, which are reported to have hundreds of billions of parameters. Even the smallest version of Meta's open-source Llama 3, which you can run on a desktop computer, has 8 billion parameters.
Also: I broke Meta’s Llama 3.1 405B with one question (which GPT-4o gets right)
The iOS 18.1 Developer Beta, released at the end of July, introduced the first Apple Intelligence features, but they’ve been modest so far. These include Writing Tools, which allow users to rewrite emails, texts, or letters in different tones, proofread content, and summarize or format it with tables or bullet points.
Another feature provides webpage, email, or text summaries — a time-saver that cuts straight to the content. Users can also ask the Photo app to search for specific images, such as those showing someone holding a phone.
However, the integration of Apple Intelligence is not without challenges. The model, when running, may occupy between 750MB and 2GB of RAM, depending on how effective Apple's memory compression technology is. Allocating that much memory to a core OS function that won't always be in use means parts of the model must be dynamically loaded in and out of memory as required, introducing new system constraints and potentially putting additional stress on the CPU.
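Some back-of-the-envelope arithmetic shows why a roughly 3-billion-parameter model could plausibly land in that 750MB-to-2GB range. The sketch below uses the parameter count from Apple's disclosures; the specific bit widths are illustrative assumptions about quantization, not figures Apple has published.

```python
# Rough memory-footprint estimate for the on-device model's weights.
# PARAMS comes from the ~3B figure cited above; the bits-per-weight
# values are assumed quantization levels, not confirmed Apple specs.
PARAMS = 3e9

def footprint_gb(bits_per_weight: float) -> float:
    """GB needed to hold the weights alone (ignores KV cache and activations)."""
    return PARAMS * bits_per_weight / 8 / 1e9

for bits in (16, 4, 2):
    print(f"{bits}-bit weights: ~{footprint_gb(bits):.2f} GB")
# 16-bit: ~6.00 GB, 4-bit: ~1.50 GB, 2-bit: ~0.75 GB
```

Note how aggressive 2- to 4-bit quantization brackets the reported 750MB-2GB working set almost exactly, while uncompressed 16-bit weights would be untenable on an 8GB phone.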
Also: How to run dozens of AI models on your Mac or PC – no third-party cloud needed
More advanced AI features, such as Genmoji for creating custom emoji, the Image Playground for on-device image creation, and ChatGPT integration, are expected to roll out in October. Siri has received a minor UI update and a more conversational tone, but significant improvements are also expected in October, some even as late as January.
New hardware
Earlier, I discussed how older, and even current-generation, iOS devices aren't powerful enough to handle on-device generative AI tasks. The base iPhone 15, with only 6GB of RAM, would struggle to meet the demands of Apple Intelligence as it evolves and becomes more deeply integrated into iOS, core Apple applications, and third-party apps. Older iPhones have 6GB of RAM or less and are not eligible to run Apple Intelligence in current iOS 18.1 builds.
Aside from the new iPhone 16 lineup, only the iPhone 15 Pro and iPhone 15 Pro Max, thanks to their 8GB of onboard RAM, can run Apple Intelligence, and even on those devices it is currently a beta feature that can be enabled or disabled in Settings.
Also: Does your iPhone have enough space for Apple Intelligence? How to find out
The base iPhone 16 features the A18 processor with 8GB of RAM, while the iPhone 16 Pro models sport the A18 Pro, also with 8GB of RAM. This increase in memory is crucial given the demanding nature of Apple Intelligence features. Whether it will fully address performance concerns once Apple Intelligence is fully released, however, remains to be seen.
Interestingly, despite these hardware upgrades, Apple kept prices similar to those of the iPhone 15 series, with the base iPhone 16 starting at $799. The iPhone 16 Pro, however, starts at $999, a $100 increase from its predecessor, likely due to the additional storage and upgraded components. The Pro models also introduced Wi-Fi 7 connectivity, a new telephoto lens, and larger screens — 6.3 inches for the Pro and 6.9 inches for the Pro Max.
Also: Apple Intelligence will improve Siri in 2024, but don’t expect most updates until 2025
Despite these upgrades, the iPhone 16 may still face challenges due to design cycles that didn’t fully account for the scope of Apple Intelligence’s capabilities. As a result, users may experience suboptimal performance and a less seamless user experience, especially as more AI features roll out in subsequent updates.
Why you shouldn’t buy the iPhone 16 for Apple Intelligence
Besides memory concerns, AI processing demands a lot of power and additional computing resources. Without significant advancements in battery and power management technology, the likely result is increased battery drain, reduced battery lifespan, and more frequent charging. The extra processing power needed to run on-device LLMs could also strain the CPU, causing the device to heat up and affecting its overall performance and reliability.
Also: Does this iPhone feature actually help protect your battery? The numbers don’t lie
For these reasons, I see the iPhone 16 (and potentially even the iPhone 17) as a transitional product in Apple’s journey toward on-device AI.
How Apple Intelligence will likely evolve
Apple’s AI capabilities are expected to improve significantly in the coming years. By 2025, we may see more advanced and dependable integration of Apple Intelligence not only in mobile devices and Macs, but also in products like the Apple Watch, HomePod, Apple TV, and a consumer-oriented version of the Vision headset.
To extend Apple Intelligence to these less powerful devices, Apple might lean on cloud-based resources, as it is already doing with Private Cloud Compute, which runs secure Darwin-based servers in Apple's data centers for more advanced LLM processing. Fully built-out data center capabilities and partnerships with companies like OpenAI or Google could carry the rest of the load.
Also: Best early Prime Day Apple deals to shop in October 2024
Alternatively, Apple could consider a distributed or “mesh” AI processing system, where idle devices within a household or enterprise can assist less powerful ones with LLM queries.
Apple could achieve this by equipping MacOS, iOS, and iPadOS with Apple Intelligence and the on-device LLM as planned. Subsequent changes could enable all devices to communicate their generative AI capabilities and idle processing state. This would allow them to act as proxies for each other’s Apple Intelligence requests.
Enterprises could also employ a mobile device management solution to give managed devices access to the on-device LLMs of business Macs. For mobile users, iPhones or Macs could serve as proxies for Apple Watch or HomePod requests. We may even see a more powerful Apple TV, with more onboard memory and processing power, acting as an Apple Intelligence "hub" for every Apple device in a household.
Imagine your iPhone using the unused processing power of your Mac or iPad, all equipped with on-device LLMs, to tackle complex AI tasks. This would increase the accessibility of AI features across Apple’s product range.
I’m still optimistic
Despite the hype around Apple Intelligence, there are many other reasons to consider upgrading to the iPhone 16, like improvements in camera quality, display, and overall performance. The iPhone 16 features better sensors, enhanced computational photography, and superior video capabilities. The display has also seen improvements in brightness, color accuracy, and refresh rate, making it a better device for media consumption and gaming.
Also: In a surprise twist, Meta is suddenly crushing Apple in the innovation battle
If, however, you’re considering the iPhone 16 solely for its AI capabilities — which are still evolving and unlikely to deliver the expected performance touted in Apple’s WWDC 2024 keynote — you might want to manage your expectations.
This article was originally published on June 28, 2024, and updated on September 30, 2024.