Apple Intelligence: A Promising Yet Underwhelming Leap


Apple’s introduction of “Apple Intelligence” at WWDC 2024 set high expectations, with the company teasing transformative capabilities that blend artificial intelligence with seamless integration across its ecosystem. However, nearly six months later, we have only seen glimpses of its potential, with the full rollout scheduled for March 2025.

Here’s a look at the key features available so far, and whether they live up to the hype!

The Vision Pro: Not So Much for Apple Intelligence

First things first, Apple’s most futuristic device, the Vision Pro, is curiously absent from the lineup of devices receiving Apple Intelligence updates. While it’s still early days for Vision Pro, it’s surprising that the platform is left out of this feature set entirely.

Writing Tools: A Mixed Bag

Apple Intelligence’s writing tools, available on iPhone, iPad, and Mac, aim to enhance user productivity with generative AI. These tools let you refine text by selecting modes like “friendly,” “professional,” or “concise,” each making adjustments to tone or brevity.

While features like proofreading (for capitalization and punctuation) and summarizing long documents sound promising, limitations persist. The summary and table-creation functions struggle with large files, and the tools’ repetitive interface is unintuitive. They work offline and return results quickly, but their practical value remains minimal.

Writing tools work as intended but lack the polish and adaptability to feel transformative. For now, their impact is more incremental than groundbreaking.

Image Playground

Image Playground offers users a creative space to generate cartoon-style images using generative AI. In the Image Playground app, you can describe any scene you want or use pre-set suggestions for inspiration. Notably, it can incorporate familiar faces from your Photos app and add themes, props, or backgrounds.

It works entirely on-device and generates results quickly, though always in a cartoon style to avoid the complexities of photorealistic imagery. While it won’t create potentially controversial content, clever combinations of elements can sometimes bypass restrictions.

Despite its quirky appeal, the feature’s practicality remains questionable. As it’s still in beta, unexpected outcomes are common, and many users may find themselves experimenting briefly before moving on.

Notification Summaries: A Hit or Miss

Apple Intelligence’s notification summaries aim to streamline notifications by condensing multiple alerts from an app into a single, concise summary. However, in practice, the feature struggles to deliver meaningful value.

Summaries often fail to improve clarity or usability, with condensed notifications providing little additional insight. In general, they seem unnecessary or even comical, spawning memes about their inaccuracy. While the feature may occasionally work as intended, its inconsistency and limited real-world benefit led us to turn it off entirely.

Recording Summaries: A Promising But Incomplete Feature

Apple introduced a feature with great potential: recording summaries. On paper, it’s just fantastic—record your conversations, transcribe them, and receive summaries for later review.

However, it’s not quite as simple as expected. Rather than being built into the Voice Memos app, the feature is tied to the Notes app, and that’s where the confusion starts. After recording a call (with a warning played to the person you’re talking to), the transcription is quite accurate, even identifying individual speakers. But when it comes to summarizing, things get murky.


The first summary, shown in the preview, works well, but when you open the note, there’s a second, sometimes less accurate summary. Despite its promise, the feature feels disjointed, and its implementation in Notes is awkward. Apple could make this far more intuitive by integrating it into the Voice Memos app, so that recording and summarizing lectures or meetings happens in one streamlined process.

Genmoji: Fun but Niche

The Genmoji feature lets you create custom emojis with generative AI. In apps like Messages, you can describe your desired emoji, hit “create,” and within seconds, a unique AI-generated emoji appears. While fun and quirky, its practical use is limited. It handles benign requests well, like an “avocado with a hat,” but may refuse graphic or extreme ideas.

For those who enjoy personalized expression, Genmoji offers creativity and novelty. Ultimately, it’s an amusing but niche feature that will resonate with a specific audience while leaving others indifferent.

Priority Notifications: A Step Forward

Priority Notifications takes aim at Apple’s long-standing struggle with notification management. Designed to work seamlessly with Focus modes, the feature surfaces important notifications while suppressing less relevant ones. It extends to the default Mail app, where AI prioritizes high-importance messages and categorizes them for quick access. While useful for those using Apple’s native Mail client, the feature mirrors capabilities Gmail has offered for years.

The Reduced Notifications mode aims to filter out non-essential updates, allowing you to focus on what truly matters. Though not revolutionary, it’s a welcome improvement for managing digital clutter!

Photos App: A Small but Useful AI Addition

The Photos app has been heavily criticized, with many users frustrated by recent changes. However, Apple has introduced a useful AI feature in response: the background object removal tool.

The tool allows you to effortlessly remove unwanted objects from your photos. When you select the Clean Up tool, the app highlights the object it detects, and a single tap removes it, seamlessly filling in the background using generative AI.

It works best with repeating patterns or simple backgrounds, making it more effective than some alternatives. For those seeking a quick and efficient way to clean up their photos, this addition stands out as one of the more practical tools available in the updated Photos app.

Visual Intelligence

The Visual Intelligence feature, exclusive to the iPhone 16 lineup, offers a sleek, full-screen viewfinder with a simple, intuitive interface. Long-pressing the Camera Control button quickly brings up the Visual Intelligence view, featuring three key buttons: Shutter, Ask, and Search. The “Ask” button lets you query ChatGPT about what the camera sees, while “Search” performs a reverse image search through Google.

The Shutter button captures a photo and gives you the option to search or ask again. While the UI is polished and the functionality works swiftly, it doesn’t introduce anything truly new. Similar features, such as Bixby Vision’s object recognition, have been available since the Galaxy S8 in 2017. Apple’s version is undoubtedly more accurate and better integrated thanks to ChatGPT. Is it a groundbreaking new feature? No. But is it more accurate and more capable than before? Yes.

ChatGPT Integration

Siri’s integration with ChatGPT is another notable update. Users can now make complex requests, such as planning a travel itinerary, and Siri will offer to hand the query to ChatGPT for a more detailed response.

Apple ensures privacy by masking user data, so OpenAI doesn’t collect personal info. Users can also log in to their ChatGPT account to track queries, access advanced models, or exceed free limits, with an option to upgrade to ChatGPT Plus.

This integration expands Siri’s functionality, making it smarter and more versatile for complex tasks. However, outside of easier text input, Siri’s functionality hasn’t dramatically changed yet.

In Conclusion: Apple’s Intelligence Is Still Finding Its Footing

Apple Intelligence was positioned as a transformative update, but so far, the results have been somewhat underwhelming. The writing tools are decent but not revolutionary, while features like notification summaries and generative emojis feel more like novelties than essential additions.

The AI-based photo ‘Clean Up’ tool, on the other hand, is one of the few truly useful and impressive features. With full functionality expected by March 2025, Apple still has time to refine these features and introduce new ones that live up to the initial hype. For now, Apple Intelligence remains a work in progress—an exciting glimpse into the future, but not yet the game-changer it promised to be.