Everything Apple Intelligence will do for you (so far)

June 14, 2024 at 15:31 | Computerworld


While the arrangement between OpenAI and Apple is attracting a lot of attention, Apple has put together a sizable collection of its own large language model (LLM) tools that will run on a compatible device or in its secure cloud, Private Cloud Compute.

Apple Vice President Craig Federighi calls Apple Intelligence “the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac.”

To achieve this, it draws on what your device knows about you, using on-device intelligence or, where necessary, the highly secure Private Cloud Compute system in the cloud. Apple says it works at all times to protect user privacy, which means your data stays protected unless you choose to use a third-party AI, such as ChatGPT.

In making these solutions, Apple has paid particular heed to creating tools that offer truly useful help. The critical idea is that they get things done for you without getting in the way of the easy user interaction you usually enjoy with your Apple product. 

Tools to help you write better

To help you write, Apple Intelligence can proofread and rewrite your text anywhere across your system, including within third-party apps. Apple Intelligence will also summarize meeting transcripts, long emails, and website content; pretty much any large block of text can be made bite-sized.

Be warned, for some of these functions Apple Intelligence might need to use ChatGPT, but you’ll be told if that is the case and can cancel the request rather than sharing your information with a third-party service provider.

Mail is getting better

We’ve written a little more about how the new Mail features work here.

The system works to figure out which of your incoming emails are most essential and places those emails at the top of your Inbox.

AI will also draft what it thinks are appropriate replies for you — you don’t have to use them, and you get to approve them before they are dispatched.

Meetings, now with AI assistants

Tap record when making a call or when inside a Note to capture audio recordings and transcripts. Once the call or meeting ends, Apple Intelligence will quickly generate a summary of the transcript.

Tools to help you stay focused

There’s a new automated Focus mode that reduces interruptions but is also intelligent enough to let important notifications break through. Apple Intelligence will also get to know which of your notifications matter to you most and make sure those are at the top of your notifications list. The idea is to optimize your attention so you can stay on top of things.

Making images

Apple’s on-device LLM engine will create original images for you based on a typed request. Usefully, it will also remove unwanted objects in an image on request. And a new Image Playground app lets you experiment with ideas and try different image styles to create your own images.

Photos gets better at helping you find your stuff

AI features in Photos include far more powerful and contextually aware Search results and the ability to create a Memories video based on such a search.

Introducing Genmoji

If, like me, you have trouble finding precisely the right emoji, or aren’t certain whether the ones you do choose carry a double meaning, salvation is at hand! Genmoji makes it possible to create completely original emoji on demand; just tell your Apple device what you want and up it will pop.

Wave your Image Wand

This feature needs an Apple Pencil. It works like this: open a Note, draw a circle where you want your generated image to appear, and Apple Intelligence will create a custom image that reflects the content around it.

Siri gets serious attention

We’re being promised lots of improvements in Siri; not only will it be able to better understand more complex or poorly articulated requests, but it also gains the kind of contextual understanding you need to figure out answers to complex questions such as “Show me the recipe Sacha sent me the other day.” 

That has several implications, including:

Siri knows what you are looking at and you can make requests that reference that, such as adding an address to your Contacts, or adding something to a note in a different app.

The assistant can also now answer questions about any of your Apple devices or operating system features, like an Apple Genius in your pocket.

Siri also now understands typed requests — double tap the bottom of the display and a keyboard pops up.

Sometimes your device might need to use ChatGPT to fulfill some requests; you will be told if that is the case and can cancel the request.

Apple has also given asking Siri questions a new vibe; when you do so, your device now will show a glowing light all around the borders of the screen. 

Is there more to come? Probably

It is likely there will be additional features in place by the time Apple Intelligence arrives in the fall software updates. That’s because developers can use App Intents to make features within their apps available across the system. Meanwhile, developers get Xcode’s new AI-powered code completion to help them work smarter.
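For developers wondering what that system-wide exposure looks like in practice, Apple’s App Intents framework works roughly like this: an app declares an intent type that Siri, Shortcuts, and (going forward) Apple Intelligence can invoke directly. The sketch below is illustrative only — the intent name, parameter, and dialog are invented, not from Apple’s examples:

```swift
import AppIntents

// Hypothetical sketch of an App Intent. Declaring a type like this
// lets the system surface an in-app action to Siri and Shortcuts.
struct SummarizeNoteIntent: AppIntent {
    // The name the system shows for this action.
    static var title: LocalizedStringResource = "Summarize Note"

    // A parameter Siri can ask for or infer; the name is illustrative.
    @Parameter(title: "Note Title")
    var noteTitle: String

    // Called by the system when the intent runs.
    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would look up the note and summarize its contents here.
        let summary = "Summary of \(noteTitle)"
        return .result(dialog: "\(summary)")
    }
}
```

Once an intent like this ships in an app, the same action becomes reachable from Siri requests and Shortcuts without the user opening the app — which is the mechanism behind the cross-app Siri requests described above.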

Apple is also thought to be pushing other genAI firms beyond OpenAI to offer their services on its platforms, while the company hopes to generate new income streams as developers build and make available additional LLMs on its platforms.

Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
