Apple’s AI play, with a difference: visionOS, Journal, video calls, AirPods

Apple has, across the next editions of iOS, iPadOS and macOS invoked AI to a great extent. (HT Photo)


It was never likely that Apple would say “AI” in a keynote as often as Google or Microsoft lately do, or launch a chatbot just because everyone else has. But for anyone who feels the company isn’t focusing on artificial intelligence (AI) as much as other big tech firms, the WWDC 2023 announcements should put those doubts to rest. Across the next editions of iOS, iPadOS and macOS, as well as multiple extensively reworked apps, Apple has invoked AI to a great extent. And then there is the Apple Vision Pro augmented reality (AR) headset, which requires considerable neural network smarts too.


The next operating system for the iPhone, iOS 17, rolls out later this year and reworks quite a few of Apple’s own apps. One of them is the Phone app, and specifically the Voicemail feature. If you don’t use voicemail already, this might convince you to: a live transcription will be generated as a voicemail message is being left for you, and if it turns out to be important, you can pick up the call at any point during the message delivery and transcription. Apple says the transcription happens on the device.
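Apple has not shared how the Phone app does this internally, but its public Speech framework already supports fully on-device transcription, which gives a feel for the mechanics. A minimal sketch, assuming you have a recorded audio file’s URL and the user has granted speech-recognition permission:

```swift
import Speech

// A minimal sketch of on-device transcription using Apple's public Speech
// framework. Illustrative only; this is not how Live Voicemail is actually
// implemented inside the Phone app.
func transcribeOnDevice(audioURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: audioURL)
        // Keep the audio and the transcription on the device, never a server.
        request.requiresOnDeviceRecognition = true

        _ = recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```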


If any apprehension still remained, natural language models are in play across the board: improved autocorrect, with word- and sentence-level corrections and a sharper focus on grammar; a new transformer-based speech recognition model for voice typing; transcription of voice messages in iMessage; the new Journal app, which uses context for smart suggestions; the StandBy mode, which adapts to time and context; and machine learning that synthesises extra frames for slow-motion wallpapers in iPadOS’s upcoming lock screen customisation.

Is this the end of the era of “ducking”? We’ll know soon enough.
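Apple’s new autocorrect runs an on-device transformer language model, the details of which are not public. As a purely illustrative stand-in, the toy bigram counter below shows the basic idea of ranking corrections by context rather than by a static dictionary; every name in it is invented for this sketch:

```swift
// A toy illustration of language-model-driven correction: rank candidate
// words by how often they follow the previous word in a small corpus.
// Apple's iOS 17 autocorrect uses an on-device transformer; this bigram
// counter only sketches the idea of context-aware ranking.
func buildBigrams(from corpus: String) -> [String: [String: Int]] {
    var counts: [String: [String: Int]] = [:]
    let words = corpus.lowercased().split(separator: " ").map(String.init)
    for (prev, next) in zip(words, words.dropFirst()) {
        counts[prev, default: [:]][next, default: 0] += 1
    }
    return counts
}

func suggest(after word: String, candidates: [String],
             bigrams: [String: [String: Int]]) -> String? {
    // Pick the candidate most frequently seen after `word`.
    candidates.max { (bigrams[word]?[$0] ?? 0) < (bigrams[word]?[$1] ?? 0) }
}

let bigrams = buildBigrams(from: "what the duck what the heck said the duck")
print(suggest(after: "the", candidates: ["duck", "heck"], bigrams: bigrams) ?? "-")
// Context decides: "duck" wins because it follows "the" more often here.
```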

Apple’s AirPods wireless earbuds will soon add Adaptive Audio, which uses machine learning to read a user’s present (and often rapidly changing) environment and dynamically blend transparency with active noise cancellation. If you are on public transport, for instance, just enough transparency (for those specific frequencies) will be let through, so you don’t miss any announcements.

How well Adaptive Audio copes with different noise levels and noise compositions, we will only know once the feature rolls out and gets extensive real-world use. In theory, if someone comes over to speak with you, their voice will filter through while much of the ambient din stays blocked out. At least that’s the premise.
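The model behind Adaptive Audio is not public, so what follows is only a toy sketch of what the control logic might resemble, with an invented NoiseProfile type and hand-picked thresholds standing in for the real on-device classifier:

```swift
// A toy stand-in for Adaptive Audio's blending logic. The real feature uses
// on-device machine learning to classify the environment; the NoiseProfile
// struct and thresholds here are invented to illustrate the idea of mixing
// transparency and noise cancellation on a single dial.
struct NoiseProfile {
    let ambientLevelDB: Double   // overall loudness of the surroundings
    let speechLikely: Bool       // e.g. an announcement or a nearby voice
}

/// Returns a blend in 0...1, where 0 is full transparency and 1 is full ANC.
func cancellationBlend(for profile: NoiseProfile) -> Double {
    // Let speech through regardless of how loud the background is.
    if profile.speechLikely { return 0.2 }
    // Otherwise scale cancellation with loudness: quiet rooms need little,
    // a train carriage at ~85 dB gets close to full cancellation.
    let normalized = (profile.ambientLevelDB - 40) / 50
    return min(max(normalized, 0), 1)
}

let carriage = NoiseProfile(ambientLevelDB: 85, speechLikely: true)
print(cancellationBlend(for: carriage))  // 0.2, so announcements stay audible
```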

The visionOS-driven Apple Vision Pro headset leans heavily on AI and machine learning to deliver the immersion, privacy and breadth of experience it promises. The Optic ID technology, the headset’s answer to an iPhone’s Face ID biometric recognition and authorisation, will require complex algorithms to process iris data on the device. It will unlock access to apps on visionOS, and also authenticate App Store purchases and Apple Pay transactions.
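Apple’s pattern with Face ID and Touch ID suggests apps will reach Optic ID through the same LocalAuthentication framework, with the system handling the iris scan itself. A sketch under that assumption, with the localizedReason string as our own placeholder:

```swift
import LocalAuthentication

// A sketch of gating app content behind the device's biometrics using the
// public LocalAuthentication framework. On visionOS, the assumption is that
// the same evaluatePolicy call that triggers Face ID on an iPhone triggers
// an Optic ID iris scan instead; raw biometric data stays on the device.
func unlockWithBiometrics() {
    let context = LAContext()
    var error: NSError?
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown")")
        return
    }
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your journal") { success, err in
        print(success ? "Authenticated" : "Failed: \(String(describing: err))")
    }
}
```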

Apple confirms that all camera data collected by the Apple Vision Pro headset is also processed on device. Think about it: that is the data collected by 12 cameras, five sensors and six microphones on the headset.

Last but not least is the Journal app, which arrives with iOS 17. As the name suggests, it is a journaling app, which the tech giant positions as a wellbeing companion to the Fitness, Sleep and Breathe apps. It will use on-device machine learning to draw on a user’s contacts, photos, music, location data and more to curate personalised suggestions, though which of those sources can be accessed remains fully under the user’s control.
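Apple has not detailed Journal’s curation pipeline, but the user control it describes can be modelled simply: suggestions are drawn only from sources the user has left switched on. A toy sketch, with the DataSource and Suggestion types invented purely for illustration:

```swift
// A toy model of the user control described above: suggestions are curated
// only from data sources the user has explicitly left enabled. These types
// are invented; the actual Journal app's pipeline is not public.
enum DataSource: CaseIterable { case contacts, photos, music, location }

struct Suggestion {
    let title: String
    let source: DataSource
}

func curate(_ candidates: [Suggestion],
            enabledSources: Set<DataSource>) -> [Suggestion] {
    candidates.filter { enabledSources.contains($0.source) }
}

let candidates = [
    Suggestion(title: "Dinner with Priya", source: .contacts),
    Suggestion(title: "Beach photos from Sunday", source: .photos),
    Suggestion(title: "Morning run playlist", source: .music),
]
// The user has switched off contacts-based suggestions.
let visible = curate(candidates, enabledSources: [.photos, .music, .location])
print(visible.map(\.title))  // ["Beach photos from Sunday", "Morning run playlist"]
```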

It was never likely that Apple would announce a chatbot along the lines of OpenAI’s ChatGPT or Google’s Bard at WWDC 2023, however much the conversation on social media seemed to hope for one. Apple’s path was always going to be different: one where AI lends an underlying smartness to the experience of using an app or a feature; one where AI isn’t the centre of attention but a means to an end. Going by the examples above, it is safe to call the mission a success, at least as far as laying the foundations is concerned.



