One of the main benefits of using Android Auto and CarPlay is the reduced distraction behind the wheel, as they allow drivers to keep their eyes on the road and still interact with apps on their mobile devices.
The voice command integration lets users handle the most common activities hands-free, including making and answering phone calls, sending texts, and reading incoming messages. Google Assistant and Siri read received texts aloud, since neither Android Auto nor CarPlay comes with a dedicated interface to display the message body.
As a result, drivers won't be tempted to pick up their phones to read a message, as the integrated digital assistants can read the text aloud.
Starting with a recent update, CarPlay users get even more advanced functionality. Siri can now describe the photos you receive in a text message, regardless of the app. If someone sends you a photo on iMessage, the digital assistant can describe what's in it, though I noticed the details often aren't as accurate as you'd expect.
Before this update, Siri only notified CarPlay users that "contact sent a photo." With this silent improvement, Siri tells you that "contact sent a photo with [description]." I tested the feature with a photo of my dog running after a ball, and Siri described it correctly: "Your wife sent a photo of a dog running after a ball," it said.
The feature doesn't seem able to identify people's names from the iPhone contact list or the photo gallery, so you'll only hear generic descriptions whenever someone sends you a photo.
The feature showed up for me and other users earlier this week, though some claim they've been seeing it for several months. Apple may have used a gradual rollout, though I suspect the company enabled it recently, after the release of iOS 17.
Siri likely uses information from Visual Look Up, a feature introduced in iOS 16 that allows the iPhone to identify and provide more information about pets, landmarks, plants, and other objects in your photos. For example, Visual Look Up can scan your pictures for certain foods and then search the web to provide a related recipe based on the detected ingredients.
Additionally, VoiceOver has offered similar capabilities since iOS 16, describing the people and objects that appear in a photo.
However, the feature appears to be new on CarPlay, where it makes sense from a distraction standpoint: drivers can use it to avoid looking at their phones. While the current descriptions aren't always spot-on, the iPhone maker will hopefully refine them further in upcoming updates.
I'm still seeing mixed reports on this feature's availability, so it may not be active on everybody's device yet.
Android Auto desperately needs similar capabilities, especially considering Google's focus on new-generation assistant features. However, Google Assistant has been badly broken on Android Auto, sometimes failing to perform even basic actions like starting a phone call.