As seen in the above video, you can have contextual conversations with Assistant to follow up on your first query rather than repeating the “OK Google” trigger at every step.
In addition, Assistant will be able to open specific apps when you simply say something like “Open Spotify”, though more complex commands still require the trigger, such as “Hey Google, open Google Photos. Show me the ones on the beach.”
Pixel 4 and 4 XL will be the exclusive phones to get this for a short time as Google tries to incentivise owning its latest hardware. But the features aren’t limited by hardware, so we expect them to trickle down soon to older Pixel phones and other Android devices.
You can also make in-app contextual commands, such as asking your phone to send the photo you are viewing in Google Photos to a specific contact, or saying “Hey Google, reply” while in a chat to dictate your reply hands-free.
The new features all show Google trying to normalise talking to your phone by making voice commands that are genuinely helpful and natural to say in the moment.