Google has beaten Apple to the AI punch. We knew it was likely, what with the news that some of the most ambitious Apple Intelligence features had been delayed and the resulting Siri executive shakeups. But now Google is rolling out more contextual abilities to Gemini on select Android devices.

Google confirmed that it began rolling out Gemini’s Project Astra-powered camera and screen-share capabilities this week. They’re available to those paying for Gemini Advanced as part of the Google One AI Premium plan. The features let you share your screen as if you’re broadcasting it to a class, or use the live camera to get Gemini’s input on the stuff around you. You can take Gemini into the real world and use it to identify things, a la Google Lens, or get its guidance on your own projects; the example Google uses is having it help you shop for tiles at a store.

There are only a few reports scattered around the internet from users who can invoke the features thus far, including this one from a user on Reddit, which makes the rollout seem more limited than it is. I’ve confirmed with Google that the rollout is underway, though I’ve been unable to access it myself. The first user to report using it isn’t even on a Google Pixel device; they’re on a Xiaomi smartphone. 9to5Google has managed to get Gemini Live’s screen sharing working, enough to show it off in a YouTube video.

A screenshot from a video Google put together to show Gemini Live in action with camera access. © YouTube

Gemini’s ability to “see” the world around it offers an edge. The iPhone has what Apple Intelligence calls “Visual Intelligence,” which uses the camera to identify objects and text. Its current implementation is pretty limited and does not converse with you or engage in a back-and-forth about the colors of the tile the way Gemini is shown doing in Google’s demonstrations. It’ll be interesting to see how Google hones these abilities. Heck, maybe it will even finally convince users of their utility, which would only further drive the stinger into Apple for trailing behind.

I saw Project Astra last year during Google’s annual developer conference, where I experienced the camera identification features in action. It didn’t move the needle for me then, but it’s decently impressive that Google has shipped Gemini’s ability to see through the camera lens less than a year after debuting it, especially when you consider how badly Apple has apparently fumbled similar features.

I look forward to the live video and screen-sharing capabilities rolling out to my Pixel 9 Pro so I can get a feel for how I’ll apply them to my real life. Google didn’t specify which devices are getting the features, but I’d check your Pixel or Samsung Galaxy device in the coming weeks.
