Google Gemini Live Adds Smart Screen and Camera Tricks

Gemini Live now reads phone screens and camera views with AI features.

  • Only for Gemini Advanced users, rolling out this month.
  • Helps with tasks like identifying objects or picking colors in real time.

Google’s Gemini Live just got cooler with new AI features that let it see and understand what’s on your phone screen or through your camera. Announced by Google’s Alex Joseph, this update is part of “Project Astra,” Google’s long-running effort to build a more capable, real-world-aware AI assistant. It’s only for people with the Google One AI Premium plan, and it’s rolling out gradually through March 2025.

With the screen-reading trick, you can ask Gemini Live about anything on your phone—text, pictures, or apps—and it’ll answer based on what it sees. The camera feature is even handier: point your phone at something, and Gemini can tell you what it is, suggest ideas, or help with tasks like choosing paint colors for a craft project. A Reddit user with a Xiaomi phone was the first to show it off, and 9to5Google confirmed in a video that it works as promised.

Google’s ahead of the pack here. Amazon is working on a smarter Alexa Plus, but it isn’t widely available yet. Apple’s upgraded Siri is delayed, and Samsung’s Bixby isn’t keeping up—especially since Gemini is built right into Samsung phones. Google first teased Project Astra almost a year ago, aiming to build AI helpers that can see and converse naturally. These new features are a big step toward that goal, making your phone’s AI feel more like a real companion.