News

Project Astra uses multimodal AI to process both voice and visual inputs, giving users a more intuitive AI assistant.
Last year at Google I/O, one of the most interesting demos was Project Astra, an early version of a multimodal AI that could recognize your surroundings in real time and answer questions about them.
Google is rolling out a handful of tweaks to Gemini Live on Android, including improved video resolution when using the Project Astra-powered capability.
Before I first tried Google's Project Astra (three times at Google I/O 2024), a Google rep asked me not to be adversarial. I'd been asking questions about Astra's last training date and ...
We've been routinely burned by lofty AI promises for a couple of years now, but if Project Astra's functionality actually reaches users and works as well as what I saw at I/O, I think it could be ...
But Project Astra has at least one exciting advantage over ChatGPT that was obvious from the first demo we saw at Google I/O 2024. Unlike OpenAI, Google did not give us a live demo of Project Astra.
Google announced on Tuesday during Google I/O 2025 that Project Astra, the company's low-latency, multimodal AI experience, will power an array of new experiences in Search, the Gemini AI ...
If Project Astra sounds familiar, that’s because OpenAI demoed a similar feature for ChatGPT — powered by the new GPT-4o model — just a day ago.
Google described Project Astra as a “universal AI agent helpful in everyday life.” The company went on to show several examples of Project Astra being used in conjunction with a camera viewfinder.
I didn't expect Google Glass to make a minor comeback at Google I/O 2024, but it did, thanks to Project Astra. That's Google's name for a new prototype of AI agents, underpinned by the Gemini ...
One of the most impressive new announcements was Project Astra, a tool that can actually interact with the world through sight. Google also announced that it is rolling out Gemini 1.5 Pro ...