Google plans to release its prototype glasses, equipped with AI and AR capabilities, to a small set of selected users for real-world testing. The Project Astra prototype glasses run on Android XR, Google's new operating system for vision-based computing; Google wants glasses to be the "next generation of computing." The company still has nothing concrete to share about the actual product or when it will be released, but it did show a new demo of the prototype glasses combining Project Astra with AR technology. Astra gives Google an edge: it is launching the system soon as a phone app for a small group of beta testers, and using it on a phone is impressive enough to signal what's coming for AI apps.
Google's prototype glasses are essentially vaporware, and the company has no clear timeline for a consumer launch. For now, it is starting to let hardware makers and developers build different kinds of glasses, headsets, and experiences on Android XR. Project Astra works by streaming pictures of your surroundings, one frame per second, into an AI model for real-time processing; while that is happening, it also processes your voice as you speak. Some members of Google DeepMind also showed how Astra can read your phone screen. OpenAI has demoed similar vision capabilities in GPT-4o, which have likewise been teased for release soon, though it might take them a while to make that a reality.
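The frame-plus-voice streaming described above can be sketched in a few lines. This is purely illustrative: Google has not published Astra's actual pipeline or APIs, so every name here (`capture_frame`, `transcribe_chunk`, the queue-based fan-in) is a hypothetical stand-in for the real camera, speech, and model components.

```python
import queue
import threading

def capture_frame(i: int) -> str:
    return f"frame-{i}"      # stand-in for a real camera grab

def transcribe_chunk(i: int) -> str:
    return f"speech-{i}"     # stand-in for streaming speech recognition

def run_pipeline(seconds: int = 3) -> list[tuple[str, str]]:
    """Feed one video frame per second and concurrent audio chunks into a
    single event queue, mimicking the concurrent frame-and-voice streaming
    described for Project Astra."""
    events: queue.Queue = queue.Queue()

    def video_loop() -> None:
        for i in range(seconds):                 # one frame per second
            events.put(("frame", capture_frame(i)))

    def audio_loop() -> None:
        for i in range(seconds):                 # voice runs in parallel
            events.put(("audio", transcribe_chunk(i)))

    threads = [threading.Thread(target=video_loop),
               threading.Thread(target=audio_loop)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Drain the queue; a real system would batch these events into a
    # multimodal model call instead of returning them.
    return [events.get() for _ in range(events.qsize())]
```

The point of the sketch is the shape of the design, not the details: two independent producers (camera and microphone) feed one consumer, so the model can reason over what you see and what you say at the same time.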