Lightweight, but with a limited field of view

One of Google I/O's biggest revelations is that the company is officially back in the mixed reality game with its own smart XR prototype glasses. We haven't seen anything substantial from the search giant on the AR/VR/XR front for years, but with a roster of hardware partners lined up behind its XR platform, that finally seems to be changing.
After the keynote, Google gave me a very short demo of the prototype device we saw on stage. I only had a few minutes with it, so my impressions are unfortunately limited, but I was immediately struck by how lightweight the glasses were compared with Meta's prototype and Snap's augmented reality Spectacles. While both of those are quite bulky, Google's prototype was light and felt much more like a pair of normal glasses. The frames were a little thicker than the ones I wear, but not by much.
At the same time, there are notable differences between Google's XR glasses and what we've seen from Meta and Snap. Google's device has a display on only one side – in the right lens, which you can see in the image at the top of this article – so the visuals are more "glanceable" than immersive. Watching Google's on-stage demo at I/O, I noted that the field of view seemed narrow, and I can confirm that it feels much more limited than Snap's 46-degree field of view. (Google declined to share details about the field of view on its prototype.)
Instead, the display felt a bit like the front screen of a foldable phone. You can use it to glance at the time and notifications, along with small bits of information from your apps, such as the music you're listening to.
Gemini is meant to play a major role in the Android XR ecosystem, and Google walked me through a few demos of the AI assistant working on the smart glasses. I could look at a display of books or artwork on the wall and ask Gemini questions about what I was seeing. It felt very similar to the multimodal capabilities we've seen with Project Astra and elsewhere.
There were bugs, though, even in the carefully orchestrated demo. In one case, Gemini started telling me about what I was looking at before I had even finished my question, which led to an awkward moment of us stopping and talking over each other.
One of the most interesting use cases Google showed was Google Maps in the glasses. You can get a heads-up view of your next turn, much like Google's augmented reality walking directions, and look down to see a small section of the map on the ground. But when I asked Gemini how long it would take to walk to San Francisco from my location, it couldn't provide an answer. (It said something about a "tool output," and my demo ended very abruptly after that.)
I also really liked how Google took advantage of the glasses' onboard camera. When I took a photo, a preview of the image immediately appeared on the display so I could see how it turned out. I appreciated this because framing a photo on smart glasses is inherently unintuitive: the final image can vary widely depending on where the lens is positioned. I've often wished for a feature like this when taking photos with my Ray-Ban Meta smart glasses, so it was cool to see a version of it in action.
Honestly, I still have a lot of questions about Google's vision for XR and what any Gemini-powered smart glasses will actually look like. As with so many other mixed reality demos I've seen, it's clearly still very early. Google was careful to emphasize that this is prototype hardware meant to show off what Android XR is capable of, not a device it plans to sell anytime soon. So whatever smart glasses we eventually get from Google or its hardware partners could be very different.
What my few minutes with Android XR did show is how Google is thinking about bringing AI and mixed reality together. That's not so different from Meta, which sees smart glasses as key to long-term adoption of its own AI assistant. But now that Gemini is coming to nearly every Google product in existence, the company has a very solid foundation to actually pull it off.