For a presentation that stretched across two hours, the I/O 2024 keynote was short on the kind of impressive demos Google’s developer conference was known for in years past. Very few of the teasers the company shared on Tuesday generated the kind of excitement that its first demo of what would eventually become the Pixel’s Magic Eraser tool did back in 2017.


That is, until DeepMind CEO Demis Hassabis took to the stage to announce Project Astra. In the two-minute demo that followed, Google showed off a multi-modal AI assistant that could see, hear, converse, and, most impressive of all, remember. In the final moments of the demo, that last capability was put on display when a person off-screen asked the software if it recalled seeing their glasses. The audience erupted into applause when the assistant, powered by Google’s Gemini 1.5 Pro model, replied that it had spotted a pair of glasses on a desk only moments before.


At the end of the demo, Hassabis announced, to the surprise of everyone in the audience, that Google had a live demo of Project Astra to show attendees after the keynote. Following the final address, I got to see the assistant in action. What I saw was impressive, but also an indication that Google still has a lot of work ahead of it before Project Astra is reliable enough to ship to consumers.

What happens when a German shepherd, mouse and flamingo walk into an AI demo?

The first part of the press demo saw Google showing off Gemini’s alliteration skills. A Google employee placed a few plushies, including one of a banana and another of a hotdog, in front of a camera, and asked the assistant questions about the objects. I’ll admit some of the responses were clever. Of the hotdog, for instance, the software said it could be part of a “neat nosh.” To nosh, I later learned, means to eat a snack. Full points for creativity.

Later in that same demo, the employee leading the showcase asked Gemini to say something about the nutritional value of the items on the table. Rather than say anything substantive, Gemini fell back on a platitude, noting that “colorful” foods are a great way to eat healthily. The employee tried to prompt Gemini for a meatier answer, but wasn’t successful. They then admitted that restricting Gemini to alliterative responses might have tripped the AI up.


Unfinished project, but Google projects confidence

One thing the two employees running the demo were quick to remind us throughout was that Project Astra isn’t a finished product. That might seem like a strange admission, but if I’m being honest, it was refreshing to see Google avoid trying to hide Gemini’s imperfections by staging an overly polished demo. Indeed, the software made plenty of goofs during the 10 or so minutes I got to see it in action.

For example, toward the end of the demo, one of the Google employees asked Gemini to memorize the names of three animal plushies she put in front of the camera. In order, they were Sam the German shepherd, George the mouse and Lily the flamingo. She then asked Gemini questions about the stuffed animals, including one about the order in which she had presented them, and here the software ran into trouble: it didn’t get the order of the plushies right.

“George was the first friend you introduced to me,” Gemini said, with typical AI confidence. To its credit, the software admitted it was wrong when the employee pointed out the error. “Yes, you’re right. Sam was the first one.”

Is Astra the ultimate Pictionary partner?

But even taking those errors into account, I did feel I was seeing a glimpse of the future. Maybe not one that involves an artificial general intelligence, but at least something that could be helpful to millions of people, particularly those with disabilities.

My favorite part of the demo saw the two employees play Pictionary with Gemini. One of them began drawing a stick figure. “That’s a great stick figure,” Gemini proclaimed on seeing the drawing. “It’s very complimentary,” the doodling employee noted. They then added a skull emoji so that the stick figure was holding it, and asked Gemini to take a guess. “Is it Hamlet?” the assistant asked. “Yes, that’s right,” the employee said. It was a whimsical exchange that managed to erase some of the skepticism I had when I first saw Google demo Project Astra.


Still, more than anything, the showcase I saw reaffirmed that an all-knowing, all-helpful AI assistant is years away. In speaking to the two Google employees who hosted the demo, I found out that Astra’s ability to “remember” is currently limited to a single session, and even then to only a few minutes of it. Additionally, the assistant depends on the cloud rather than running exclusively on-device. I’m sure Google will eventually get past those limitations, but I don’t expect those breakthroughs to be easy or quick.