- Google's vision for Search in an AI era is starting to come together.
- Google is rolling out its AI Mode to users in the US, giving them a full AI search experience.
- At its annual I/O event, the company also announced several new Search tricks it has planned.
We're finally getting a good glimpse of what Google looks like when it's fully transformed by AI.
Last year's I/O brought an onslaught of AI announcements, a blitz of disparate products and demos, and a sense that the search giant was trying to prove it still had the juice to lead this race. This year, a clearer picture is emerging of how Google sees the future of its core products, including what CEO Sundar Pichai called a "total reimagining" of Search during a roundtable with press ahead of the event. That includes a more conversational search experience called AI Mode and, eventually, an AI assistant that understands the world around you.
Google has faced a major dilemma: Search advertising generates the lion's share of the company's multibillion-dollar business, but Google knows it can't stand still and let rivals eat its lunch. It's trying to build AI into its core search product before someone else does it better, yet not so fast that it risks hurting the company's profit engine.
Google has been inching forward with AI Overviews, and this week it's also unleashing AI Mode for search. While AI Overviews summarizes results at the top of the normal Search page, AI Mode lives in a new tab that opens a conversational experience, surfacing a more diverse range of sources, all still based on Google's search index. Users can also ask follow-up questions. Google says it's rolling out AI Mode widely to users in the US starting this week.
"AI Mode is not just this AI-powered experience end to end, but it also is a glimpse of what's to come in Search overall," said Google Search head Liz Reid.
AI Mode uses what Google calls a "query fan-out" technique, which means it runs multiple queries in parallel and returns the combined results at once. Google says this will make search better and allow users to ask more complex questions.
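Google hasn't published how query fan-out works internally, but the basic idea it describes, breaking one question into sub-queries and running them concurrently, can be sketched roughly. Everything below (the `decompose`, `search_index`, and `fan_out` names, the placeholder results) is hypothetical illustration, not Google's actual implementation:

```python
import asyncio

async def search_index(subquery: str) -> list[str]:
    # Stand-in for a real index lookup; returns placeholder results.
    await asyncio.sleep(0)  # simulate I/O latency
    return [f"result for '{subquery}'"]

def decompose(question: str) -> list[str]:
    # A real fan-out system would use a model to split one complex
    # question into sub-queries; splitting on "and" is a crude stand-in.
    return [part.strip() for part in question.split(" and ")]

async def fan_out(question: str) -> list[str]:
    subqueries = decompose(question)
    # Run every sub-query in parallel, then flatten the results.
    result_lists = await asyncio.gather(
        *(search_index(q) for q in subqueries)
    )
    return [r for results in result_lists for r in results]

results = asyncio.run(
    fan_out("hiking trails near Denver and weekend trail weather")
)
print(results)
```

The point of the pattern is latency: issuing the sub-queries concurrently means the user waits roughly as long as the slowest lookup, not the sum of all of them.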
AI Mode today is just the start of how Google sees search evolving. Google's announcing a bag of new tricks that it's keeping in Labs for now, so they'll only be available to early testers. Still, they show what Google sees as the future of search.
One example is Deep Search, which lets users punch in a long, complicated question and returns a fully cited report, much like Google's "Deep Research" feature in Gemini. There's also a version that will return real-time data and visualizations (think charts of sports team statistics, for example).
Google will also let users give AI Mode access to other Google apps and their search history so it can return more tailored answers and recommendations.
Reid said Google will feed some of the features from AI Mode into its standard search engine and AI Overviews, the idea being that Google's standard search experience benefits from the leaps it's making in the underpinning AI models.
"You put all of this together. This really is building the future of Search," said Reid. "Searching starts to feel effortless."
Does Google envision AI Mode being the default one day? That's the implication here, though the company will closely watch over the next few months to see just how many people click the "AI Mode" tab.
The everything assistant
Google also has a vision for an AI assistant that is with you all the time.
If you've seen Google's Project Astra, an AI agent that uses vision to see the world around it, you already have a good idea of what Google is thinking here. Google wants to build an assistant that is with you at all times — be it in your phone or in a pair of augmented reality glasses — that can see the world, answer questions, and relay information to you in a matter of seconds. Or maybe it's just helping you code.
At I/O, Google is announcing it's extending its frontier Gemini 2.5 Pro model to be a "world model," which really just means it's going to be able to understand what it's seeing and, Google says, make plans. In AI speak, it's making it more agentic.
Google DeepMind CEO Demis Hassabis said these updates are "critical steps" toward building a "universal AI assistant" that can better understand the user and take actions on their behalf.
"This is our ultimate goal for the Gemini app: an AI that's personal, proactive and powerful," Hassabis added.
Google will make its camera and screen-sharing feature, Gemini Live, available to everyone with the Gemini app, and launch Veo 3, a new video-generation model that adds support for sound effects.
Google needs to build fast here. While generative AI is not yet a critical business in the way Search is, the company said its Gemini app now has more than 400 million monthly active users. Google's own internal analysis found that Gemini still trailed OpenAI and Meta's apps as of earlier this year, according to documents shown in court.
Have something to share? Contact this reporter via email at [email protected] or Signal at 628-228-1836. Use a personal email address and a nonwork device; here's our guide to sharing information securely.