Search On 22, Google unveils the future of Search

Google’s goal, since its inception, has been to “organize the world’s information and make it universally accessible and useful.” As technology has evolved, this mission has taken on additional meanings, and today, especially thanks to advances in machine learning, doing a search means much more than typing a query into a box and waiting for 10 links to appear. The changes to Google’s SERPs, so obvious and varied, show how Big G is “getting closer to creating search experiences that reflect how we as people make sense of the world,” and how the company is committed to “unlocking entirely new ways to help people gather and explore information.” Further examples of this work came at the big Google Search On 22 event, during which Google presented the upcoming evolutions of the Search system, with many useful applications also for users of Maps, News, Shopping and so on: in short, here is everything that is changing and that we SEOs should know about, given the inevitable effects it will have on the very concepts of optimization and ranking.

Google Search On 22, all the news for Search (and more)

Google Search On is Google’s big fall event, dedicated primarily to what remains the company’s core business: Google is still (let’s remember) first and foremost a search engine, indeed THE most used search engine in the world. As in the previous two editions, technology dominates the scene and the announcements this year, with the presentation of many new features and tools that make it more natural and intuitive to find what you’re looking for, thanks to advances in machine learning applied to “new ways to visually search, explore the world around you, shop with confidence, and make more sustainable choices,” as stated on The Keyword‘s pages dedicated to the event.

The common thread binding these interventions is encapsulated by the phrase “search outside the box,” which can be read on several levels: at the most basic, it literally means that users can search outside the old box, the search box in which it was previously mandatory to type the query, because technological advances make it possible to launch searches in different ways and with different tools. But it also means helping people think (and search) in a less schematic way, catering to their own skills and needs, because the integration of Artificial Intelligence, Machine Learning, and even graphical changes will make information ever easier to reach and far more useful, because it is calibrated on many more parameters.

In short, usable content ever closer to people’s daily lives, as the heads of the individual departments promise (Rajan Patel for Search, Miriam Daniel for Maps and Matt Madrigal for Shopping), although for the time being these changes will reach English-speaking users first; within a few months they should expand to at least a hundred other languages and their speakers worldwide.

The top 10 announcements for Google Search

As mentioned, full coverage of the new features coming to Google users is available on the company’s official channels, including videos on YouTube, but we can still try to summarize the most interesting features we will see appearing in Google Search, which could then (once again!) have an effect on organic search rankings and results.

Following Barry Schwartz‘s work, we’ve identified the 10 most relevant Search On 22 announcements, including Multisearch, Lens applications, autocomplete, search filters, and more, though perhaps none of these is really new: each is rather a further evolution of something already presented in the past.

  1. Multisearch expansion

Google is expanding multisearch to 70 new languages in the coming months. Multisearch was announced at Google I/O 2022 and was initially available only for English-language queries in the U.S. It is a feature that allows the user to point the smartphone camera and search based on an image via Google Lens, then add a text query on top of the image search, “similar to how you might naturally point to something and ask a question about it.” Google then uses both the image and the text query to show visual search results.

Multisearch

 

According to Cathy Edwards, VP/GM of Search, already “people now use Lens to answer more than 8 billion questions each month,” searching “the world around with a camera or an image.”
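Google’s production models are not public, but the core idea of multisearch, fusing an image signal and a text signal into a single query, can be sketched with the open CLIP model shipped with the sentence-transformers library; the file names below are placeholders.

```python
import numpy as np
from PIL import Image
from sentence_transformers import SentenceTransformer, util

# The open clip-ViT-B-32 checkpoint maps images and text into one vector space.
model = SentenceTransformer("clip-ViT-B-32")

image_emb = model.encode(Image.open("floral_shirt.jpg"))     # what the camera sees
text_emb = model.encode("the same pattern, but on a dress")  # the added text query

# Naive fusion: average the two modalities into a single query vector.
query_emb = (image_emb + text_emb) / 2

# Score hypothetical candidate product images against the fused query.
candidates = np.stack(
    [model.encode(Image.open(f)) for f in ("dress_a.jpg", "dress_b.jpg")]
)
print(util.cos_sim(query_emb, candidates))
```

Averaging embeddings is only one possible fusion strategy; the sketch simply illustrates why both signals can shape the final ranking.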

  2. Near Me Multisearch debuts

One of the first practical applications of multisearch is poised to roll out in U.S. search results (it should arrive by late fall): multisearch near me, a system that allows users to take a photo or screenshot of a dish or object and then immediately find it nearby, connecting them with local search results and businesses in their area.
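The “near me” step is, conceptually, a geographic filter applied after the visual match. Here is a minimal sketch of that filter using the haversine formula; the coordinates and place names are invented.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth's mean radius is about 6371 km

user = (40.7128, -74.0060)  # hypothetical user location
places = [("Dumpling House", 40.7158, -73.9970), ("Noodle Bar", 40.6892, -74.0445)]

# Rank the visually matched businesses by distance from the user.
nearby = sorted((haversine_km(*user, lat, lon), name) for name, lat, lon in places)
for km, name in nearby:
    print(f"{name}: {km:.1f} km away")
```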

  3. Google Lens’ translated text is now clearer

Among the various features of Google Lens is the ability to point the camera at text in almost any setting so that the content can be translated. According to Edwards, people use Google to translate text in images more than 1 billion times each month, in more than 100 languages, and this ability to break down language barriers is “one of the most powerful aspects of visual understanding”.

Translations with Lens

By the end of the year, Google Lens will be able to present translated text more clearly with a blended approach: thanks to major advances in machine learning, the app can now merge translated text into complex images so that it appears much more natural and is easier for users to understand. Moreover, after optimization work on the machine learning models, this is done in as little as 100 milliseconds, less than the blink of an eye. The system uses generative adversarial networks, also known as GAN models, to better present translated text, a technology similar to what Google uses on Pixel devices for the “Magic Eraser” feature on photos.
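The GAN-based blending itself is proprietary, but the “erase the source text, redraw the translation” step it improves upon can be approximated crudely with Pillow; the file name, box coordinates, translated string, and font are all placeholders.

```python
from PIL import Image, ImageDraw, ImageFont

img = Image.open("street_sign.jpg")
draw = ImageDraw.Draw(img)

text_box = (40, 60, 380, 120)  # region where the original text was detected
# Sample a nearby pixel and paint over the source text with it; a GAN instead
# reconstructs the underlying texture, which is why its result looks natural.
background = img.getpixel((text_box[0] - 5, text_box[1]))
draw.rectangle(text_box, fill=background)

font = ImageFont.truetype("DejaVuSans.ttf", 36)  # any font file available locally
draw.text((text_box[0], text_box[1]), "Fresh bread daily", fill="black", font=font)

img.save("street_sign_translated.jpg")
```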

  4. Real-time search refinements

Google is implementing new search refinements and aids, both in autocomplete and within search results, which work in real time (and will be coming in the next few months). Basically, as we type the query, Google will present tappable words that extend the string and let us build the query on the fly. It’s a form of query generator that works by simply tapping on the words, as seen in the GIF showing it in action:

Search refinements

As we type, we will also see more complete information begin to appear in the autocomplete results, such as preview images or information panels.
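In miniature, this refinement flow behaves like a suggestion tree in which every tap appends one more term to the query. A toy sketch with invented suggestion data:

```python
# Each partial query maps to the refinement "chips" a user could tap next.
REFINEMENTS = {
    "best": ["hiking", "coffee", "laptops"],
    "best hiking": ["boots", "trails", "backpacks"],
    "best hiking boots": ["for women", "waterproof", "under $100"],
}

def suggest(query: str) -> list[str]:
    """Return the tappable continuations of the query typed so far."""
    return [f"{query} {term}" for term in REFINEMENTS.get(query, [])]

query = "best"
while REFINEMENTS.get(query):
    options = suggest(query)
    print(f"{query!r} -> {options}")
    query = options[0]  # simulate the user tapping the first chip
```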

  5. Instant visual information with Drill Down

Google has also added a further way to refine the query after we run the search, allowing us to add or remove topics to broaden or narrow the field: clicking on keyword or topic options helps us focus the search, adapting the search bar at the top so we can navigate more dynamically to the results most relevant to our interest. As Edwards explains, “we are simplifying the exploration of a topic by highlighting the most relevant and useful information, including content from creators on the open Web.”

This more visual approach is called Drill Down; it applies to search results for certain specific queries and will allow us to explore more information on topics related to travel, people, animals, plants, and so on. Depending on the query, Google will show visual stories, short videos, tips, things to do, and more, making the most relevant information stand out visually as well.

For topics such as cities, for example, we might see visual stories and short videos from people who have visited them, tips on how to explore the city, things to do, how to get there, and other aspects that might be useful when planning a trip.

The Drill Down feature
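Mechanically, Drill Down can be pictured as a set of topic chips filtering a result list: adding a chip narrows the set, removing it broadens it again. A toy example with invented results:

```python
# Each result carries topic tags; all active chips must be present to match.
results = [
    {"title": "Oaxaca beaches guide", "topics": {"beaches", "travel"}},
    {"title": "Historic sites of Oaxaca", "topics": {"history", "travel"}},
    {"title": "Oaxaca street food tour", "topics": {"food", "travel"}},
]

def filtered(active: set[str]) -> list[str]:
    return [r["title"] for r in results if active <= r["topics"]]

chips = {"travel"}
print(filtered(chips))    # all three results are tagged "travel"
chips.add("beaches")      # the user taps the "beaches" chip to narrow
print(filtered(chips))    # only the beaches guide remains
chips.discard("beaches")  # removing the chip broadens the set again
print(filtered(chips))
```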

  6. Exploration as you scroll

With these features, we will see the most relevant content, from a variety of sources, regardless of the format in which the information arrives, whether it is text, images, or video; and as we continue to scroll through the results, we will see a new way to draw inspiration from topics related to our search.

Scroll to Explore

The new approach seems to address one of the classic problems with searches: as a rule, the further we scroll through Search, the less relevant the results become (in theory, after all, Google should rank the most relevant information at the top). With this new exploration feature, we as users can find fresh inspiration around the query, even from results that don’t exactly match what we typed, discovering related topics and information beyond the original query that we would never have thought of, as in the case of the beaches and historical sites of Oaxaca, a perhaps little-known but definitely welcoming and interesting destination.

  7. Box for discussions and forums

American English SERPs will welcome yet another box, dedicated to results from “discussions and forums,” designed to help people find personal experiences on the topic from real people who have posted content in various online discussion forum platforms, including Reddit.

Discussions and forums feature

  8. International and local news translated

The feature with which Google will let us find translated news on local and international topics, meanwhile, should arrive early in the new year: thanks to machine translation, Google Search will show translated headlines for news results from publishers in other languages alongside those written in our preferred language, breaking down language barriers in the news, as Lauren Clark and Itamar Snir (product managers for Google Search and Google News, respectively) explain.

News translation

This feature connects readers searching for international news with relevant local coverage in other languages, giving them access to more comprehensive reporting from the field and opening up new global perspectives. As announced by Google, the feature will initially translate news results from French, German, and Spanish into English, on mobile and desktop devices.
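This is not Google’s translation stack, of course, but the same idea can be reproduced with an open machine-translation model, for instance MarianMT through the transformers library (the headlines below are invented, and the model is downloaded on first use):

```python
from transformers import pipeline

# Helsinki-NLP publishes open MarianMT checkpoints for many language pairs.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")

headlines = [
    "Le gouvernement annonce un nouveau plan climat",
    "La sécheresse frappe le sud du pays",
]

for original, result in zip(headlines, translator(headlines)):
    print(f"{original} -> {result['translation_text']}")
```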

  9. Customization of About This Result

Google is also expanding the About This Result feature (which is expected to arrive in Italy soon) to provide the ability to customize certain aspects of search results (an option that can also be turned off or changed).

About this result “already tells you about some of the most important factors Google Search uses to link results to your queries, to see if a web page has keywords that match your search, if it contains related terms, or if it’s in the language you’re searching in,” Danny Sullivan reminds us. While generally the words in the query “provide our systems with all the context they need to return relevant results”, there are some situations where showing the most relevant and useful information means “tailoring the results to your tastes or preferences,” adds the public liaison for Search: in these cases, personalized results can make it easier to find content that might be useful to you.

Shopping personalization

Applications of personalization include shopping with Google (helping to quickly surface results for brands and industries the user likes) and video content related to “what to watch“, with recommendations tied to personal tastes rather than the classic generic answers: by selecting the streaming services we use, we will receive personalized recommendations on what is available, with quick links to watch the titles we have chosen.

That said, Sullivan points out, personalization is not applied to all Google Search results, because Big G’s systems “personalize only when it can provide more relevant and useful information.”

  10. New kitchen inspirations

A feature for finding more meal inspiration, meanwhile, is already active in English for mobile users around the world. If we search for “dinner ideas“, for example, we will see personalized recommendations for recipes we might try, with the ability to refine the search if we have specific wants, needs, or dietary preferences.

What to cook

Also on the food theme, Sophia Lin (GM of Food) explained that new Search updates are coming to help us search for specific menu items, display improved digital menus (using cutting-edge technologies for understanding images and language, including the MUM model), and even discover what makes a restaurant special, with more details drawn from user feedback, as well as to refine searches for specific dishes (as in the dumplings example below).

Food search

In addition, multisearch will allow people to identify a food product through a photo or screenshot and, starting in the coming weeks, launch a “near me” search to see nearby businesses where they can buy that product.

Multisearch near me with food

The new features for Shopping and Maps

Google Search On 22 was also an opportunity to unveil other specific applications coming to the Google ecosystem, specifically for Maps and Google Shopping.

On the first front, four new updates were announced that “make Maps feel more like the real world,” helping us, for example, get a feel for a neighborhood before we leave, explore over 250 landmarks in aerial view, search for nearby places with Live View, and more. The key word is “immersiveness“: thanks to 3D maps and projections, users can visualize real spaces on their device, getting, for example, the same view as a person actually walking through that urban context.

It is not just about high-detail maps, but about using information and Artificial Intelligence to process data (often provided directly by users themselves) to deliver a complete experience with suggestions of all kinds: nearby places of interest, monuments, history, traffic, weather, events and happenings at that moment.

As Chris Phillips (VP & General Manager, Geo) summarizes, “Google Maps has always pushed the limits of what a map can do,” and now the company is further reinventing the platform (which already receives 50 million updates a day, based on the ever-changing environment around us) “with a visual, intuitive map that lets you experience a place as if you were there, all thanks to the latest advances in computer vision and predictive models”.

Much more extensive features are coming to Google Shopping, which increasingly positions the Mountain View giant as an eCommerce system engaging the user in a shopping experience inextricably linked to the search engine and its results. The new features take full advantage of the latest technologies developed by the company, particularly the powerful AI model called the Shopping Graph, which currently includes more than 35 billion items (up 45 percent from last year’s 24 billion).
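The Shopping Graph’s internals are not public, but its general shape, products linked to brands, categories, and offers, can be pictured as a simple adjacency structure; every node name below is invented.

```python
# A toy product graph: nodes are IDs, edges point to related nodes.
graph = {
    "product:trail-runner-x": {
        "brand": "brand:acme",
        "category": "category:running-shoes",
        "offers": ["offer:store-a-89usd", "offer:store-b-95usd"],
    },
    "category:running-shoes": {"parent": "category:shoes"},
}

def related(node: str) -> list[str]:
    """Collect every node reachable in one hop from the given node."""
    out = []
    for value in graph.get(node, {}).values():
        out.extend(value if isinstance(value, list) else [value])
    return out

print(related("product:trail-runner-x"))
```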

In this way, users interested in products will find a Google interface that lets them view an item from every angle (including in 3D) and with detailed information to find out whether it fits their personal needs and habits, thanks in part to dynamic shopping filters that adapt in real time to search trends. Specifically, Lilian Rincon (Senior Director of Product, Shopping) presented 9 ways to shop on Google that promise to make the shopping experience easier, more intuitive, and more fun:

  1. Queries with the word “shop”. Introduced for now in the U.S., this feature gives access to a visual feed of products, search tools, and nearby inventory availability related to that product.

Queries with the term “shop”

  2. Shop the look. A feature that completes an outfit by suggesting products complementary to the one initially searched for, again within Google Search, with directions and purchase information, all while remaining in Search.

Shop the look

  3. Ongoing trends. Coming soon, again in U.S. preview, is a box that lists trending products within a category, to help people discover the most popular items and simplify the discovery of patterns, styles, and brands.

Trending products

  4. Shop in 3D. This application continues to evolve (it is estimated that 3D images attract people almost 50 percent more than static ones): after testing with household items, it will also be made available for shoes, showing 3D models thanks to advances in machine learning.

3D images

  5. Support for difficult purchases. When the buying decision is complex (e.g., we don’t know the product well, the cost is high, etc.), Google will come to our aid with a buying-guide function where we will find useful information about a category from a wide range of trusted sources, all in one place, helping us uncover the essential details to evaluate more carefully.

Guida all'acquisto

  6. Product insights and About this page. Thanks to page insights, it will be possible to get useful context about the web page we are on or the product we are looking at; in addition, we can enable price tracking to finalize the purchase at the most opportune time (see the sketch after this list).

More context and information about the page and product
  7. Personalized results. As mentioned earlier, search results will soon be even more personalized based on the user’s previous shopping habits, with the option of directly communicating one’s preferences to further refine them or turning the feature off.

Personalized results with About this result

  8. Dynamic filters. Full-page shopping search filters become dynamic and adapt to current trends: already active in the U.S., Japan and India, these functions will soon be available in other regions.

Shopping with dynamic filters

  9. Shopping on Discover. Google is banking on Discover in the mobile app to serve up style suggestions based on the user’s previous searches and what other users have searched for: once an interesting item is spotted, simply activate Lens to find out where to buy it.
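As a closing illustration, here is the minimal price-tracking sketch referenced in point 6 above: watch a product and flag it once the price drops to a target (the price feed is simulated).

```python
from dataclasses import dataclass

@dataclass
class PriceWatch:
    product: str
    target: float

    def should_alert(self, current_price: float) -> bool:
        """True once the observed price reaches the user's target."""
        return current_price <= self.target

watch = PriceWatch(product="wireless headphones", target=79.0)

for observed in (99.0, 89.0, 74.5):  # simulated daily price observations
    if watch.should_alert(observed):
        print(f"{watch.product} dropped to ${observed:.2f}: time to buy")
```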

 

How Google Search is evolving

“We are going far beyond the search box to create search experiences that work more like our minds and are as multidimensional as we are as people”, said Prabhakar Raghavan, senior vice president at Google, introducing this edition of Search On.

All of the features described promise to simplify people’s searches and improve access to information, which becomes more direct through the involvement of new languages and tools: for example, we can search directly through a photo or by voice, but at the same time also get a SERP that is no longer just textual, but increasingly enriched with videos, images and multimedia content that evolve simultaneously with our taps, to create a truly immersive and complete experience.

In short, Google is getting rid of old concepts related to the simple search box by leveraging technological advances to create a platform that adapts to the individual user and helps them “make sense of information” in the ways that are most natural to them, as Raghavan concludes.

 

Featured image from YouTube
