More and more AI in Search: main announcements at Google Search On 2021

The future of Google and of online search will be increasingly driven by technology: if Google I/O 2021 gave us a first taste of the latest Artificial Intelligence applications for Search, at Search On 2021 the Mountain View group went even deeper, revealing how AI will help search engine users by perfecting its ability to understand needs and formulate answers, and more besides.

Google redesigns the search algorithm with MUM

The announcement that attracted the most attention – and that also concerns, more or less directly, those of us who work in SEO and search marketing – is the redesign of Google Search underway thanks to MUM (Multitask Unified Model), the powerful algorithm presented in recent months, which is proving highly effective at offering an intuitive way to explore topics and at surfacing more content in the SERPs.

In more concrete terms, Artificial Intelligence allows Google to broaden the old way of searching for answers and introduce a more intuitive way to explore topics, providing a redesigned and much more visual search experience that guides users along topic paths.

MUM and multimodal search

Since MUM's presentation, Google has highlighted the truly distinctive element of this technology: the multimodality it applies to understanding information and the requests of users who launch a query.

In the first publicly shared trial, MUM correctly identified, in a few seconds, 800 variations of the names of Covid vaccines in 50 different languages, demonstrating its text-handling ability. This technology examines and interprets the words of a query not in the classic sequential way of machines (from left to right or vice versa, depending on the language), but in a way similar to human understanding, jumping "between words to better understand the meaning of the entire sentence (…), applying at each step a self-attention mechanism that directly models the relationships between all the words in a sentence, regardless of their respective positions" (DDay).
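The self-attention mechanism described in that quote can be illustrated with a minimal, generic sketch of single-head scaled dot-product attention, the building block of Transformer-family models such as MUM. This is not Google's code: the dimensions, weight matrices and toy embeddings below are purely illustrative, and only show how every word attends to every other word regardless of position.

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention.
    X: (seq_len, d_model) token embeddings. Every output row is a
    weighted mix of ALL positions, independent of word order."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[-1])      # pairwise word-word affinities
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ V, weights

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(5, d))                      # 5 toy "word" embeddings
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
out, weights = self_attention(X, W_q, W_k, W_v)
print(out.shape)                                 # one output vector per word
```

Each row of `weights` is a probability distribution over all five positions, which is the "relationships between all words in a sentence, regardless of their respective positions" the quote refers to.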

These are innovative, futuristic functions, yet they do not exhaust MUM's potential, nor do they convey its multimodal capabilities: the technology goes beyond textual understanding and can also handle images and videos, making "jumps" between different media and tools.

Examples of MUM applications in Search

Users are currently accustomed to searching for answers, but thanks to the new evolutions of Search, Google will let them search in different ways and tackle more complex tasks, also using different means – such as images – to reach what they need.

Prabhakar Raghavan, Senior Vice President of the US company, provided some examples of how Google Search changes thanks to MUM, especially for mobile browsing.

Integration between Lens and MUM for more interactive searches

First, with this new feature integrated into Lens, we can tap the lens icon while viewing the image of a T-shirt and "ask Google to find the same style, but on another piece of clothing, such as socks". This helps when we are looking for something that might be difficult to describe accurately with words alone – staying with the example shown on the page, we could type "white Victorian flower socks", but we might not find the exact pattern that interests us. Instead, "combining images and text in a single query simplifies visual search and lets us express questions in more natural ways".
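Google has not disclosed how MUM fuses the photo and the text internally. A common technique in multimodal retrieval (used, for instance, by CLIP-style systems) is to project image and text into a shared embedding space and blend them into a single query vector; the sketch below assumes such a shared space, with tiny made-up 4-dimensional embeddings and a toy product catalog.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def fuse_query(image_emb, text_emb, alpha=0.5):
    """Blend an image embedding (the photographed pattern) with a text
    embedding ("but on socks"); alpha weights image vs. text."""
    return normalize(alpha * normalize(image_emb) + (1 - alpha) * normalize(text_emb))

def rank(query_emb, catalog):
    """Rank catalog items by cosine similarity to the fused query."""
    scored = sorted(catalog.items(),
                    key=lambda kv: -float(normalize(kv[1]) @ query_emb))
    return [name for name, _ in scored]

# Made-up 4-d embeddings, purely illustrative
catalog = {
    "floral socks":   np.array([0.9, 0.8, 0.1, 0.0]),
    "plain socks":    np.array([0.1, 0.9, 0.0, 0.1]),
    "floral t-shirt": np.array([0.9, 0.1, 0.8, 0.0]),
}
q = fuse_query(np.array([1.0, 0.0, 0.2, 0.0]),   # "floral pattern" image
               np.array([0.0, 1.0, 0.0, 0.1]))   # "socks" text
results = rank(q, catalog)
print(results)
```

Here the blended query scores "floral socks" highest, because that item is close to both the image component ("floral") and the text component ("socks") – the same intuition as asking for "this pattern, but on socks".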

The "point-and-ask" search mode

Another application is the "point-and-ask" search mode, which makes it easier to meet a specific need: for example, if we have broken a specific component of a bike (whose exact name we do not know) and need a guide to repair that piece, instead of poring over parts catalogs and then searching for a tutorial, we can frame the damaged element with Lens and launch the query "how to fix", which Google will understand.

According to the previews, this function will first be available on YouTube and then be extended to other video platforms; Liz Reid, Vice President of the Search department, explained to Italiantech that "MUM does not derive information from the contents of individual frames of a video, but from audio transcripts generated automatically using voice recognition".

 

Innovative systems to perfect searches

The redesign of the search engine with MUM and the progress of AI go even further and stray enormously from the old SERPs with ten blue links, because Google is about to launch a series of new features in Google Search that will orient searches, offer faster and more precise answers, and make the system more natural and intuitive. They are called:

  • Things to know
  • Broaden / Refine this search
  • Visually browsable search results.

Things to know will arrive in the coming months and is useful for simplifying the exploration and understanding of new topics: it lists various aspects of the topic the user is searching for and shows other angles that people generally look into, helping them get the information they are looking for faster and orienting their path toward the goal.

The Things to Know feature with other users' searches

For instance, if we want to decorate our home's walls and are interested in learning more about acrylic painting, within the SERP for this query we will find the new feature, which lists various angles of the topic, such as "step by step", "styles" and "use of household items", highlighting those that people most likely look at first. According to Raghavan, thanks to MUM Google can "identify more than 350 topics related to acrylic painting" and help us find the right way to go, exploring all the searches around that specific query and proposing some side paths. In the future, "MUM will unlock more in-depth insights" that we may have difficulty searching for – such as "how to make acrylic paintings with household objects" – and will connect users "with content on the Web that they might not otherwise have encountered or searched for".

Refine this search and Broaden this search, instead, apply the concept of zooming in and out to queries, allowing us to intervene on the level of depth of the search we have launched, to refine or expand it.

The Broaden and Refine this search features

Sticking with the previous example of acrylic painting, with Refine we can accept Google's suggestions to narrow the topic to more specific searches (such as ideas for acrylic painting or online courses of acrylic painting) or, on the contrary, widen the theme to related topics, such as painting styles or famous painters.

Also very interesting is the third feature anticipated by Google's Senior Vice President, which is already being tested: a new interface that simplifies the search for visual inspiration with a more "visually browsable" results page. This is a new visual SERP, designed for users searching for inspiration, which is activated for certain queries (especially those that contain the word "idea" in addition to a keyword, such as "Halloween Decoration Ideas" or "Ideas for Vertical Indoor Gardens") and makes it "easier for the user to navigate visually to find what he is looking for".

Google's new visual SERP with inspiration results

For example, if we search for "ideas with poured paint" (the Puddle Pour technique), Google gives us a SERP "visually rich in ideas from all over the Web, with articles, images, videos and more" that we can easily scroll through.
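The trigger described above – activation on inspirational queries containing the word "idea" – can be sketched with a toy heuristic. Google's real trigger logic is not public, so this function is purely illustrative of the stated condition:

```python
def looks_inspirational(query: str) -> bool:
    """Toy check: does the query contain the word 'idea' or 'ideas'?
    (A stand-in for Google's undisclosed trigger logic.)"""
    return any(tok in ("idea", "ideas") for tok in query.lower().split())

print(looks_inspirational("halloween decoration ideas"))  # True
print(looks_inspirational("how to fix a bike chain"))     # False
```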

More improvements coming soon

MUM applications and AI systems do not stop at Search. Raghavan also revealed, among other things, that there will be steps forward in identifying the key moments in videos, thanks to the introduction of a new experience that "identifies the related topics in a video, with links to learn more easily".

Using MUM, you will also be able to “show related topics that are not explicitly mentioned in the video, based on our advanced understanding of the information in the video”.

For example, even if in the video mentioned above the words "life story of the macaroni penguin" are not actually spoken, Google's systems "understand that the topics contained in the video relate to this subject, such as the way macaroni penguins find their relatives and face predators".

Improvements are also coming to About This Result, the feature – active in the US version of Google – that provides details on a website before the user even visits it, including its description, when it was first indexed and whether the connection to the site is secure.

About This Result offers more information on search results

The information panel is still expanding – already in July we had seen the addition of some ranking-related information – and allows users to:

  • View more information about the source (“what a site says about itself in its own words when that information is available”).
  • Read what others on the Web have said about a site (news, reviews and other useful context): that is, third-party information outside of Wikipedia that talks about the site and the brand.
  • Find out more about the topic: in the section “About the topic”, we can find information such as the coverage of the main news or the results on the same topic from other sources.

Still with the goal of improving the experience of users who want to explore information and read reliable content, Google has launched a number of other tools to help assess the credibility of online information. For example, it will be easier to identify fact checks published by independent and authoritative sources on the Web, which will be highlighted in results in Search, News and Google Images. Work also continues on providing the right context to those running an informational query, through features that flag when useful or relevant information is not yet available on the Web – because the event is evolving rapidly, or simply because there is no information relevant to that search at that time.

MUM and artificial intelligence also applied to shopping

But the Google Search On 2021 event also served as a (virtual) stage for previews of the arrival of artificial intelligence in shopping and Maps.

In the latter app, in particular, a new feature – available on iOS and Android worldwide from October – lets you locate the approximate position and size of a fire in progress, including information on useful numbers to call and on how containment of the fire is developing.

Purchases also get smarter thanks to technology: the Shopping Graph now offers a more browsable Google search experience for clothing-related queries, proposing directly in the search results a visual feed of products in various colors and styles (much more navigable than the current one), along with other useful information such as local stores, style guides and videos.

Still on the shopping theme, Google is adding in-store inventory information to make the online shopping experience from home more like shopping in person: you can filter results by clicking on "in stock" to see whether nearby shops have specific items on their shelves.

News is also coming for Google Lens, which will be implemented in the Chrome desktop browser as well: users will be able to perform reverse image searches, selecting images, videos or text on a site to see the search results in the same tab. On mobile devices, meanwhile, Lens will make all the images on a page searchable. Here too, Google is trying to increase traffic from commercial searches, also trying to move users to Google Shopping to complete their purchases (stealing them away from Amazon).

Search evolves to better satisfy users

"We look forward to helping people find the answers they are looking for and to inspiring more questions along the way," says Raghavan in his article, explaining the goals they are pursuing: "the creation of more useful products and the widening of the boundaries of what it means to search for something".

The Search On livestream event gave Google the opportunity to share the latest arrivals in artificial intelligence applied to the products of the Big G ecosystem, offering people new ways to search and explore information in more natural and intuitive ways.

This is exactly what improving the search engine means for Google: giving users "more natural and intuitive explorable information" – effective and useful, therefore – while the interconnection of search results, Google's multimodality and the features powered by MUM can offer new ways to get to what we are looking for with a query (or even without one!).

Today "there is more information accessible at hand than at any time in human history", but it is not always easy to satisfy our search intents: thanks to advances in artificial intelligence, however, it will be possible to transform "radically the way we use such information, with the ability to discover new insights that can help us both in our daily lives and in the ways we are able to face complex global challenges", Raghavan hopes.

All this work helps not only people around the world as Search users, but also creators, publishers and companies: the VP again recalls that every day Google "sends visitors to more than 100 million different websites, and every month it connects people with more than 120 million businesses that do not have websites, enabling phone calls, driving directions and local foot traffic".

Google Search On 2021: should SEOs panic?

What impact will all this have on SEO?

Probably, not much will change in terms of search optimization activity, because the more intense use of MUM should not change what Google looks for in content or the criteria by which it establishes rankings, but only refine the SERPs based on the user's query – that is, create a more precise match between the user's request or need and the content that responds to that intent.

More problematic could be the new visual form of the SERPs linked to inspirational queries, where the weight of classic textual content seems to take second place to other multimedia formats (an element to be taken into account by those who work in this sector) and where the blue links themselves lose further visibility compared to other more interactive and faster-to-browse features.

As we said some time ago, however, the future of SEO is certainly not in danger; on the contrary, knowing (slightly) in advance the transformations we will experience allows us to prepare and adapt – ourselves, mentally, and our content and sites – to what awaits us. We certainly cannot control Google and its evolutions, but we can work to keep up with them, studying how to take advantage of these changes and new features to gain more visibility and generate more business and traffic.
