Google Search News, Search updates of January 2021

Once again, Google has started the year at full throttle: true, there was no core update like the one that opened 2020, but in January 2021 work on the search engine and on the ecosystem around Search never stopped, and the American company announced plenty of news in recent weeks. As per tradition, it is John Mueller who sums up the past month in his Google Search News video, talking about crawling, indexing, link building and much more.

Google Search News, latest news of January 2021

After Mueller's traditional introduction – in which he notes that, as since last May, he is still recording from home because of the pandemic, joking with his usual irony about the difficulties of life these days – we move on to the summary of the January 2021 episode.

There are three main topics covered in the video, all related to some basic processes and concepts of the Search system, namely crawling, indexing and links.

The summary of the January 2021 episode of Google Search News

New tools to understand crawling: the Crawl Stats Report

The term crawling refers to "Googlebot's activity of scanning web pages, following the links it sees to find other web pages"; indexing, on the other hand, is "when Google systems try to process and understand the content of those pages". Both of these processes need to work together, "and the barrier between them can sometimes be a bit confusing," says Google's Search Advocate.

Although Google has been crawling the web for decades, "we are always committed to making it easier, faster or easier to understand" for site owners. A recent novelty goes in this direction: the introduction in Search Console of an updated Crawl Stats report, which provides information on how Googlebot crawls the site.

The report shows data such as the number of requests broken down by status code and by crawl purpose, host-level information on availability, example requests and more; some of this information can also be found in a server's access logs, but retrieving and interpreting it is often difficult.
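
For those who want to compare the report's figures with their own raw data, a minimal sketch like the following can approximate the "requests per status code" view by counting log entries whose user agent declares itself as Googlebot. The access.log path and the combined log format are assumptions, and truly verifying Googlebot traffic would also require a reverse DNS check, which is omitted here.

```python
# Minimal sketch: count requests per HTTP status code for entries whose
# user agent claims to be Googlebot, from a combined-format access log.
# LOG_FILE is a hypothetical path; adjust it to your server setup.
import re
from collections import Counter

LOG_FILE = "access.log"
# Combined format: IP - - [date] "METHOD path HTTP/x" status size "referrer" "user agent"
LINE_RE = re.compile(r'"[A-Z]+ \S+ \S+" (\d{3}) \S+ ".*?" "(.*?)"')

status_counts = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        match = LINE_RE.search(line)
        if match and "Googlebot" in match.group(2):
            status_counts[match.group(1)] += 1

for status, count in status_counts.most_common():
    print(f"{status}: {count}")
```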

Google’s intent is to make it easier for sites of all sizes to get useful insights into Googlebot’s habits.

More data and info on Googlebot's crawls

Along with this tool, Google has also published a new guide to managing the crawl budget of large websites: as a site grows, crawling it can become harder, and it is useful to have a reference with the best practices recommended directly by Google (which, of course, can offer interesting insights to smaller sites as well).

And finally, still on the subject of crawling, Mueller recalls that Googlebot has started crawling over HTTP/2, the updated version of the protocol used to access web pages, which brings improvements that are particularly relevant to browsers and that Google has adopted to "improve our normal crawling".
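
HTTP/2 is negotiated between client and server, so Googlebot will only use it where the server supports it. As a rough way to check whether a site can serve responses over HTTP/2 at all, here is a minimal sketch using the httpx Python library (installed with its optional h2 extra); the URL is a placeholder.

```python
# Minimal sketch: check which HTTP version a server negotiates.
# Requires: pip install "httpx[http2]"
import httpx

def negotiated_protocol(url: str) -> str:
    # http2=True lets the client offer HTTP/2 via ALPN and fall back to HTTP/1.1
    with httpx.Client(http2=True) as client:
        response = client.get(url)
        return response.http_version  # e.g. "HTTP/2" or "HTTP/1.1"

if __name__ == "__main__":
    # Placeholder URL: replace with your own domain.
    print(negotiated_protocol("https://www.example.com/"))
```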

News on indexing

Moving on to indexing – defined as the "process of understanding and storing the content of web pages so that they can be displayed appropriately in search results" – Mueller shares two pieces of news in the video.

The first is the return of the URL Inspection tool functionality in Search Console (after a suspension of a few months), which once again makes it possible to manually submit individual pages to request indexing where appropriate. In most cases, though, the Search Advocate explains, "sites should not need to use these systems": focusing on good internal linking and good sitemap files is enough for "Google systems to be able to crawl and index the content of the site quickly and automatically".
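
As an illustration of the "good sitemap files" part of that advice, the sketch below builds a basic XML sitemap with Python's standard library; the URLs, lastmod dates and output path are placeholders, and a real site would generate the list from its own routing or database.

```python
# Minimal sketch: write a basic sitemap.xml from a list of (URL, lastmod) pairs.
import xml.etree.ElementTree as ET

# Placeholder pages; in practice these would come from the site's own data.
PAGES = [
    ("https://www.example.com/", "2021-01-15"),
    ("https://www.example.com/blog/post-1", "2021-01-20"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

# Writes the file referenced by the site's robots.txt or submitted in Search Console.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```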

The other piece of news also concerns Search Console, where the Index Coverage report has been significantly updated: it is now easier for site owners to understand the issues that affect the indexing of their content and, for example, the previous generic "crawl anomaly" status has been replaced with a more specific indication of the type of error.

Opportunities with links

As the last point in the summary, the Googler talks about links, which – along with "a lot of different factors in Search" – help Google "to find new pages and to better understand their context on the web": they are therefore an integral part of the web, and it is "reasonable for sites to think about them".

Mueller recalls that Google's guidelines list various "things to avoid regarding links, such as buying them", but also that many questions come from the community about "what sites can do to attract links".

The Search Advocate explicitly cites a "fascinating article by Gisele Navarro on the best link building campaigns of 2020" and, while he cannot "endorse any particular company that has worked on these campaigns, I thought they were ideal examples of what sites can do": in short, ideas to "think about some creative things that you might be able to do in the niche of your site".

In any case, the basic official advice is, inevitably, to create fantastic content: not always an easy job, but one that "can help you reach a wider audience and, who knows, maybe get a link or two".

Structured data, the testing tool will not disappear

Lastly, Mueller presents the latest news on structured data and, in particular, on the Structured Data Testing Tool which, as already mentioned in the July 2020 edition of Google Search News, Google has decided to deprecate in favor of the more modern and specific Rich Results Test.

The news is that the old tool will not disappear altogether, because “it will find a new home in the schema.org community”.

Also on the subject of structured data, it is worth remembering that, as of yesterday, January 31, 2021, Google no longer supports data-vocabulary.org markup for generating rich results: the decision was announced last year and initially set the cutoff for April 2020, but the company preferred to give webmasters more time because of the pandemic.

This decision does not prevent sites from using data-vocabulary markup, which remains valid independently of Google and can still serve other purposes beyond rich results on the search engine.
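
For sites moving away from data-vocabulary breadcrumbs for Google's rich results, the equivalent schema.org markup is typically emitted as JSON-LD; the sketch below, with placeholder page names and URLs, generates such a BreadcrumbList block in Python.

```python
# Minimal sketch: build schema.org BreadcrumbList markup as JSON-LD,
# the kind of snippet that replaces older data-vocabulary.org breadcrumbs.
# Page names and URLs are placeholders.
import json

breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://www.example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Blog",
         "item": "https://www.example.com/blog/"},
    ],
}

# The printed JSON goes inside a <script type="application/ld+json"> tag in the page.
print(json.dumps(breadcrumbs, indent=2))
```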
