Google guidelines, the compass for fair SEO strategies

We mention them frequently, because they are the compass to which every SEO professional should refer in order to adopt fair optimization tactics, the ones allowed in the competition for rankings on the search engine. Perhaps, though, it is time for a more specific focus to avoid mistakes: so today we talk about the Google guidelines, the document that sets out the SEO techniques and strategies in line with the search engine’s expectations, which therefore do not expose us to the risk of penalties, manual actions or other sanctions for violations.

Google’s new guidelines: Google Search Essentials

Around mid-October 2022, Google took another step toward deprecating the term webmaster from all of its official documents (as we recounted two years ago), launching a new page called Google Search Essentials, which gathers “the main parts of what makes your Web-based content (Web pages, images, videos, or other publicly available material that Google finds on the Web) suitable for display and good performance in Search.”

Most of the topics come from the previous Google quality guidelines and other related existing documents, but the content has been rewritten and updated by Google’s Search Quality Team, using more precise language and more current examples.

Specifically, the new guidelines are divided into three areas:

  • Technical requirements: what Google needs from a web page to show it in Search.
  • Anti-spam policies: the behaviors and tactics that can lead to lower rankings or complete omission from search results.
  • Key best practices: the main elements that can help improve a site’s appearance in search results.

Let us analyze these points individually to understand what the main aspects are on which, according to the search engine, we should focus our attention.

What are the technical requirements Google sets for a page

Technical requirements cover “the bare minimum that Google Search needs from a web page in order to show it in search results,” the guide says; in fact, “there are very few technical things you need to do on a web page, and most sites pass the technical requirements without even realizing it.”

More specifically, “it costs nothing to get a page into search results, no matter what anyone tries to tell you, because if a page meets the minimum technical requirements it is already eligible to be indexed by Google Search.” These minimum technical requirements stated by Google are three:

  1. Do not block Googlebot. Google only indexes pages that are publicly accessible on the Web and that do not prevent the Googlebot crawler from crawling them. If a page is private, for example because it requires a login to be viewed, Googlebot will not crawl it; likewise, if one of the various mechanisms that block Google indexing is in place, the page will not be indexed. To see the pages that are inaccessible to Google but that we would like to appear in search results, we can refer to the Google Search Console tools and, specifically, the Index Coverage report, the Crawl Stats report and the URL Inspection tool (to test a specific page).
  2. The page works, meaning it returns a 200 (success) HTTP status code to Google; pages that return client or server errors are not indexed. We can test the HTTP status code of a specific page with the URL Inspection tool (a quick external check is also sketched after this list).
  3. The page has indexable content: this means, in a nutshell, that the textual content is in a file type supported by Google Search (there is a list of all supported formats on this page) and that the content does not violate Google’s anti-spam policies.
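
To make these checks concrete, here is a minimal sketch in Python (using the requests library, with a hypothetical URL) that verifies the first two requirements from outside Search Console: that the page answers with an HTTP 200 status code and that it does not send an explicit noindex signal. It is only an illustration of the checks described above, not a replacement for the URL Inspection tool.

```python
# Minimal sketch: check that a page returns HTTP 200 and does not send
# an explicit "noindex" signal. The URL below is hypothetical.
import requests

def check_indexability(url: str) -> None:
    resp = requests.get(url, timeout=10)

    # Requirement 2: pages returning client/server errors are not indexed.
    status_ok = resp.status_code == 200
    print(f"HTTP status: {resp.status_code}", "(OK)" if status_ok else "(not indexable)")

    # Indexing can also be blocked at page level: a noindex directive
    # may arrive via the X-Robots-Tag HTTP header...
    header = resp.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        print("Blocked by X-Robots-Tag header:", header)

    # ...or via a robots meta tag in the HTML (crude substring check:
    # verify any hit manually or with the URL Inspection tool).
    if "noindex" in resp.text.lower():
        print("The HTML may contain a robots noindex directive.")

check_indexability("https://www.example.com/some-page")  # hypothetical URL
```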

However, the document warns, meeting these requirements does not automatically mean that a page will be indexed: indexing is never guaranteed.

What are the anti-spam policies in Google Search

The chapter devoted to the anti-spam policies that Google applies to protect users and improve the quality of search results is very extensive. As the dedicated page states, to be eligible to appear in Google’s web search results, content of any kind (web pages, images, videos, news content or other material that Google finds across the Web) must not violate Google Search’s general rules or the relevant anti-spam policies, which apply to all web search results, including those from Google properties.

The main and most frequent forms of spam cited by Google (which, however, can act against any type of spam detected) are:

  • Cloaking.
  • Doorways.
  • Hacked content, i.e., any content placed on a site without permission by exploiting vulnerabilities in the site’s security.
  • Hidden text and links.
  • Keyword stuffing.
  • Link spam.
  • Machine-generated traffic, i.e., automated traffic.
  • Malware and malicious behavior.
  • Misleading features (leading users to think they are accessing certain content or services that are not actually provided).
  • Scraped content, i.e., content scraped or copied from other sites.
  • Sneaky redirects.
  • Spammy automatically generated content.
  • Thin affiliate pages, the type of content countered by Product Reviews Update, for example.
  • User-generated spam, which remains the most frequent form of spam detected by Google, according also to the latest Webspam Report.

On closer inspection, these are all instances that fall under the broader umbrella of what is called Black Hat SEO, the set of bad practices that can expose a site to manual action by Google.

Then there are other situations that could prompt Google to downgrade in ranking or even remove a page or an entire site from Search, such as:

  • Removal requests for copyright infringement.
  • Online harassment removals.
  • Scam and fraud attempts.

Google detects infringing content and behavior both through automated systems (the SpamBrain algorithm) and, where necessary, through human review, which may result in manual action. In addition, if as users we believe that a site is violating Google’s spam policies, we can file a search quality user report, which Google will use, along with its other scalable and automated solutions, to further improve its spam detection systems.

What are the main SEO best practices suggested by Google

The last chapter of the Google Search Essentials guidelines focuses on key practices we can apply to improve site SEO, defined as “things that can have the greatest impact on the ranking and appearance of web content on Google Search.”

  • Create content that is useful, trustworthy and people-centric, in the wake of the recent Helpful Content Update.
  • Use keywords that people would use to search for that content and place them in prominent positions on the page, such as the title and main header of a page, and other descriptive positions such as alt text and link anchor text.
  • Make links crawlable, so that Google can find other pages of the site through the links on the page.
  • Talk about the site to people, i.e., “be active in communities where you can tell like-minded people about your services and products that you mention on your site.”
  • Follow best practices for content other than text, such as images, videos, structured data and JavaScript, so that Google understands those parts of the page as well.
  • Enable the additional features that are available and relevant to the site, to improve how pages appear in Google Search.
  • Use the appropriate method to control how content is displayed in Google Search if there is content that should not appear in search results or that we wish to remove.

What were the previous Google guidelines

Up until a few weeks ago, there was not really a single set of Google guidelines, because the search engine maintained various documents with tips on how to build a site so as to make it suitable for display on Google.

In particular, we can rely on four broad categories of guidelines, which apply to all websites that Google has added to its index:

  • General guidelines

Best practices that help a site be displayed on Google and gain visibility in SERPs.

  • Content specific guidelines

Additional suggestions related to specific content types on the site, such as images, videos, AMP, AJAX and sites optimized for mobile devices.

  • Quality standards

Descriptions of specific prohibited techniques that may lead to the omission of a page or site from Search results: if these practices are implemented, the site may be subject to manual action.

  • Instructions for webmasters

Basic guidelines for achieving sustainable SEO results, using techniques in line with Google’s expectations so as to avoid an algorithmic penalty or a manual penalty, which in severe cases can lead to the complete de-indexing of the site from Google SERPs.

Although they have now been superseded by the new Google Search Essentials SEO guidelines, it can still be useful to have an overview of how the previous versions of the guidelines worked and what they required (which for simplicity’s sake we will still discuss in the present tense, ed.), to gather as many details as possible and avoid potential missteps and future damage to the site, with information that in any case comes directly from the search engine.

What are the Google guidelines for webmasters

We begin with the very last category indicated, namely Google’s Webmaster Instructions, which are a collection of general best practices that can facilitate a site’s visibility in Google Search, and quality standards that, if not followed, can cause the page or the entire site to be omitted from Search.
First launched in 2002, for two decades these guidelines have defined the actions webmasters can take to improve the crawling and indexing of their websites, while at the same time pointing out the procedures that Google considers violations of these instructions, which can result in even heavy penalties.

The Bing search engine also has its own webmaster guidelines, which are roughly based on the same principles as the Google webmaster guidelines.

What the webmaster instructions say

The guidelines rest on three central aspects: webmasters should support Google in crawling (finding and scanning websites); they should help Google classify and “recognize” content (indexing); and, third, they should take care of usability and user experience, supporting users in navigating and using their pages.

In essence, according to the Google webmaster guidelines, a website should help the search engine find pages and understand pages (following all the rules related to content), as well as simplify the use of pages for visitors by ensuring a good user experience. Webmasters must first make sure to design pages for users, not for search engines; avoid tricks to improve search engine rankings; and, in general, make the site unique, valuable and engaging, offering quality content that is easy for people to understand and navigate.

These goals are easier said than done: creating websites that fully adhere to the Google webmaster guidelines is a challenge, one that can only be met by fully understanding Google’s directions (and prohibitions).

In general, what we can and must do is comply with the General Guidelines so that Google can find, index and rank our site, and then pay attention to the Quality Standards, avoiding the prohibited practices that may lead to the complete removal of a site from the Google index or to a manual or algorithmic anti-spam action against the site.

What is the goal of the guidelines

This document serves to clarify to all sites the rules of the game for ranking on the search engine, setting out the framework of rules for compliant search engine optimization: if you follow this established framework, you can expect that your website will not be penalized and that your content can consistently achieve good rankings.

The broader purpose of the guidelines, as Searchmetrics notes, lies in the interest search engines have in linking to “good” sites in search results: the better the sites referred to in SERPs, the more satisfied users are, and in turn this satisfaction increases trust in the search engine, which can thus strengthen its credibility in the eyes of advertising investors as well.

Main indications of Google guidelines for SEO

Let’s look now at the advice (or guidelines) that Google reserves for webmasters and SEOs, which allows us to understand what it means to be a quality site in the eyes of the search engine.

The general guidelines ask us to:

  • Simplify the structure of URLs as much as possible.
  • Avoid creating duplicate content.
  • Create crawlable links.
  • Qualify outbound links for Google by attributing the correct rel attributes, also the subject of the recent link spam update, which threatens sanctions against spammy links in guest posts and inadequately marked affiliate content (see the sketch after this list).
  • Verify that Googlebot has not been blocked.
  • Code the site appropriately if it offers sites or services intended for minors.
  • Ensure browser compatibility.
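
On the point about qualifying outbound links, the following minimal sketch (Python with the requests and BeautifulSoup libraries; the page URL is hypothetical) lists the external links on a page that carry no rel qualifier at all, which is where sponsored or user-generated links most often end up inadequately marked. Which rel value is correct (nofollow, sponsored or ugc) depends on the nature of each link and remains an editorial decision.

```python
# Minimal sketch: list outbound links with no rel attribute, i.e.,
# candidates for rel="nofollow", rel="sponsored" or rel="ugc".
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/blog/post"  # hypothetical page

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
own_host = urlparse(PAGE).netloc

for a in soup.find_all("a", href=True):
    target = urljoin(PAGE, a["href"])
    if urlparse(target).netloc in ("", own_host):
        continue  # internal link: no qualifier needed
    rel = a.get("rel") or []  # BeautifulSoup returns rel as a list
    if not rel:
        print("Unqualified outbound link:", target)
```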

To these are added another set of more specific indications, highlighted by Brian Harnish on Search Engine Land, which we will now explore in more depth.

Making resources completely crawlable and indexable

To help Google fully understand the content of the site, the guidelines suggest allowing “the crawling of all site resources that would significantly affect the rendering of pages, such as the CSS and JavaScript files that affect the understanding of the pages”.

Google’s indexing system renders web pages as users would see them, including CSS files, JavaScript and images.

Webmasters often block CSS and JavaScript via robots.txt, sometimes because of conflicts with other files on the site, other times because these resources cause problems when fully rendered: according to Google, however, we must not block them, because all these elements are essential to ensure that the crawler fully understands the context of the page.
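
A quick way to verify this is to test the site’s robots.txt against the Googlebot user agent for a given CSS or JavaScript file; the sketch below uses Python’s standard urllib.robotparser module (the site and file URLs are hypothetical).

```python
# Minimal sketch: verify that robots.txt does not block Googlebot from
# fetching rendering resources such as CSS and JavaScript files.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"  # hypothetical site root

rp = RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()  # downloads and parses the live robots.txt

for resource in (f"{SITE}/assets/style.css", f"{SITE}/assets/app.js"):
    allowed = rp.can_fetch("Googlebot", resource)
    print(resource, "->", "crawlable" if allowed else "BLOCKED for Googlebot")
```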

Having every page linked from another findable page

Google recommends that every page of our site receive at least one link from another page (so as to avoid orphan pages), whether through the navigation menu, breadcrumbs or contextual links (internal links).

These links should also be crawlable, since this ensures a good user experience and helps Google easily crawl and understand the site; we must also avoid generic anchor text for these links, using instead keyword phrases that describe the target page.

The site’s architecture can help strengthen the topical relevance of pages, especially when it is organized in a hierarchical structure, which allows Google to understand them better and deepen its knowledge of the topics covered.
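
To show what a check for orphan pages can look like in practice, here is a minimal sketch (Python with the requests and BeautifulSoup libraries; the site root is hypothetical, and we assume a flat, standard sitemap.xml that lists all pages) that flags sitemap URLs receiving no internal link from any other page.

```python
# Minimal sketch: flag "orphan" URLs, i.e., pages listed in the sitemap
# that receive no internal link from any other page of the site.
import xml.etree.ElementTree as ET
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SITE = "https://www.example.com"  # hypothetical site root
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(site: str) -> set[str]:
    """Read the URL list from a flat /sitemap.xml (not a sitemap index)."""
    xml = requests.get(urljoin(site, "/sitemap.xml"), timeout=10).text
    return {loc.text.strip() for loc in ET.fromstring(xml).iter(f"{SITEMAP_NS}loc")}

def internal_links(url: str, site: str) -> set[str]:
    """Collect same-host link targets found on a single page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    host = urlparse(site).netloc
    targets = set()
    for a in soup.find_all("a", href=True):
        target = urljoin(url, a["href"]).split("#")[0]
        if urlparse(target).netloc == host:
            targets.add(target)
    return targets

pages = sitemap_urls(SITE)
linked = set()
for page in pages:
    linked |= internal_links(page, SITE)

for orphan in sorted(pages - linked):
    print("No internal link points to:", orphan)
```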

Using a reasonable quantity of links

There is no maximum number of links per page and, in general, it is not possible to define a specific amount of links to always respect: here too, you have to think about the quality of the links and their usefulness for users, without adversely affecting their experience.

Google’s guidelines recommend using a “reasonable” number and indicate that a page can have up to a few thousand links at most (the earlier advice was not to exceed 100); “It is not unreasonable to assume that Google uses amounts of links as a spam signal,” says Harnish.
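
Since there is no official threshold, a quick per-page count at least helps spot outliers compared with the rest of the site; a minimal sketch follows (Python with the requests and BeautifulSoup libraries, hypothetical URL).

```python
# Quick check: count the anchor elements on a page, a relative signal
# to compare against the site's norm rather than an absolute limit.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/some-page"  # hypothetical page
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
print(url, "contains", len(soup.find_all("a", href=True)), "links")
```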

Creating a useful site, full of information

Google expressly says that in order to rank in SERPs you need to create “a site that is useful and rich in information, with pages that clearly and accurately describe the contents of the site”.

This means we must avoid thin content, understood not in terms of mere word count but in terms of the quality, depth and breadth of the topics covered in our pages, which are the factors that define their value.

Including keywords that users search for

Again, Google’s guidelines expressly require you to think “about the words that users might type to search for your pages and make sure they are included in the site”.

It is here that effective keyword research comes into play, which must first consider the intent of potential customers and their position within the funnel, in order to intercept them with appropriate content.

Once the keyword research phase is over, we must work on the usual on-page optimization process and make sure (at least) that each page of the site mentions its own target query.

We cannot do SEO without keyword research and targeting effective keywords, Harnish summarizes.
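
A minimal sketch of that last check (Python with the requests and BeautifulSoup libraries; the page URL and the target query are hypothetical) verifies whether a page’s title, main heading and body actually mention the query it targets.

```python
# Minimal sketch: check whether a page's <title>, <h1> and body text
# mention its target query. URL and query below are hypothetical.
import requests
from bs4 import BeautifulSoup

def mentions_target(url: str, query: str) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    query = query.lower()

    title = (soup.title.string or "") if soup.title else ""
    h1 = soup.find("h1")
    h1_text = h1.get_text(" ", strip=True) if h1 else ""
    body = soup.get_text(" ", strip=True)

    print("in <title>:", query in title.lower())
    print("in <h1>:   ", query in h1_text.lower())
    print("in body:   ", query in body.lower())

mentions_target("https://www.example.com/guide", "google guidelines")
```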

Designing the site with a very clear hierarchy of pages

As already mentioned, a hierarchical structure – an SEO silo – is the one that can offer the best opportunities in terms of SEO, because it allows you to organize the site’s subject matter into main topics, under which secondary ones are located, and so on.

There are two schools of thought on this point: the first advises siloing, i.e., the creation of a clear conceptual hierarchy of pages that covers the topic in breadth and depth, so as to also demonstrate specialization in the eyes of the search engine. According to the other theory, you should not stray from a flat architecture, meaning that no page should be more than three clicks away from the home page.

In general, these two theses can coexist, although SEO siloing perhaps offers a more cohesive organization of pages and topical discussions and is therefore often preferred.

Making the site’s important content visible by default

Google also asks for help in simplifying the crawling of content: although it is able to crawl HTML content hidden within navigation elements such as tabs or expandable sections, the search engine “considers this content to be less accessible to users and therefore considers it appropriate that you make the most important information visible in the default page view”.

In practice, we are advised not to use buttons, tabs and other navigation elements to reveal important content. Tabbed content (content divided into tabs) is among the least accessible to users, because only the first tab is fully viewable until they click to move to the next one, and so on, and for Google this is an example of limited accessibility.

The right approach to Google guidelines

These instructions describe what Google asks of site owners in its effort to guarantee search engine users the highest quality of the results shown in SERPs; the two big issues are spam (which is still quite strong, as the latest Webspam Report 2020 tells us) and poor content, but there are also many other negative practices from which Google invites us to stay away. The document also clarifies that there are other misleading behaviors, not listed, that can have consequences for sites, and that “it is not advisable to assume that Google approves a page just because it has not used deceptive techniques”.

For those who work in SEO, Google’s webmaster guidelines must be interpreted, precisely, as guidelines and not necessarily as rules to follow slavishly: it is clear, however, that any behavior at the limit (or beyond it) could result in a penalty or manual action, if Google notices the breach and considers it serious.

However, not all violations of the guidelines lead to penalties, because some errors simply cause problems with crawling and indexing, which can then affect rankings. If we find ourselves in critical situations of this kind, we have a number of techniques available to clean up the site, solve the problems and ask Google for a reconsideration, which could restore the site’s status.

And so, ultimately, trying to comply with Google’s guidelines in our SEO strategy should produce positive results in terms of ranking, albeit over longer times than aggressive and reckless techniques: this fair approach, moreover, can help us build and maintain a more stable online presence in Google SERPs, without the sword of Damocles of a manual action or an algorithmic devaluation always hanging over our pages.
