Google’s manual actions: what they are and how to correct them


For those who manage a site, there are few things more destabilizing than receiving a manual action notification from Google. That message in Search Console is like a sudden alarm bell that abruptly interrupts the routine and warns us that something has gone wrong, usually with immediate consequences: a drop in SERP rankings, a drastic loss of traffic and, in the worst cases, the exclusion of the affected pages from Google’s index. We need to understand what exactly a manual action is, however, because it is not a typical drop following an algorithmic update such as the famous Core Updates, nor is it technically a penalty “like those of yesteryear,” but a deliberate and targeted intervention by Google’s Search Quality team. When a reviewer identifies violations of Google Search Essentials, such as spam, unnatural links, or low-value content, they can penalize either specific pages or the entire domain, reporting the problem in detail. The good news is that a manual action is not a final condemnation: we have the opportunity to clean up the site, realign with Google’s standards and potentially return to the SERPs with even better visibility than before. In this guide we will go step by step to discover what manual actions are, analyze the most common types of violations, explore the tools to identify and correct them, and find out how to prevent future penalties, because addressing manual actions with awareness is the first step toward a stronger, more compliant and competitive site.

What are Google manual actions

Google manual actions are specific interventions applied by the Search Quality team to penalize sites that violate the rules set forth in Google Search Essentials, the official Google Search guidelines.


These are deliberate sanctions that arise from human scrutiny conducted by professional reviewers, charged with detecting manipulative practices or content detrimental to the user experience. The goal of these actions is to restore balance in search results, ensuring that only compliant and valuable sites can maintain their visibility.

Basically, a human being has examined our site and decided that it deserves a penalty: this human layer exists because too many websites manage to get past the algorithms even though they do not meet Google’s quality standards.

Unlike declines resulting from algorithmic updates, which occur automatically and without specific notifications, manual actions are in fact a targeted and specific tool, designed to deal with situations where a site does not meet the rules set by Google Search Essentials. More importantly, they are communicated directly through Search Console. In this way, Google not only reports the presence of a violation, but also gives us the tools and guidance to correct it with targeted actions and start the review process to recover lost visibility.

Manual actions for unfair practices or non-compliant content have consequences that vary depending on the severity of the violation. In some cases only individual pages are affected, but in others the entire domain may suffer total deindexing.

In each scenario, the impact is immediate and often devastating: plummeting positions in search results, loss of organic traffic and, consequently, failure of business goals. In addition to the operational consequences, however, there is also reputational damage: a penalized site may be perceived as untrustworthy, both by users and search engines.

What manual actions are for and why Google applies them

It has always been Google Search’s mission to provide users with accurate, relevant and useful answers, ensuring that search results are reliable and actually meet their needs. However, keeping this promise requires constant effort to curb attempts at ranking manipulation and to eliminate unsuitable or low-quality content. This is where manual actions come in.

In fact, according to Google’s official guide, manual actions were introduced to make sure that websites comply with the search engine’s guidelines, to ensure fair competition among websites, and to protect users from harmful or misleading content, which is why in most cases they aim precisely to curb attempts to manipulate the search index.

Since the origins of SEO, in fact, there have been actors who have tried to climb the digital heights (or simply the search engine rankings) using fraudulent methods : these practices, historically called black hat SEO, not only bury the most relevant content under a mountain of irrelevant results, but also make it harder for legitimate websites to get noticed.

And so, manual actions primarily serve two distinct but complementary goals. On the one hand, they protect the user experience, ensuring that those using Google encounter authentic and useful content, free of manipulative schemes or ethically questionable practices. On the other, they ensure integrity, fairness and transparency in the system, preventing sites that engage in non-compliant behavior from outperforming legitimate competitors that follow the rules.

To do this, Google takes a data-driven approach and relies on the work of analysts, researchers and statistical experts, who constantly assess the quality of search and also collaborate on appropriate changes to Google’s algorithms, which undergo rigorous quality control before being implemented to ensure, precisely, that search results are always as relevant and useful as possible.

When a site violates fundamental rules, such as the use of spam techniques or the publication of misleading content, a member of Google’s quality team can take direct action to restore the balance of the SERPs. The application of a manual action is thus a clear signal, but it must be said that this system is not intended to punish a priori, but rather to educate those who manage sites and stimulate them to improve their approach, aligning with standards capable of delivering benefits for both users and their own digital project.

And while algorithms today are remarkably effective at automatically detecting and removing spam from search results, there are times when more direct intervention is needed, entrusted precisely to human reviewers whose job it is to take specific steps to remove spam or overtly low-quality content from search results, thereby protecting the integrity of the index.

Why it is critical to address manual actions with urgency

Prompt action can make all the difference when faced with a manual action notification.

Ignoring the problem or delaying its resolution means accepting a progressive deterioration in site performance, which can have irreversible consequences in the long term. Every day that the site remains penalized is equivalent to a loss of traffic, ranking, and credibility. Considering that organic traffic is a major source of visibility for many digital businesses, acting promptly is essential to contain the damage.

Google offers clear tools and detailed messages to identify the nature of the penalty and allows direct action to be initiated to resolve the problem. This process, to be effective, requires a thorough analysis of reported violations, their correction in a methodical manner, and the preparation of a well-documented request for reconsideration. Only through a proactive and systematic approach does it become possible to recover lost ground and restore Google’s trust in our site.

In addition, the process of reviewing and resolving manual actions also becomes an opportunity to strengthen the overall quality of the site and often leads to improved overall SEO practices, creating a more robust project that meets long-term standards.

The main differences between manual actions, penalties, and algorithmic drops

The first step toward successfully removing the problem is awareness, and so it is important to understand the difference between a Google manual action, a penalty, and an algorithmic drop. These terms may sound similar, and all manifest themselves in a loss of traffic and visibility for the site, but they actually have very different dynamics and implications and require distinct approaches to diagnosis and resolution.

Manual actions are applied directly by a human reviewer when Google detects a specific violation of its guidelines. These actions are explicitly communicated through the Search Console, where Google describes the type of problem, including pointing to concrete examples of affected pages. This means that we are able to know exactly what critical areas need to be corrected and can take precise action to resolve them.

A Google penalty, in the classic sense, is an algorithmic intervention that removes from the search index the individual page or, in the worst cases, the entire site guilty of a violation. As we know, algorithms rely on a series of rules and calculations to automatically and quickly provide the result desired by the user who has entered a query; Google’s goal is to ensure that the answers are useful and relevant, so as to satisfy the search intent and provide a positive user experience.

For example, in the case of the celebrated Panda and Penguin algorithmic updates, Google’s ultimate goal was to demote websites in search results that did not meet its quality standards, as defined by the Webmaster Guidelines. Today, algorithmic penalties are rare, replaced (so to speak) by manual actions.

An algorithmic change, on the other hand, occurs automatically when new rules are introduced or ranking criteria are updated, as in the case of Google’s periodic core updates. In these cases, Google does not send direct notifications or indicate specific violations. Sites affected by an algorithmic drop do not necessarily violate the guidelines or have engaged in bad behavior, but they may lose visibility because they no longer respond effectively to the new ranking preferences implemented by Google or because there is a competitor that has done a better job.

Having clarified the theoretical aspects, the practical side remains: that is, in all cases we are faced with a drop in site traffic. This is where analytical skills come into play, because obviously each situation requires a distinct corrective strategy, and confusing a loss of page ranking caused by a manual action with a slump following an update can lead to embarking on a wrong and ineffective recovery strategy.

Basically, the presence of a notification in the Search Console confirms a manual action, while the absence of communication indicates that the problem probably depends on an algorithmic update. In addition, SEOZoom’s tools help us map the dates of core updates and understand whether there are temporal correlations between traffic declines and changes in overall ranking.

In the case of the algorithm, the site has not “done something wrong,” but its content is no longer considered by Google to be as useful and relevant as before (due to changed and inadequately satisfied search intent, emergence of better competitors, worsening page load times, and so on); recovering the positions, visibility, and traffic lost as a result of a Google update is also slow and sometimes complicated, because it can call into question so many different factors.

A website penalized by Google, on the other hand, has been responsible – more or less knowingly – for an explicit violation of the search engine’s guidelines, which is reported with a message in the specific manual actions report in Google Search Console.

In this case, the strategy to clean up the site must follow a very specific path, which also includes the need and opportunity to interact directly with Google – which is not the case with algorithmic drops – through the “reconsideration process” that we can activate after resolving the detected violation, with which we are invited to explain the origins of the problem and the interventions applied for resolution.

Differences between the Manual Actions report and the Security Issues report

There is one more distinction to clarify, however, which concerns Google Search Console, where we find the Manual Actions and Security Issues reports, which both aim to report critical issues related to the site, but differ in the nature and scope of the problems detected.

The Manual Actions report focuses on violations of ranking-related guidelines, such as attempts to manipulate the index (e.g., cloaking or unnatural links). These problems are not necessarily harmful to users, but they negatively affect a site’s ranking or may lead to its exclusion from search results.

By contrast, the Security Issues report points out direct risks to users, such as sites compromised by malware, phishing attacks, or unwanted software. These problems can result in visible warnings being added to search results or browsers blocking access, affecting both user safety and site reputation.

What are Google’s manual actions: the full list

Google’s manual actions represent selective and precise tools used to sanction non-compliant behavior that undermines the integrity of search results.

Below we find the complete and detailed list of reasons that can lead to a manual penalty, based on Google’s official documentation. Each point includes an in-depth explanation to help better understand the problem, the effects it generates on the affected site, and the main steps to resolve the situation.

  1. Cloaking or disallowed redirect commands

Cloaking is the practice of showing users a completely different version of the page than the one served to Google’s crawlers. This behavior, often used to fool the search engine with over-optimized content, violates the Google Search Essentials guidelines: it represents a serious attempt to manipulate ranking and can affect only parts of the site (partial match) or the entire domain (site-wide match).

Effects. The penalty can affect individual pages or the entire domain, causing heavy loss of ranking or complete deindexing from SERPs.

Correction. The recommended path to correction consists of several steps (a small detection sketch in Python follows the list):

  • Analysis of problem pages. Through the URL Inspection tool in Search Console, we can examine the affected pages of the site.
  • Content Comparison. We compare what is returned to Google with what is viewed by a human user while browsing the site. If we detect discrepancies, we need to isolate and identify the sections of the site that generate these discrepancies.
  • Elimination of discrepancies. We remove the parts of the site that show different content to Google than what is visible to users by analyzing the code directly on the server.
  • Checking for redirects present. We check for URLs that redirect users to unexpected destinations.
  • Correction of conditional redirects. We identify and remove any redirects that work only under certain conditions, such as the origin of traffic (e.g., from Google Search or specific IPs).
  • Analysis of scripts and configuration files. To identify the causes of unauthorized redirects, we check JavaScript scripts, the .htaccess file, the CMS used, and any plugins installed.
  • Request for reconsideration. Once we have modified the site and verified that it meets Google’s guidelines, we send a reconsideration request from Search Console, explaining in detail the actions taken.
  • Waiting for reconsideration. We wait for Google to complete its review of the changes. We receive a notification in Search Console with the outcome of the review, confirming the possible removal of the penalty.
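As a practical starting point for the comparison step above, here is a minimal sketch, assuming a Python environment with the requests library and a placeholder URL, that fetches the same page with a regular browser User-Agent and with Googlebot’s, then prints the differences. It only catches naive User-Agent-based cloaking: real verification should still go through the URL Inspection tool, since Google fetches from its own IP ranges and renders JavaScript.

```python
# Minimal cloaking check: compare the "user" view and the "Googlebot"
# view of the same URL. Only detects User-Agent-based cloaking.
import difflib
import requests

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch(url: str, user_agent: str) -> str:
    """Download the page body as seen with a given User-Agent."""
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    return resp.text

def cloaking_diff(url: str) -> list[str]:
    """Unified diff between the 'user' view and the 'Googlebot' view."""
    user_html = fetch(url, BROWSER_UA).splitlines()
    bot_html = fetch(url, GOOGLEBOT_UA).splitlines()
    return list(difflib.unified_diff(user_html, bot_html,
                                     fromfile="user-view",
                                     tofile="googlebot-view", lineterm=""))

if __name__ == "__main__":
    diff = cloaking_diff("https://www.example.com/page-to-check")
    print("\n".join(diff[:40]) if diff else "No User-Agent-based differences.")
```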
  2. Redirect commands not allowed on mobile devices

This specific problem, also called “sneaky redirects,” occurs when only users browsing from mobile devices are redirected to pages that are not visible to Google crawlers. Often related to scripts or misconfigurations, this type of redirect can occur intentionally, but also due to the insertion of problematic code by advertising partners or service providers.

Not all mobile user redirects are a violation, and there are cases where Google allows the mobile version of a site to display content slightly differently from the desktop version. The most immediate example is images, which often need to be modified to fit a smaller screen, but there are also cases where it is necessary to redirect mobile users from one URL to another for a better user experience.

One requirement must be met: as long as the redirect sends the user to a page that is essentially the same, this is a perfectly legitimate use.

Conversely, if mobile users are surreptitiously redirected to different content, a bad user experience occurs and a penalty is incurred.

Examples of permitted and disallowed redirects

Causes. Sneaky redirects from mobile devices are often unintentional and can occur without the direct knowledge of the webmaster. This commonly occurs when:

  • Code is added that creates redirect rules for mobile users.
  • A script or element is added to display ads and monetize by redirecting mobile device users.
  • Hackers add a script or element that redirects mobile users to a malicious site.

Precisely because they are frequently “unintentional,” it is a good idea to proactively check for sneaky redirects, monitoring in the URL Inspection tool the mobile versions of pages that contain code or script elements that redirect users, so as to avoid penalties.

Correction. The revision process for this problem requires a thorough analysis, which begins with determining whether the presence of such disallowed redirects is intentional or not. To correct sneaky redirects that are not intentional, we must first check the Search Console Security Issues report to see if the website has been hacked, and then review all scripts and third-party elements on the pages.

If the site has not been hacked, the next step is to investigate whether any scripts or third-party elements are causing the problem, following these steps (a quick comparison sketch follows the list):

  • Remove, one by one, any third-party scripts or elements over which we have no control.
  • Visit the site from a mobile device or emulator after each removal to see whether the redirect has stopped.
  • Once we have identified the script or element responsible for the sneaky redirect, remove it from the site. If that script is important, debug the problem, then reinstall it and verify that it works properly.
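To support these checks, a small sketch along the following lines, again assuming Python with the requests library and a placeholder URL, can compare where desktop and mobile User-Agents actually land. Client-side JavaScript redirects will not be caught this way, so a real device or emulator test remains necessary.

```python
# Compare the final landing URL for a desktop vs a smartphone
# User-Agent; a mismatch suggests a device-based redirect worth
# investigating. Only follows server-side (HTTP) redirects.
import requests

DESKTOP_UA = "Mozilla/5.0 (X11; Linux x86_64)"
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 14; Pixel 8) AppleWebKit/537.36 "
             "(KHTML, like Gecko) Chrome/120.0 Mobile Safari/537.36")

def final_url(url: str, user_agent: str) -> str:
    """Follow HTTP redirects and return the landing URL."""
    resp = requests.get(url, headers={"User-Agent": user_agent},
                        allow_redirects=True, timeout=10)
    return resp.url

page = "https://www.example.com/article"  # placeholder URL
desktop_dest = final_url(page, DESKTOP_UA)
mobile_dest = final_url(page, MOBILE_UA)
if desktop_dest != mobile_dest:
    print(f"Possible device-based redirect:\n"
          f"  desktop -> {desktop_dest}\n  mobile  -> {mobile_dest}")
else:
    print("Desktop and mobile land on the same URL.")
```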

The situation is obviously different if we are intentionally engaging in sneaky redirects: in that case, the correction process begins by making the necessary changes to comply with Google’s guidelines, specifically:

  • Confirm compliance by checking the site from a mobile device or emulator.
  • After making the necessary changes and completing the check, request a review by describing the problem sincerely, explaining how the error occurred and what specific actions were taken to prevent a recurrence.
  • Check the Search Console account, where Google will send a note that a review of the site has been performed. Assuming that the site no longer violates the guidelines, the manual action will be lifted.

Google also suggests a path to avoid and prevent running into impermissible mobile redirects. Preventive measures that improve user experience and ensure transparency in traffic are:

  • Testing the site on mobile devices. We regularly visit our site from a smartphone or use emulators to check for unintended redirects. Tools such as Chrome DevTools or similar features in Firefox and Safari can help us simulate site behavior on multiple devices.
  • Listening to user reports. User feedback can reveal problems that we had not identified. Complaints related to mobile browsing, such as redirects to irrelevant content, require immediate insights.
  • Constant monitoring with analytical data. We use tools such as Google Analytics to examine mobile user activity. Sudden fluctuations in average time spent on the site or a significant reduction in access from mobile devices may indicate redirect issues that need to be investigated.
  • Collaboration with reliable advertising providers. We choose transparent advertisers that properly manage redirects and user traffic. Industry best practices, such as those of the Trustworthy Accountability Group, are an excellent reference for ensuring that ad campaigns meet quality guidelines.

Effects. Google considers these redirects a negative experience for mobile users and may remove the involved URLs from the search index.

  3. Compromised images

Image cloaking also exists: Google penalizes pages in which the images shown in search results differ from those actually visible to users on the page, involving, for example, obscured images and thumbnails that do not match what the user is looking for. Typical forms of image cloaking include showing Google images that are obscured by another element, such as text blocking an image, or showing Google images that are different from those shown to a page visitor.

To prevent an image from being shown at original size in Google’s search results, we can disable inline linking by examining the HTTP Referer header of the request and configuring the server to respond with HTTP code 200 or 204 without providing content to requests coming from a Google domain. In this way, Google will continue to crawl the page and detect the image, but in the search results it will show only a thumbnail generated during the crawl. This solution does not require changes to the site images and is fully compliant with Google’s rules, so it is not considered cloaking nor does it pose any risk of penalties.
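As an illustration only, this is how the opt-out just described might look in a small Flask application: requests whose Referer comes from a Google domain get an empty 204 response, so Search keeps showing just the crawl-time thumbnail. The route and directory are hypothetical, and in practice the same rule is more often written in the web server configuration.

```python
# Sketch of Referer-based opt-out from inline image linking (Flask).
from flask import Flask, Response, request, send_from_directory

app = Flask(__name__)

@app.route("/images/<path:filename>")
def serve_image(filename: str):
    # If the image is requested with a Referer from a Google domain,
    # answer 204 No Content: Google keeps the crawl-time thumbnail
    # but cannot inline the full-size file.
    referer = request.headers.get("Referer", "")
    if ".google." in referer:  # crude check for Google referers
        return Response(status=204)
    return send_from_directory("static/images", filename)
```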

Effects. Infringing images are excluded from Google Images results , losing visibility and traffic opportunities.

Correction. To correct a compromised image situation, simply verify that you are showing the exact same image to Google and site users. If we have committed this violation, then we need to fix the problem and proceed with the reconsideration request.

  4. Non-natural links pointing to the site

Google strongly opposes buying backlinks or participating in link schemes aimed at improving the site’s ranking. Non-organic links , such as those created artificially or through automated programs, are considered a direct violation of Google’s rules.

Effects. Detection of a “non-natural, artificial, deceptive or fraudulent” link pattern is a violation of the spam rules and may result in the application of manual action to the site, on a partial or full basis.

Correction. If we suffer manual action due to unnatural links to the site or if we find that the backlink profile does not appear natural, we can take action by following the process recommended by Google:

  • Download the list of links to the site from Google Search Console, sorted by host name (Top linking sites > Export) or in chronological order (Links report > Export external links > Latest links).
  • Check the links to identify those that might violate Google’s guidelines. If the list is extensive, look first at sites that include multiple links to our site or at recently created links.
  • Contact the site owner to request that they remove the non-compliant links or prevent them from passing PageRank, for example by adding a rel="nofollow" attribute or a more specific attribute.
  • Disavow, with the link disavow tool, all links that we cannot remove or have marked nofollow. When using the tool, it is important to follow a few simple rules: first, try to manually remove problematic backlinks, because the disavow file alone may not be enough for Google. If several links come from the same domain, we can use the “domain:” operator to simplify the process (see the sketch after this list); at the same time, make sure not to include organic, quality links by mistake, because that could hurt our overall profile. Finally, indiscriminately disavowing all backlinks, without trying to contact the owners to remove them, could lead Google to reject our reconsideration request.
  • Send a reconsideration request after cleaning up the backlink profile. Include documentation on the removed links and an explanation for all links that could not be removed.
  • After the site review is complete, we will receive a notification in GSC: if Google determines that our site no longer violates the spam rules, the manual action will be lifted.
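For the disavow step, the file Google accepts is plain text with one URL or one “domain:” entry per line and “#” for comments; a tiny Python sketch like the following, with placeholder domains and URLs, can assemble it after the manual audit.

```python
# Build a disavow.txt in the format accepted by Google's disavow
# links tool: one URL or "domain:" entry per line, "#" for comments.
# Entries below are placeholders from a hypothetical link audit.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["https://blog.example/widget-page?ref=123"]

lines = ["# Disavow file generated after the manual link audit"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow whole domains
lines += bad_urls                              # disavow single URLs

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```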
  5. Non-natural links departing from the site

Google does not like (to put it mildly) to find a site that sells links or participates in massive backlink schemes without applying proper link attributes. These links are used to manipulate PageRank and compromise the overall quality of SERPs.

Effects. If Google identifies “a pattern of artificial, unnatural, deceptive, or fraudulent outbound links” with the intent to manipulate PageRank it may proceed with a sanction at the page level or affecting the entire domain.

Correction. The process to fix the problem begins by checking all links originating from the site and identifying those that are paid links or apparently violate Google’s rules regarding spam links (e.g., excessive link exchanges are found). The correction then proceeds as follows:

  • Remove the unnatural links, add a nofollow attribute, or redirect them via a page blocked by the robots.txt file.
  • Send a reconsideration request after cleaning up the site’s outbound links, documenting the links that were removed, redirected or given the appropriate attributes.
  • After the site review is complete, we will receive a notification in GSC: if Google determines that our site no longer violates the spam rules, the manual action will be lifted.
  6. Violations in structured data

Structured data, such as Schema.org markups, must accurately reflect the actual content of the page. If they are used in a misleading way, such as declaring nonexistent reviews, promoting invalid job postings, or other manipulative behavior, Google may apply a penalty.

The list of possible issues with structured data includes:

  • Page content different from the structured data, such as JobPosting structured data found on pages with no job postings.
  • Unable to apply on the job posting page, when pages with JobPosting structured data do not give users a way to apply for the job.
  • Structured data not matching the content, when the markup does not correspond to what the page actually shows.
  • Paid application, when pages with JobPosting structured data charge a fee for submitting the application.
  • Job application found on pages related to job vacancies, when pages with JobPosting structured data are about the job search, not the job offer.
  • Job offer author not hiring, when pages with JobPosting structured data collect applications without offering an actual position.
  • Problems with structured data on a page containing a list, when a single structured data element aggregates data from multiple elements: Google’s anti-spam rules require that on pages with a list of elements, each must have individual markup.
  • JobPosting structured data on an expired job offer page, when JobPosting markup for an expired listing appears without the validThrough property set in the past.
  • Page content other than structured data, such as ClaimReview structured data in pages that do not contain a statement check.
  • Missing reference for ClaimReview, or reference that does not match the page outcome, when pages with ClaimReview structured data do not include a supporting source or reference.
  • Structured data found on content that is hidden and not visible to the user.
  • No mechanism to submit new reviews: If a page includes a review, it must also provide a method to write reviews or clearly show the source.
  • Company marked as product in structured data.
  • Generic item or item not corresponding to a product labeled as product.
  • Review written by the site or person offering the service: Reviews should not be written or provided by the business or content provider, unless they are customer reviews, independent and editorial not paid for.
  • Structured data about the event that is in fact a promotion, when the visible text or the description in the structured data aims more at promoting or selling the event than at describing it.
  • Element not corresponding to an event (e.g., a vacation or coupon) labeled as an event.
  • Element not corresponding to a recipe labeled as such: a recipe must relate to a food and include both ingredients and steps.
  • Violation of structured data rules for one or more pages.
  • Incorrect employer: the employer in the hiring Organization field must match the employer listed in the offer.
  • Incomplete or incomprehensible job description.

Effects. The site loses access to enriched features in search results, such as rich snippets.

Correction. If we want to take advantage of structured data opportunities to enable enriched features in Google search results, we need to adhere to the specific instructions and check how the markup works; a small validation sketch follows the list below. In particular, it is necessary to:

  • Update existing markup and remove any markup that violates Google’s guidelines.
  • After making these changes, send a request for reconsideration and wait for the outcome of the review process.
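By way of example, a check like the one below, in Python with hypothetical markup, covers one of the JobPosting cases listed above: an expired listing must either be removed or carry a validThrough date in the past.

```python
# Sanity check for the "expired job posting" case: if the offer is no
# longer open, validThrough must exist and lie in the past.
from datetime import datetime, timezone

# Hypothetical JobPosting markup represented as a Python dict.
job_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Backend developer",
    "datePosted": "2024-01-10",
    "validThrough": "2024-03-01T00:00:00+00:00",
}

def valid_through_in_past(markup: dict) -> bool:
    """True if the validThrough property exists and lies in the past."""
    vt = markup.get("validThrough")
    return (vt is not None
            and datetime.fromisoformat(vt) < datetime.now(timezone.utc))

offer_still_open = False  # in a real site this comes from the job database
if not offer_still_open and not valid_through_in_past(job_posting):
    print("Violation risk: expired offer without validThrough in the past; "
          "remove the page or fix the markup.")
```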
  7. Scanty or low-value content

As we know, the presence on the site of sparse or thin content, which provides little or no added value for users, is a problem that can negatively affect ranking. In particular, Google penalizes pages lacking useful information, such as affiliate pages with no added value or pages hosting content from other sources (content from other sites or low-quality guest blog posts), but also the famous doorway pages, designed solely to rank on specific queries and redirect users elsewhere.

Effects. Again, the penalty is removal from the index, which can affect partially (only a certain number of pages affected by the problem) or the entire domain.

Correction. To solve this problem we must, first of all, identify the type of violation present on our pages, and then act to correct it. Our goal should always be to invest time and resources in creating unique and useful content, as required by Google.

  • Check our site for content that replicates content found elsewhere, pages with sparse content and affiliate links, or doorway pages.
  • Honestly assess whether the site actually offers useful content, possibly even asking friends or family members (real people not affiliated with the site) to use or objectively review the site to derive ideas for improving it.
  • Improve the website so that it adds significant value to users.
  • Send a request for reconsideration after resolving these issues, giving examples of poor quality content we removed and good quality content we added. Wait for the outcome of the review.
  8. Discrepancy between AMP content and canonical page

The content of the AMP version and that of the related canonical web page must be “essentially” the same: this does not mean that the text must necessarily be identical, but that there must be correspondence of topic and user actions (who must be able to perform the same operations in both versions).

Effects. If AMP users find significantly different content than in the other version, the AMP pages are excluded from Search and replaced by the respective canonical pages.

Correction. To avoid this penalty, it is good to be proactive and always check in advance that the AMP and canonical versions of a page match. If we do suffer a manual action for AMP page content mismatch, the correction work consists of these steps (a pairing-check sketch follows the list):

  • Verify that the AMP page is associated with the correct canonical page.
  • Use the URL Inspection tool in GSC to confirm that Google and users see the page the same way, both on the AMP page and on the canonical page; sometimes, in fact, the discrepancy can be caused by a robots.txt file blocking resources on one page or the other.
  • After “harmonizing” the AMP and canonical pages, we request review through Search Console.
  • We monitor the Search Console account, because this is where Google will inform us that the site review has been performed, and if everything has been well executed and we no longer violate the guidelines, the manual action will be lifted.
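A small Python sketch, assuming the requests and beautifulsoup4 libraries and placeholder URLs, can cross-check the pairing described above: the AMP page should declare rel="canonical" to the canonical URL, and the canonical page should point back with rel="amphtml".

```python
# Cross-check the AMP/canonical link pairing between two pages.
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def link_href(url: str, rel: str) -> str | None:
    """Return the href of the first <link rel=...> on the page, if any."""
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel=rel)
    return tag["href"] if tag else None

amp_url = "https://www.example.com/article.amp.html"  # placeholder
canonical = link_href(amp_url, "canonical")
print("AMP page declares canonical:", canonical)
if canonical:
    print("Canonical points back to AMP:", link_href(canonical, "amphtml"))
```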
  9. Hidden text or keyword abuse (keyword stuffing)

Violations also include practices (which by now should be) obsolete, such as hiding text behind images or using identical color and background or stuffing content with excessive keywords.

Effects. Affected pages immediately lose rankings and may be removed.

Correction. The path depends on how much of the site is affected by the problem, that is, whether it involves a limited number of pages or the entire domain. Either way (a naive scan sketch follows the list):

  • We check with the URL Inspection tool whether content is present that is visible to Googlebot but not to users visiting the site.
  • We look for the presence of text of the same or similar color as the page background. We can often detect such text by selecting all the text on the page, such as by pressing Ctrl + A or Command + A.
  • We look for hidden text using CSS styles or positioning.
  • We remove or modify the style of any hidden text so that it is detectable even to the human user.
  • We correct or remove any paragraph of repeated words without context.
  • We correct <title> tags and alternate text that contain strings of repeated words.
  • We remove any other instances of excess keywords.
  • After fixing the problem on all pages, we send a request for reconsideration and await the outcome of the review.
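As a first pass over these checks, a naive Python scan like the following, assuming beautifulsoup4 and a placeholder URL, flags text hidden via inline styles; it will not see rules in external stylesheets or computed styles, for which a rendered-page inspection is still needed.

```python
# Flag elements whose inline style hints at hidden text.
import re
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

HIDDEN = re.compile(r"display\s*:\s*none|visibility\s*:\s*hidden|"
                    r"font-size\s*:\s*0|text-indent\s*:\s*-\d{3,}", re.I)

html = requests.get("https://www.example.com/page", timeout=10).text
soup = BeautifulSoup(html, "html.parser")
for tag in soup.find_all(style=HIDDEN):
    text = tag.get_text(strip=True)
    if text:
        print(f"<{tag.name}> hides text: {text[:60]!r}")
```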
  10. Severe spam problems

This is one of the most severe penalties, applied to sites that use aggressive manipulative techniques, such as extreme cloaking, scraping content from other sites without adding value, large-scale content abuse, or other serious or repeated violations. Unlike other penalties, in the case of pure spam one cannot plead that the violation was unintentional.

Effects. The site is classified entirely as spam and removed from the index very quickly.

Correction. Google is very clear about such situations: to correct the problem, we must clean up the entire site and comply with Google’s guidelines. According to experts, a first penalty still allows us to “save” the domain by removing all malpractice and requesting a reconsideration review, in which we show Google examples of the bad quality content we removed and the good quality content we added. If Google determines that the pages indeed no longer violate the instructions, it will revoke the manual action. In case of a “relapse” into the same penalty, however, it becomes highly unlikely that Google will offer us another chance after its trust has been broken again, and we may even be better off “shutting down and starting over” from scratch with a new site/domain.

  11. User-generated spam

Typically, user-generated spam is found in forum and guestbook pages, blog comments, or user profiles, i.e., those pages that allow users to create or comment on content without moderation. Google detects spam in comments, posts, signatures, or usernames that promote products, suspicious links, or irrelevant content.

Effects. Pages affected by the action may be removed from the index, leading to a loss of traffic and visibility. It is essential to monitor and moderate user-generated content.

Correction. Again, it pays to be proactive and look for critical situations on the site before they escalate into manual action. The page cleanup process consists of these steps:

  • Identify pages where users can add content.
  • Look for obvious signs of spam, such as posts or profiles that look like ads; posts or profiles with out-of-context or off-topic links; posts or profiles with commercial usernames (names like “Discount Insurance” that don’t sound like real people’s names) that link to unrelated sites; posts or profiles that appear to have been automatically generated (not written by a real user).
  • Search the site for unexpected or spammy content by using the site: operator in Google Search and adding commercial or adult keywords that are unrelated to the topic of the site.
  • Remove all spam and inappropriate content.
  • Consider implementing measures to prevent the inclusion of user-generated spam, such as content moderation.
  • When the site is clean and no longer in violation, request a review from Google and wait for the process to conclude.
  12. Site being abused with third-party spam

Google detects that some sections of the site are being illicitly used to host spam content of little or no value, generated by site visitors or third parties. This content may include irrelevant text, irregular links or promotional information placed on interactive sections such as forums, guestbooks, social media platforms, file uploaders, free hosting services or internal search pages.

Effects. This manual action has an effect only on pages with spam content. In contrast to the case of pure spam, the “good news” is that this action implies that Google still considers the site to be of such quality that it does not initiate site-wide penalties. However, if the problem extends to a significant portion of the site, it can adversely affect the user experience and overall domain reputation and performance.

Correction. We must first identify and rectify the site violations, researching the pages where users, visitors, or other third parties might be adding content or interacting, such as forums, guestbooks, social media platforms, file uploaders, free hosting services, or internal search pages to which users may submit queries. The correction process then consists of several steps:

  • Examine sample URLs in messages received in Search Console or via email to better understand where the spam content appears.
  • Use the “site:” operator with commercial or adult keywords unrelated to the site topic to search for any unexpected or spam content; for example, [site:your-domain-name viagra] or [site:your-domain-name watch free movies online] can surface irrelevant content. We need to check for items such as out-of-context text or off-topic links whose sole purpose is to promote a third-party website or service (e.g., “Download free movies” or “watch online”); nonsensical or apparently auto-generated text; information or comments posted by users with unrealistic, commercial-sounding usernames (e.g., “Discount Insurance”) that link to unrelated sites; or internal search results in which the user’s query appears aimed at promoting a third-party website or service.
  • Monitor web server log files for unusual or unexplained spikes in traffic, especially for recently created pages. We also check for the presence of any keyword-pattern URLs that are completely irrelevant to the website, also using the performance report for Search to show the pages with the most clicks in Google Search.
  • Remove any inappropriate content and block the publication of clearly inappropriate third-party content on the platform with a list of spam terms, such as terms related to streaming, downloads, adult content, gaming and betting, or pharmaceuticals (a minimal filter sketch follows this list).
  • Consider grouping interactive content into a single file path to simplify maintenance and spam detection.
  • When the site is clean and no longer in violation, request a review from Google and wait for the process to conclude. In the meantime, continue to actively monitor and eliminate spam content, improve the site system, and fix system vulnerability to prevent this type of spam in the future.
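A minimal version of such a filter might look like the sketch below; the terms and the link threshold are purely illustrative, and production moderation would combine this with rate limits, CAPTCHAs and human review.

```python
# Naive blocklist filter for user-submitted content.
# Illustrative terms and threshold only.
SPAM_TERMS = {"free movies", "watch online", "casino bonus", "cheap pills"}

def looks_like_spam(text: str, max_links: int = 2) -> bool:
    """Reject submissions with blocklisted terms or too many links."""
    lowered = text.lower()
    if any(term in lowered for term in SPAM_TERMS):
        return True
    return lowered.count("http://") + lowered.count("https://") > max_links

print(looks_like_spam("Great post! watch online free movies http://spam.example"))
```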
  13. Free host containing spam

Google’s guidance is stark: a significant proportion of sites hosted on free web hosting services contain spam. In fact, truly free hosting rarely exists: what we save on the cost of the service is often paid back in performance, with substandard reliability and spam ads that we cannot control.

Effects. In addition to penalizing the individual site, Google deals harshly with this problem and reserves the right to take manual action on the entire web hosting service if it finds that a significant portion of its pages contain spam; therefore, other domains hosted by the same service may also lose visibility, regardless of whether they are compromised or not.

Correction. For Google, correcting this manual action is done by analyzing the illicit uses of the service, removing any existing accounts containing spam from the service, and contacting the hosting service’s technical support team to make them aware of the manual action. At this point, we can request a Google review.

To be even safer, it might be better to directly migrate to a different hosting, making a request for reconsideration once the process is complete.

  14. Google News and Discover violations

There are currently 16 different specific penalties for sites that publish fraudulent content, clickbait or non-transparent information in the Google News and Discover contexts.

More specifically, the non-compliant behaviors that violate Google’s specific rules concern:

  • Dangerous content, which could directly facilitate serious and immediate harm to people or animals.
  • Coordinated deceptive practices, i.e., pages or sites that conceal or misrepresent their identity, ownership, origin or purpose, including undisclosed financial or editorial relationships. Google penalizes content that confuses users, e.g., omitting country of origin information, falsifying editorial independence, omitting statements of relationships with political and economic interests.
  • Deceptive practices – good neighbor norms: content that steals the identity of other organizations or conceals information about the entity that originated it. Examples include concealing the country of origin, content targeting users in other countries with false premises, and misleading information about relationships or editorial independence.
  • Deceptive practices – identity theft: pages or sites pretending to be someone else, generating confusion in users and misrepresenting attribution. The satirical nature of content, for example, must also be explicitly stated to avoid running into this penalty.
  • Deceptive practices – misrepresentation of affiliation: violates the rules those who conceal or misrepresent significant editorial or financial relationships with other organizations or governments.
  • Deceptive practices – misrepresentation of location: content that conceals or misrepresents the country or location of origin, or engages in conduct that is inauthentic or designed to deceive, cheat or mislead.
  • Harassing content: content that constitutes harassment, threats or incitement to injurious behavior toward people. This also includes posting private information that could be used to threaten, denigrate or disparage victims of violence or tragedy, deny an atrocity, or carry out other types of harassment.
  • Content that incites hatred: content that promotes discrimination, violence or intolerance, amounting to harassment, bullying or threats. For example, material that foments hatred based on race, religion, gender or sexual orientation.
  • Manipulated media content: images, audio or video altered to mislead users about verifiable events or facts. Examples include deepfakes and videos edited to distort political or civic realities.
  • Medical content: Google does not allow content that contradicts established scientific or medical consensus. Thus, content that proposes miracle cures, medical information without scientific basis, or that contradicts evidence-based best practices is prohibited.
  • Misleading content: content that misleads users or entices them with promises of details that are not actually delivered, such as clickbait headlines or content that fails to deliver what it promises.
  • Sexually explicit content: content that includes sexually explicit images or videos, the main purpose of which is to provoke sexual arousal, which do not comply with search engine standards.
  • Terrorist content: content that promotes, incites or glorifies terrorist or extremist acts, such as recruiting material or glorifying extremist attacks.
  • Transparency: transparency is central to Google’s standards, and visitors must clearly know who is publishing and managing content. The violation covers cases where there is an absence of information about authors, masthead, publisher, related company or network, and contact information.
  • Violence and heinous content: content that gratuitously incites, glorifies or depicts incidents of violence; in addition, intentionally explicit or shocking content designed to disgust users is not allowed.
  • Vulgar and profane language: content characterized by offensive, obscene or deliberately provocative language without context or informational value, which is considered inappropriate for the platform.

Effects. Content is removed from the News and Discover sections, preserving editorial integrity. It should be clarified that a manual action for Google News or Google Discover does not affect our performance in Google Search, but only produces an impact on site performance in these news sections.

Correction. The path out of this type of penalty is always the same, for all problems:

  • Find and remove any content that, even remotely, may violate News or Discover policies regarding the banned topic.
  • When we have completed the review and made the necessary changes, send a request for reconsideration in Search Console, delving into providing sincere explanations of the issue and, in particular, “evidence of changed editorial practices, including new editorial guidelines and a timeline of improved editorial board practices.”
  • Wait for evaluation: if the Google team determines that the site is no longer in violation they will revoke the manual action.
  15. Abuse of site reputation

This violation occurs when a site hosts third-party content that exploits the reputation of the host domain to manipulate search results, violating Google’s spam rules. In practice, pages are published that have no direct connection to the main purpose of the site, but take advantage of the host site’s ranking indicators to gain greater visibility or strategic advantage. Examples of abuse of the site’s reputation include sponsored or advertising pages published by third parties, content from partners or external sources that are not directly related to the main site, pages created solely to improve the search ranking of other entities, without offering added value to users of the host site.

Effects. Manual enforcement for abuse of site reputation may affect specific domain pages, leading to their degradation in search ranking or complete de-indexing. However, if the violation persists or is repeated Google may take more severe measures and other manual actions, negatively affecting the overall ranking of the entire site.

Correction. The process for resolving this violation and regaining visibility obviously begins with eliminating the triggering problem: we detect and analyze the third-party pages published on the site, excluding them from the Search index. Once the site is “clean,” we send the reconsideration request, documenting the corrections made and demonstrating commitment to compliance, and await Google’s review.
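One technically simple way to exclude those third-party sections from the index, sketched here with a hypothetical Flask app and a hypothetical /sponsored/ path prefix, is to serve an X-Robots-Tag: noindex header on them; the same header can equally be set in the web server configuration.

```python
# Serve X-Robots-Tag: noindex on a path prefix hosting third-party
# content, so those pages are dropped from the index.
from flask import Flask, Response, request

app = Flask(__name__)

@app.after_request
def noindex_third_party(resp: Response) -> Response:
    # Mark every page under /sponsored/ as not indexable.
    if request.path.startswith("/sponsored/"):
        resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```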

How to identify manual actions: useful tools and signals

Identifying a manual action is the first step in addressing penalties and initiating an effective recovery process. Fortunately, as mentioned extensively, the tools offered by Google allow us to accurately identify the problem and understand its impact on the site, as well as start the work to recover any ranking and traffic we have lost.

Our main reference is the Google Search Console, the official and irreplaceable tool for detecting the presence of a manual action on a site. To access the Manual Actions report, simply enter the “Security and Manual Actions” section of the main menu. In this specific area, Google informs whether the site has been penalized and provides essential details about the type of violation detected.

Within the report, a number of crucial pieces of information are highlighted:

  • The type of infringement.
  • The extent of the penalty, which may affect only a few specific pages or extend to the entire site.
  • The examples of pages affected, to help better understand the problem and facilitate targeted action.

A key aspect of the report is the use of clear and direct messages, with links to official Google resources to elaborate on each violation. For example, if we receive a notification for “Problematic Structured Data,” the report will provide a link to Google’s guidelines on how to properly implement markups.

To assess the impact of manual actions, we can link Search Console data with analytical tools, analyzing metrics such as organic traffic and declining rankings in strategic keywords. With SEOZoom, for example, we can monitor changes in rankings and identify which pages or keywords have suffered a sudden loss of visibility. Integrating this data with the Manual Actions report gives us a complete overview of the situation.

Addressing the notifications in the report means undertaking a precise and structured resolution process. Once a violation is identified, the first step is to document all corrective actions taken to prepare a robust reconsideration request to Google. This approach not only removes penalties, but also strengthens the search engine’s confidence in the site.

How to approach troubleshooting

As described by analyzing individual cases, dealing with a manual Google penalty requires a methodical and well-structured approach, starting from the precise identification of violations to the adoption of documented corrective measures.

The tools at our disposal, combined with a precise strategy, allow us to analyze the problem, apply the necessary changes, and submit a detailed reconsideration request to have the penalty lifted.

Through the Search Console, we can identify problematic pages and check for any errors found, such as non-conforming links, problematic structured data or duplicate content. The tool also allows us to test the accessibility of pages for Google’s crawlers, checking whether important resources are blocked or inaccessible. It is always good practice to ensure that all pages on the site are visible to both users and Googlebot during the review.

At this juncture, we can also use SEOZoom to further deepen the analysis, particularly to address backlink or content issues. For example:

  • Backlink profile analysis, to monitor links pointing to our site, identifying suspicious or unnatural ones. This is particularly useful for violations related to link patterns, allowing us to identify and reject problematic links.
  • Content quality assessment: by monitoring pages in terms of visibility and keywords involved, we can verify whether the content has features of value to users or meets search intent.
  • Monitoring performance during fixes: once the necessary fixes are implemented, we can observe signs of organic recovery in ranking and search traffic, getting tangible confirmations of the effectiveness of the changes made.

How to submit a reconsideration request

Having solved the detected problem, the next step is to send a reconsideration request via the Search Console’s Manual Actions report. This is a crucial process because it is the only way to communicate directly with Google and demonstrate that you have taken the necessary steps to comply with its guidelines.

Writing an effective reconsideration request requires transparency, accuracy and clear documentation of the actions taken. It is not enough to simply state that you have fixed the problem: you need to provide comprehensive details that show your commitment to improving the quality of your site.

A good request should include several key points:

  1. Full description of the violation: explain the type of problem detected and admit any mistakes made. Honest and open communication shows Google that we understand the issue.
  2. Actions taken to resolve the problem: we describe step by step the actions applied. For example, if the penalty was related to non-natural links, we will indicate that we removed or disavowed the problematic links and implemented safe practices for the future. If the problem was related to low-value content, we will specify which pages were removed or improved.
  3. Supporting documentation: we will provide concrete evidence of the work done. This may include:
  • Screenshots of the changes made.
  • Updated, testable URLs demonstrating the actual intervention.
  • Lists of links in disavow or reports detailing structural changes.
  • Confirmations of passed tests using tools such as the Search Console URL Inspection tool or the robots.txt report.

When filling out the request form, the tone should be balanced and professional. It is helpful to be comprehensive without being long-winded, clearly highlighting our commitment to meeting Google’s guidelines. In addition, we need to focus on the progress achieved, without unnecessary justifications or attempts to minimize the problem.

Once the request is submitted, Google will proceed to review the site. Response times may vary, but generally feedback is received within a period ranging from a few days to several weeks. The result, communicated in the Search Console, can be positive, meaning that the manual action has been lifted, or negative, if Google feels that the corrections were not sufficient.

In the case of rejection, it is crucial to reevaluate the actions performed, correct any deficiencies, and submit a new request with more attention to detail. Each reconsideration process is an opportunity to demonstrate reliability and improve SEO performance over the long term, building trust between the site and Google.

Manual actions and Google Search Console, focus on useful tools

Having broadly clarified the picture of the possible violations we can run into and the practicalities of identifying and correcting problems, we can delve into the techniques for cleaning up the site and attempting to recover lost traffic and positions, seen directly from Google’s perspective. Guiding us in these operations is one of the episodes of the Google Search Console Training series, in which Daniel Waisberg sheds light on the Manual Actions report in Google’s Search Console, which serves precisely to give us useful indications in case our site is affected by a manual action that could affect its performance or even its very presence in SERPs, as the official guidance page also explains.

“Google is constantly working to improve Search,” Waisberg says at the beginning of the video, which is why changes to the search engine’s algorithms undergo a detailed quality assessment before official release. The algorithms are excellent at identifying spam and, in most cases, take automatic action to remove it from result pages.

To increasingly refine the quality of search results, Google scans specific sites that do not comply with policies and guidelines: in these cases, a human can analyze the site and possibly decree manual action. When this happens, a part or the entire site may lose positions in the rankings or even be excluded and not shown in Google’s search results.

Google’s guide on manual actions

The types of manual action

In the video, the Googler focuses on some of the main problems that generate manual action, explaining what they are and how to solve them to clean up the site and attempt to regain lost visibility, giving us as mentioned an additional perspective on this topic.

It starts with pure spam, “what many webmasters refer to as black hat SEO,” which includes techniques such as automatic generation of meaningless content, cloaking, scraping (the illicit use of content from other sites) and other shady practices.

Waisberg also reiterates that Google defines as “meager” or “thin” the low-quality content that offers information with little or no added value for users, and this is even more true after the August 2022 launch of the Helpful Content System, which reinforces the need for a site to provide useful content. Conversely, when a site has a significant amount of low-quality or superficial pages that do not provide substantially unique or useful content to users and constitute a violation of policy, it is exposed to manual action.

Manual actions for structured data issues are imposed if Google finds that some of the markup on the pages exhibits impermissible techniques, such as marking up content that is not visible to users, marking up irrelevant or misleading content, or other manipulative behavior.
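To get a sense of what a basic self-check might look like, here is a minimal Python sketch (assuming beautifulsoup4 is installed) with a hypothetical audit_jsonld helper that flags JSON-LD blocks whose marked-up name does not appear in the visible page text, one of the mismatches reviewers look for. It is only an illustration, not an official validation tool: Google’s Rich Results Test remains the authoritative check.

```python
# A minimal sketch: flag JSON-LD markup that describes content not visible
# on the page. audit_jsonld is a hypothetical helper, not an official tool.
import json
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def audit_jsonld(html: str) -> list[str]:
    soup = BeautifulSoup(html, "html.parser")
    visible_text = soup.get_text(" ", strip=True)
    warnings = []
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(tag.string or "")
        except json.JSONDecodeError:
            warnings.append("Invalid JSON-LD block")
            continue
        if not isinstance(data, dict):
            continue  # arrays/@graph structures would need deeper handling
        name = data.get("name", "")
        if name and name not in visible_text:
            warnings.append(f"Marked-up name not visible on page: {name!r}")
    return warnings
```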

How to use the Manual Actions Report in GSC

Google Search Console provides various tools to find out if manual actions have been issued against the site and view their history, with the ability to read all the details about them.

This gives us clear context about the site’s problems and history, which is also useful after a recent domain acquisition or at the start of a new consulting engagement. Two reports in particular can help us understand whether our site has problems: the Manual Actions report and the Security Issues report, which play distinct but complementary roles in monitoring the health of our website. They are therefore the first reference to consult when we suspect our site has received a manual action (along with the notifications and alerts section).


In short, the Manual Actions report informs us if Google has taken manual action against our site, including details about the specific problems found, such as the use of unethical SEO techniques. It focuses on issues that may affect our site’s ranking in Google search results, listing manually detected violations on a page or site that usually stem from attempts to manipulate rankings but are not necessarily dangerous to users. The effect of manual actions, as we have said, is a drop in the page’s or site’s ranking, or even its omission from search results, without users seeing any explicit indication of it.

The Security Issues report, on the other hand, deals with problems that can harm users visiting our site, such as phishing attacks, malware or unwanted software that could compromise the security of the user’s computer. Unlike the problems flagged in the other report, these can trigger a visible alert for users: typically a warning label in the search results, or an interstitial warning page displayed by the browser when someone tries to visit the site.
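For a quick external check of the kind of threats the Security Issues report covers, Google exposes the Safe Browsing Lookup API. Below is a minimal Python sketch, assuming the requests library and an API key created in Google Cloud Console (shown here as a placeholder); it asks whether a given URL is currently flagged for malware, social engineering or unwanted software.

```python
# A minimal sketch using the Safe Browsing Lookup API (v4). The API key is
# a placeholder; create one in Google Cloud Console before running this.
import requests  # pip install requests

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}"

def is_flagged(url: str) -> bool:
    payload = {
        "client": {"clientId": "site-audit-sketch", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url}],
        },
    }
    resp = requests.post(ENDPOINT, json=payload, timeout=10)
    resp.raise_for_status()
    return "matches" in resp.json()  # an empty body ({}) means no threats found

# Example: print(is_flagged("https://example.com/"))
```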

How to correct manual actions and site problems

For the purposes of our guide, we focus on the Manual Actions report, which allows us to take action to correct reported issues and thus attempt to recover lost Google rankings and traffic.

Clicking on any of the items in the report opens a summary screen that describes the problem, provides patterns of the affected pages, and indicates a method of resolution.

It is important to understand that we need to clean up all the pages affected by the problem, not just some of them: partial fixes will cause the reconsideration process to fail. A quick sweep like the sketch below can help confirm that no flagged page has been overlooked.
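As an illustration only, here is a minimal Python sketch, assuming we have listed the affected URLs from the report and know which spam footprints to look for (both the URLs and the markers below are hypothetical placeholders, to be adapted to the actual violation):

```python
# A minimal sketch: verify that known spam footprints are gone from every
# affected page before requesting review. URLs and markers are placeholders.
import requests  # pip install requests

AFFECTED_URLS = [
    "https://example.com/page-1",  # hypothetical pages from the report
    "https://example.com/page-2",
]
SPAM_MARKERS = ["hidden-links.html", "display:none"]  # hypothetical footprints

def unclean_pages(urls, markers):
    remaining = []
    for url in urls:
        html = requests.get(url, timeout=10).text.lower()
        if any(marker in html for marker in markers):
            remaining.append(url)
    return remaining

leftovers = unclean_pages(AFFECTED_URLS, SPAM_MARKERS)
print("Still to fix:", leftovers or "none - ready to request review")
```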

When we have completed the review of all pages with problems reported by Google, we can click on the “Request Review” button in the report to initiate a reconsideration request.

Submit a reconsideration request to Google

A request must describe the corrections made and meet three criteria in particular to be effective:

  • Accurately explains the site quality problem.
  • Describes the procedure performed to fix the problem.
  • Documents the result of the countermeasures taken (see the sketch after this list for one way to keep such evidence).
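Google does not mandate any particular format for this documentation, but a simple changelog of fixes, summarized or linked in the request, makes the third point easy to satisfy. A minimal Python sketch with hypothetical fields:

```python
# A minimal sketch of a fix log to support a reconsideration request.
# Google mandates no format; URLs, issues and fixes below are placeholders.
import csv
from datetime import date

fixes = [
    ("https://example.com/page-1", "unnatural links", "removed 12 paid links"),
    ("https://example.com/page-2", "thin content", "rewritten with original copy"),
]

with open("reconsideration-evidence.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "issue", "fix_applied", "fixed_on"])
    for url, issue, fix in fixes:
        writer.writerow([url, issue, fix, date.today().isoformat()])
```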

After submitting a request, we will receive a notification confirming that Google has taken charge of the review; when the review is complete, another message will inform us of the outcome, i.e., whether the reconsideration was accepted or rejected.

Acquisitions of old flagged domains: the procedure to follow

Waisberg also offers some advice to those who have recently acquired a domain affected by manual actions: in addition to performing all the required cleanup operations, the new owner can explain the situation in the request and pledge that, from then on, the site will follow Google’s guidelines. It is advisable not only to remove all the old problematic content, but also to add fresh, quality content before submitting the request.

Frequently asked questions about Google’s manual actions

Google’s manual actions often raise questions as well as concerns, particularly about the most effective methods of handling them and what the long-term implications are for a penalized site. In conclusion, therefore, we try to answer the most common questions, offering clear and concise answers to help understand how to deal with and prevent these penalties.

  1. How long does it take to remove a penalty?

The time it takes to remove a penalty depends mainly on two factors: the speed with which corrections are applied and Google’s review time. Once the reconsideration request is submitted through Google Search Console, the process can take a few days to several weeks, depending on the complexity of the issue and the volume of requests Google is handling. It is crucial to ensure that all reported issues are fully resolved before submitting the request to avoid delays or a rejection.

  2. How to avoid sudden penalties?

Prevention is always the best approach. To avoid manual actions, it is essential to comply with Search Essentials and adopt a transparent SEO strategy. You must create quality content, avoid manipulative practices such as backlink buying or cloaking, and use structured data accurately and in accordance with the guidelines. In addition, regularly monitoring the site using Google Search Console and SEO tools such as SEOZoom helps to detect early signs of risk, such as unnatural links or structured data problems.
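As one way to automate part of this monitoring, the Search Console API can be queried for daily click trends, where a sudden, sustained drop is worth investigating. A minimal Python sketch using google-api-python-client, in which creds is assumed to be an already-authorized credential object (the OAuth or service-account setup is omitted) and the site URL and dates are placeholders:

```python
# A minimal sketch: pull daily clicks from the Search Console API to watch
# for sudden drops. 'creds' must be an authorized credential object; the
# site URL and date range are placeholders.
from googleapiclient.discovery import build  # pip install google-api-python-client

def daily_clicks(creds, site_url: str, start: str, end: str) -> dict:
    service = build("searchconsole", "v1", credentials=creds)
    body = {"startDate": start, "endDate": end, "dimensions": ["date"]}
    resp = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return {row["keys"][0]: row["clicks"] for row in resp.get("rows", [])}

# Example: daily_clicks(creds, "https://example.com/", "2024-01-01", "2024-01-14")
```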

  3. Is it possible to recover ranking after a penalty?

Yes, it is possible to recover ranking, but the process depends on the severity of the penalty and the actions taken. Once Google lifts the penalty, organic traffic may not immediately return to previous levels, especially if the penalty has had a prolonged impact on the site’s reputation. In these cases, work must be done to rebuild trust in Google’s eyes by constantly improving content and adopting forward-looking SEO strategies. Recovery could take several weeks or months, depending on the situation.

  4. What does a manual action entail for advanced search features?

If the site has been penalized for structured data-related violations, it will most likely temporarily lose access to rich snippets or other advanced features in SERPs (such as review stars or product prices). However, once the violation is resolved and Google removes the manual action, these features can be gradually restored, provided that the site markup complies with the guidelines.

  5. What are the early signs of manual action risk?

Certain signs may indicate that our site is at risk of being penalized. These include a sudden increase in unnatural links pointing to our site, the accumulation of automatically generated or low-quality content, and the implementation of non-compliant SEO techniques. Monitoring Search Console data for warning messages or technical errors, as well as ranking trends via SEOZoom, can help us take action before Google applies a penalty.
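One practical way to spot the first of those signs is to watch the anchor-text distribution of incoming links: a heavy skew toward exact-match commercial anchors is a classic red flag. A minimal Python sketch, assuming a backlink export in CSV with an anchor column (the file path and column name are assumptions, to be adapted to your tool’s export format):

```python
# A minimal sketch: summarize anchor-text distribution from a backlink
# export. The file path and 'anchor' column name are assumptions.
import csv
from collections import Counter

def anchor_distribution(path: str, top: int = 10) -> None:
    with open(path, newline="", encoding="utf-8") as f:
        anchors = Counter(row["anchor"].strip().lower() for row in csv.DictReader(f))
    total = sum(anchors.values())
    for anchor, count in anchors.most_common(top):
        print(f"{anchor!r}: {count} ({count / total:.1%})")

# Example: anchor_distribution("backlinks-export.csv")
```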

  6. What happens if Google rejects the reconsideration request?

If the reconsideration request is rejected, Google usually gives indications as to which problems have not been properly resolved. In such cases, it is necessary to re-analyze the Manual Actions report in Search Console, identify gaps in the corrections made, and proceed with further action. A new request can be sent only after all required changes have been completed, taking care to document them clearly.

  7. Can a manual action affect only a part of the site?

Not all penalties affect the domain in its entirety: some manual actions affect only specific problematic pages or sections, such as a directory containing duplicate content or unnatural links. This provides an opportunity to take targeted action without having to rebuild the entire site. However, it is important to check all pages to ensure that the same problem does not recur elsewhere.

  8. Do manual penalizations have a permanent impact?

No, manual penalties are not permanent: they can be removed once the issues that caused them are resolved. However, their effects can linger over time, especially in terms of lost visibility and reputation. Even after the removal of the penalty, it may take time to regain SERP positions and Google’s trust.

  9. What tools are essential to prevent and address penalties?

Google Search Console is the primary reference for monitoring any manual actions and managing direct communication with Google regarding penalties and reconsideration requests. For more in-depth monitoring, SEOZoom’s SEO tools allow you to track backlink dynamics, identify critical content and analyze ranking trends, providing useful data that complements what Search Console offers.
