It is not always Google’s fault: sometimes the performance of our site on the search engine is hurt by technical problems or SEO errors that prevent it from expressing its true potential and reaching the desired positions. A study reveals the nine most common errors found on sites during field analyses, and the strategies to implement in order to correct them.
Site errors: here are the 9 most common
Only a few weeks ago we discussed the Techtarget case to show how, and how much, technical SEO problems actually affect ranking and organic traffic, and on other occasions we focused on the strategic importance of checking the site with an SEO audit to understand where the possible critical areas lie. The new research by Zazzle Media, published on Search Engine Watch, allows us to learn more about these difficulties and gives us some directions for recovery.
- Broken internal links
- Meta title and description length
- Redirecting internal links
- Outdated sitemaps
- Orphan URLs
- Site speed
- Hierarchy/Structure
- Internal linking
- Thin content
Analysis of thousands of sites
For this work, the UK content marketing agency carried out thousands of audits on sites of various sectors and sizes, finding some distinctive problems that come up over and over again: for instance, some CMS platforms have their flaws and repeatedly cause the same technical problems, but most of the time the critical issues stem from sites being run by multiple people, from knowledge gaps, or simply from the time factor.
1. Broken internal links
One of the simplest problems, but one that can easily be overlooked without a specific check, concerns broken internal links: broken connections that can interrupt the user’s path and prevent crawlers from connecting content. This error can negatively affect the authority of the page and disrupt the flow of link equity.
To find out whether there are internal links on the site that do not work, simply run a crawl with an analysis tool (such as our SEO Spider!), which returns all the error cases.
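For those who prefer a quick do-it-yourself check, here is a minimal sketch of the idea behind such a crawl. It is only an illustration, not a replacement for a dedicated tool: it assumes the `requests` and `beautifulsoup4` packages, and the start URL and page cap are placeholders.

```python
# Minimal sketch: crawl a site's internal pages and report links that fail or return an error.
# Assumes `requests` and `beautifulsoup4`; https://www.example.com/ is a placeholder.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"   # hypothetical site to audit
DOMAIN = urlparse(START_URL).netloc

def crawl(start_url, limit=200):
    """Breadth-first crawl of internal pages, collecting broken links."""
    to_visit, seen, broken = [start_url], set(), []
    while to_visit and len(seen) < limit:
        url = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            broken.append((url, "request failed"))
            continue
        if resp.status_code >= 400:
            broken.append((url, resp.status_code))
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"])
            if urlparse(link).netloc == DOMAIN and link not in seen:
                to_visit.append(link)
    return broken

if __name__ == "__main__":
    for url, problem in crawl(START_URL):
        print(f"{url} -> {problem}")
```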
2. Meta titles and description length
If Google has dedicated a video guide to optimizing preview snippets, it means that titles and meta descriptions are really important! In general, the most frequent errors for these fields concern how the length of the content is managed (partly because automated analyses cannot go so far as to assess the actual quality on offer), and in the worst cases they can negatively determine the fate of a business.
When they are too short, meta titles can be a sign of poor targeting; when they are too long, on the contrary, they can be truncated in the SERP and therefore be ineffective: in both situations the risk is failing to earn the user’s click and ending up with a low CTR.
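As a rough illustration, a script can flag titles and descriptions whose character counts fall outside common guideline ranges. The thresholds below are only rules of thumb (Google actually truncates by pixel width, not characters), the URLs are placeholders, and the sketch assumes `requests` and `beautifulsoup4`.

```python
# Minimal sketch: flag meta titles and descriptions outside rough character guidelines.
import requests
from bs4 import BeautifulSoup

TITLE_RANGE = (30, 60)         # rule-of-thumb character range, not an official limit
DESCRIPTION_RANGE = (70, 155)  # rule-of-thumb character range, not an official limit

def check_snippet(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""
    issues = []
    if not TITLE_RANGE[0] <= len(title) <= TITLE_RANGE[1]:
        issues.append(f"title is {len(title)} chars")
    if not DESCRIPTION_RANGE[0] <= len(description) <= DESCRIPTION_RANGE[1]:
        issues.append(f"description is {len(description)} chars")
    return issues

for page in ["https://www.example.com/", "https://www.example.com/blog/"]:
    for issue in check_snippet(page):
        print(page, "->", issue)
```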
3. Redirecting internal links
Redirected internal links can cause problems for the site architecture, as users and search engines take a little extra time to reach the content.
When content is changed or products run out, a permanent redirect (301) or a temporary one (302) is usually used (here is our guide to the HTTP status codes worth knowing): a 302 tells a search engine to keep the old page, since it is a temporary measure, while a 301 indicates that the page has been moved permanently and is replaced by the new location.
Redirect chains occur when a URL redirects to a page that in turn redirects to another page, and so on until the final destination is reached; if the chain circles back on itself it becomes a redirect loop. Both should be avoided at all costs, as they increase crawl time and can send mixed signals to search robots.
The problem does not lie in the redirect itself, if implemented correctly, but in the links that still point to the redirected URL: for example, URL “one” redirects to a new URL “two”, but URL “three” still points to URL “one”. Here too, a crawl lets us find all the critical cases, which we can then fix in the CMS by changing the href destination so that it points to the new, correct URL.
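A hedged sketch of that check: for each internal link collected during a crawl, request the target without following redirects and, if it answers with a 3xx status, report the final destination the href should be updated to. The `pages_and_links` sample below is hypothetical; in practice it would come from a crawl like the one shown earlier.

```python
# Minimal sketch: report internal links that point to a redirecting URL,
# so the href can be updated to the final destination. Assumes `requests`.
import requests

pages_and_links = {
    # source page              -> URLs it links to (hypothetical sample)
    "https://www.example.com/three/": ["https://www.example.com/one/"],
}

for source, links in pages_and_links.items():
    for target in links:
        resp = requests.get(target, timeout=10, allow_redirects=False)
        if 300 <= resp.status_code < 400:
            final = requests.get(target, timeout=10).url  # follow the chain to its end
            print(f"{source} links to {target} ({resp.status_code}); update href to {final}")
```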
4. Outdated sitemaps
XML sitemaps do not need to be static: it is instead recommended to use a dynamic sitemap.xml, so that the CMS automatically updates this file whenever we add a piece of content or a media resource.
It is nevertheless worth paying attention to how dynamic sitemaps are used, because you risk adding unwanted URLs.
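One quick way to spot outdated or unwanted entries is to fetch the sitemap and test the status code of each URL it lists. The sketch below is only illustrative: it assumes a plain `<urlset>` sitemap (no sitemap index) at a placeholder address and uses the `requests` package.

```python
# Minimal sketch: flag sitemap entries that no longer resolve with a 200 status.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, timeout=10, allow_redirects=True).status_code
    if status != 200:
        print(f"{url} -> {status} (consider removing or fixing this sitemap entry)")
```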
5. Orphan URLs
Orphan pages are indexed and published URLs that neither users nor search engines can reach by following internal links (and which may therefore never be crawled). A typical scenario for an orphan page is a seasonal sale: at one point the page was needed, but after the season changed it became obsolete.
Basically, the experts say, the presence of a few orphan pages is not harmful, but as they grow in number they can bloat the site, causing a poor distribution of link equity, keyword cannibalization, and a poor experience of the internal link paths for both the search bot and the user.
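In practice, orphan candidates can be found by comparing the URLs listed in the sitemap (or in analytics and log data) with the URLs actually reached by an internal-link crawl. The sets below are hypothetical stand-ins for those two collections, just to show the comparison.

```python
# Minimal sketch: orphan candidates are URLs present in the sitemap but never reached
# by following internal links from the home page. The sets are illustrative only.
crawled_urls = {
    "https://www.example.com/",
    "https://www.example.com/products/",
}
sitemap_urls = {
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/summer-sale-2019/",  # no internal link points here
}

orphans = sitemap_urls - crawled_urls
for url in sorted(orphans):
    print("Orphan candidate:", url)
```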
6. Site speed
By now we should be aware of the crucial role speed plays in site performance: with the Google Speed Update it has officially become a ranking factor. Site speed is closely tied to a good user experience, and slow websites have high bounce rates due to prolonged content loading (and potentially worse returns).
7. Hierarchy and architecture of the site
The hierarchical structure of the website, also known as information architecture, is essentially the way in which site navigation is presented to a search engine or user.
According to the study, the fundamental problem affecting most websites is how page ranking is distributed: the main pages of a website, or the most profitable ones, should be no more than three clicks away from the home page.
Without an effective hierarchy, crawl budget can be wasted and pages deep in the site may rank poorly, since Google is not sure of the importance of the page and link equity may be spread too thinly.
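Click depth can be estimated with a breadth-first crawl from the home page, flagging any page more than three clicks deep. A minimal sketch, assuming `requests` and `beautifulsoup4`, with a placeholder start URL and page cap:

```python
# Minimal sketch: measure click depth from the home page and flag pages deeper than 3 clicks.
from collections import deque
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # hypothetical
DOMAIN = urlparse(START_URL).netloc
MAX_PAGES = 200

depths = {START_URL: 0}
queue = deque([START_URL])
while queue and len(depths) < MAX_PAGES:
    url = queue.popleft()
    try:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    except requests.RequestException:
        continue
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == DOMAIN and link not in depths:
            depths[link] = depths[url] + 1
            queue.append(link)

for url, depth in depths.items():
    if depth > 3:
        print(f"{url} is {depth} clicks from the home page")
```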
8. Internal linking management
Internal links are an important feature of a site because they allow users to navigate between pages and, from an SEO point of view, allow search engine crawlers to understand the connections between contents.
An effective internal linking strategy can have a great impact on rankings, but often, especially on complex sites, we find rather messy situations: for instance, anchor texts that do not contain a keyword, inconsistencies in the volume of links each URL receives (which affects PageRank distribution), and links that do not always point to the canonical version of a URL. All this can send mixed signals to crawlers and ultimately confuse them when indexing content.
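As an example of that last point, a small check can compare a linked URL with the rel="canonical" declared on the target page. The sample link is hypothetical and the sketch assumes `requests` and `beautifulsoup4`; a real audit would run this over every internal link found in a crawl.

```python
# Minimal sketch: verify that an internal link points to the canonical version of its target.
import requests
from bs4 import BeautifulSoup

def canonical_of(url):
    """Return the rel="canonical" URL declared by the page, or the page itself if absent."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag["href"] if tag and tag.has_attr("href") else url

linked_url = "https://www.example.com/product?color=red"  # as found in an href (hypothetical)
canonical = canonical_of(linked_url)
if canonical.rstrip("/") != linked_url.rstrip("/"):
    print(f"Link points to {linked_url} but the canonical is {canonical}")
```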
9. Low-quality content
One of the pillars of SEO is offering quality content that is unique and useful for users: work that requires time and consistency and that, when not done properly, produces what we call thin content. These poor-quality pages do not make the business’s services and product offers understandable and are also explicitly contrary to Google’s guidelines, which in the worst case can result in a penalty.
Text length is a first parameter for understanding whether content is thin, but we know that in reality the word count is only a quantitative indicator and must be weighed alongside other factors. Still, focusing first on the shortest pages can be a way to start improving the quality of the texts on offer.
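A minimal sketch of that starting point: rank pages by visible word count so the content review begins with the shortest ones, remembering that word count is only a rough proxy. The URL list is hypothetical and the script assumes `requests` and `beautifulsoup4`.

```python
# Minimal sketch: sort pages by visible word count to prioritize a thin-content review.
import requests
from bs4 import BeautifulSoup

def word_count(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()  # drop non-visible text before counting
    return len(soup.get_text(separator=" ").split())

pages = ["https://www.example.com/", "https://www.example.com/service-a/"]  # hypothetical
counts = {url: word_count(url) for url in pages}
for url, n in sorted(counts.items(), key=lambda kv: kv[1]):
    print(f"{n:5d} words  {url}")
```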