Searches change, users change, but the result is still the same: the top positions on Google seem set by default, with three or at most four sites sharing the vast majority of the search-volume pie and all the others left to compete for the residual crumbs. This is not just a “feeling”: it is what is actually happening on the search engine, and it is what we have detected with SEOZoom by analyzing data from millions of Google searches and the performance of the sites that rank for them. What emerges in particular is the weight that branding and trust carry in how the search engine treats the pages of established brands.
Always the same sites on Google? SEOZoom’s findings
As reported to ADNKronos, the survey started from Ivano Di Biasi’s observations on recent Google SERP trends, which suggest that Google has become rather simplistic in choosing which sites should rank, and in which position.
In practice, it is easy to notice the same two or three sites, four at most in some areas, taking most of the traffic and holding the top positions, while everyone else shares the crumbs.
Put in more technical terms and with data in hand, SEOZoom’s Traffic Share metric shows that the top-player sites chosen by Google for each industry capture as much as 70 or 80 percent of all search traffic, leaving all other sites to fight over genuinely residual, minimal shares of clicks.
There is another critical aspect: Google seems to have already predefined who should be at the top for any topic, because in each vertical niche the same sites always monopolize the most visible positions on the search engine.
The analysis in the travel and food verticals: traffic always to the same sites
The screenshots below illustrate the situation.
In the travel vertical, queries for hotels in Italy always see Booking and Tripadvisor emerge, and between them they account for more than 50 percent of the search volume.
This applies, for instance, to “Hotel a Milano” (hotels in Milan), “Hotel a Napoli” (hotels in Naples) and “Hotel a Roma” (hotels in Rome).
The pattern is very similar in the cooking sector, where all searches see three major giants emerge. Whatever the keyword, Google always prioritizes GialloZafferano, Cucchiaio and Fatto in casa da Benedetta, which also manage to capture traffic through the rankings of their social channels.
The screenshot above summarizes the scenario for the saffron risotto query. GialloZafferano alone captures 52.5 percent of the total; Cucchiaio is second with 17 percent and Fatto in casa da Benedetta third at about 9 percent. Together they take roughly 78 percent of all clicks from Google for this search intent, and all the other sites fight over the remaining 22 percent of the traffic.
As you can see, the average level of optimization of these pages is similar, so optimization is not the discriminating factor for gaining visibility.
Let’s change query and analyze “torta di mele” (apple pie).
The traffic share is very similar to the previous one. In this case, GialloZafferano takes 42%, Cucchiaio 17% and Benedetta Rossi’s site 15%: again, together they achieve almost three-quarters of the traffic from Google (about 75% of the total), with very little room left for competitors.
And here too, the average level of optimization has relatively little influence on ranking.
As is characteristic of the Traffic Share feature, the analysis is not limited to the single keyword but extends to the overall search volume of all the keywords that relate to the same search intent: the need that drives people to search on Google and that the search engine translates into relevant results.
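To make the idea concrete, here is a minimal sketch in Python of how a traffic-share figure of this kind could be approximated: estimated clicks are aggregated per domain across all the keywords of one intent and then expressed as a share of the total. The CTR curve, keyword volumes and rankings below are purely illustrative assumptions, not SEOZoom data or its actual formula.

```python
# Minimal sketch: approximating a per-domain "traffic share" for one search intent.
# CTR curve, volumes and rankings are illustrative assumptions only.
from collections import defaultdict

# Hypothetical click-through rates by organic position (assumption).
CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

# Each keyword of the same intent: monthly volume plus the domains ranked 1..N (hypothetical).
intent_keywords = [
    {"keyword": "risotto allo zafferano", "volume": 22000,
     "ranking": ["giallozafferano.it", "cucchiaio.it", "fattoincasadabenedetta.it"]},
    {"keyword": "risotto zafferano ricetta", "volume": 6000,
     "ranking": ["giallozafferano.it", "fattoincasadabenedetta.it", "cucchiaio.it"]},
]

def traffic_share(keywords):
    clicks = defaultdict(float)
    for kw in keywords:
        for pos, domain in enumerate(kw["ranking"], start=1):
            clicks[domain] += kw["volume"] * CTR_BY_POSITION.get(pos, 0.02)
    total = sum(clicks.values())
    # Each domain's share of all estimated clicks for the whole intent.
    return {domain: round(100 * c / total, 1) for domain, c in clicks.items()}

print(traffic_share(intent_keywords))
```

The point of the aggregation is precisely the one made above: a domain’s share is measured against the whole intent, not against a single keyword.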
The impact of social channels
Beyond the clear predominance in the share of organic traffic gained, another interesting element concerns the relevance of social platforms (at least in the food sphere): for the queries examined there are often organic results drawn precisely from social platforms, usually from the official profiles of the top three players in the sector, as the detail of the Social Opportunities tool also shows.
GialloZafferano also manages to rank extensively for each of its recipes through its social channels, which are practically always in the top 10 for recipe-related queries.
Benedetta Rossi also benefits from her social profiles, although their placements are tied more to the entity and the brand than to individual recipes, an element that could be linked to Benedetta Rossi’s fame beyond social media and the Web.
This strong prevalence of off-site channels means that any action performed by these two big brands strongly shakes up the social graph and social signals: when they post something, even on social media, it echoes across the Web as well, even at the link level.
Analyzing one of their strongest competitors, Cookist, we instead see a significantly lower number of social placements (ten times fewer!), indicating the reduced authority this domain and brand has in Google’s eyes.
Google focuses on Trust and EEAT
According to Ivano Di Biasi, the core of this dynamic lies in the concept of EEAT, which we know stands for Experience, Expertise, Authoritativeness, Trustworthiness. Essentially, it is the overall set of factors and signals that measure how much Google trusts a website and considers it trustworthy enough to rank in its results.
The key word is “trust,” which is closely related to the authority acquired by domains and perceived by the search engine.
Our analysis seems to suggest that Google decides almost beforehand how much traffic should go to each website, and of course to which website, dividing up the entire traffic pie very unevenly and leaving only crumbs for sites it considers less trustworthy and authoritative than its “top” picks. One only has to look at the distribution of traffic with Traffic Share, which thus becomes a valid indicator for gauging the authoritativeness of a domain on each topic.
It is as if, in order to keep up with the current pace of technological innovation and quickly assess the veracity of content, Google had simplified its choices: ranking (and traffic distribution) appears predetermined because the search engine seems to have decided to rely almost blindly on the sources that have consistently offered reliable content.
This is somewhat what happens in the physical world, where anyone can express themselves, but the reliability, truthfulness and success of a piece of information depend on the source that provides it.
So much for volatility in SERPs!
Ivano’s comment is very stark: “It is as if Google (the company) has abandoned Google (the search engine).” Having chosen the top players for each industry, “they will always come out first, while all the others share the crumbs in rotation.”
In fact, our study also reveals that the organic ranking landscape is split: on the one hand we have the “untouchables,” but on the other we see a rotation dynamic that affects the visibility of sites that are not at the top of the hierarchy.
Delving deeper into the survey, in fact, we note that the ranking of sites that dominate the top positions in search results is remarkably stable and static. These sites, rated by Google as the most relevant and authoritative in a given field, “consistently maintain their positions at the top of the SERP and do not appear to change significantly,” even following the many dreaded algorithmic updates.
Quite different is the situation below: sites that do not occupy the top positions are in fact in a sort of limbo, placed in a cycle that we can call sinusoidal. In this cycle, the keywords associated with these sites experience fluctuations in terms of visits, with times when they generate a high number of visits and others when they seem to perform less well. This is not necessarily the result of SEO errors or poor strategic choices, but rather a deliberate choice on Google’s part.
In fact, the search engine seems to operate a kind of rotation among the sites that are not at the top, showing some of them in SERPs first and then others, alternating them. In this way, Google manages to give visibility even to sites it considers less authoritative or important, allowing them to gain exposure and traffic. Di Biasi compares this system to ad rotation in advertising, “where the remaining traffic, that 20-30% that does not go to the sites at the top of the SERP, is distributed among the sites positioned lower down, each with its own rotation priority.”
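As a purely conceptual illustration of the rotation Di Biasi describes, the sketch below distributes a residual traffic quota among non-top sites by weighted random rotation, echoing the analogy with ad rotation. The shares, weights and the mechanism itself are assumptions made for the example; they do not describe how Google actually allocates traffic.

```python
# Conceptual sketch: the "untouchables" keep a fixed share, while the residual quota
# rotates among the remaining sites according to a hypothetical priority weight.
import random

TOP_SHARE = 0.75                 # assumed share locked in by the top players
residual = 1.0 - TOP_SHARE

# Hypothetical rotation priorities for the non-top sites.
rotation_weights = {"site-a.com": 3, "site-b.com": 2, "site-c.com": 1}

def pick_visible_site(weights):
    """Pick which non-top site gets exposure this time, proportionally to its weight."""
    sites, w = zip(*weights.items())
    return random.choices(sites, weights=w, k=1)[0]

# Over many "impressions", each site receives a slice of the residual traffic
# roughly proportional to its rotation priority.
counts = {site: 0 for site in rotation_weights}
for _ in range(10_000):
    counts[pick_visible_site(rotation_weights)] += 1

for site, n in counts.items():
    print(site, round(residual * n / 10_000, 3))
```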
Traffic Share analysis: SEO implications
What can we infer about SEO?
First of all, as we have said in the past, we should forget about the single keyword and focus on the user’s entire search intent.
The best web page on a given topic is not necessarily the one that ranks first for the main keyword; the mechanism is not that simple. Other pages may have geared their content toward meeting every user need, trying to cover all the traffic related to the topic, not just the traffic tied to the single keyword.
This is why it is useful to use SEOZoom, the only tool that identifies the user’s entire search intent on any topic and finds all the keywords and relationships needed to write the perfect content the way Google likes. In particular, it is crucial to analyze the keyword in depth, not stopping at search volume or KD/KO, but also checking Traffic Share to see which site is really winning the game and what chance we have of capturing traffic.
In the case of “online shoes,” for example, the website ranked first for the main keyword actually gets 7.6 percent of the entire available traffic, while the real winner is the website ranked 28th in that SERP: the one we would simplistically count among the losers, yet it captures 40 percent of Google traffic on the topic.
According to Ivano Di Biasi, this situation confirms that EEAT has really become the predominant factor in Google ranking, particularly in the weight it carries in traffic distribution: “how much Google trusts the site and how trustworthy this site is.” At the same time, it means rethinking what it means to “be first on Google”: the goal is not to get a top ranking for a single keyword, but to become authoritative on the whole topic covered by all the related keywords. The analysis of “online shoes,” for example, reveals that it is not even the main keyword of the intent.
Strategies for gaining visibility
But what can sites do to try to win a few more crumbs and improve their ranking?
Ivano suggests three key strategies:
- Focus on the set of keywords related to the overall search intent, rather than individual keywords. This means thoroughly understanding what users are looking for and what their intentions are, and then creating content that comprehensively addresses those needs, thereby trying to compete for a bigger slice of the traffic pie.
- Strengthen the brand, including through social media campaigns. A strong and recognizable brand helps build trust and authority, two factors that Google considers important in assessing a site’s quality and trustworthiness.
- Work on links, including internal links, to build a strong network that supports the site’s authority. Links are like votes of confidence that help establish a site’s reputation in Google’s eyes, and internal links are among the pivots on which a site’s strength is built.
By following these pointers, it is possible to increase the perceived authority of a website and improve its ranking in search results, thus breaking out of the cycle of limited visibility and gaining a more stable position on Google.
The results of optimization: positions gained only with internal links
It is again our CEO who recounts the outcome of a recent optimization test for a cooking site. Ivano worked on a small number of keywords, analyzed in depth, and tried to strengthen the brand’s authority on the topic by improving the internal references within the site.
That is to say: he built a stronger network of internal links, in some cases even pushing with exact-match anchor text.
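To give an idea of what such an internal-linking pass can look like in practice, here is a minimal sketch that scans a set of pages for mentions of a target keyword that are not yet linked to the target page. The pages, keyword and URLs are hypothetical examples, and this is not the exact procedure Ivano followed in the test.

```python
# Minimal sketch: finding internal-link opportunities for a target page.
# Pages, keyword and URLs are hypothetical illustrations.
import re

TARGET_URL = "/ricette/torta-di-mele"       # hypothetical target page
ANCHOR_KEYWORD = "torta di mele"            # anchor text we want to reinforce

pages = [  # hypothetical crawl output: URL, text content, outgoing internal links
    {"url": "/blog/dolci-autunnali", "text": "La torta di mele è un classico...", "links": []},
    {"url": "/ricette/strudel", "text": "Se ami la torta di mele prova anche...", "links": [TARGET_URL]},
]

def internal_link_opportunities(pages, keyword, target_url):
    """Return pages that mention the keyword but do not yet link to the target page."""
    pattern = re.compile(re.escape(keyword), re.IGNORECASE)
    return [
        page["url"]
        for page in pages
        if pattern.search(page["text"]) and target_url not in page["links"]
    ]

print(internal_link_opportunities(pages, ANCHOR_KEYWORD, TARGET_URL))
# -> ['/blog/dolci-autunnali']  (a candidate page to link with the chosen anchor)
```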
This operation yielded immediate results: in just a few days, the site saw improved rankings for the entire keyword cluster of the search intent.
Admittedly, it did not undermine the dominance of the top three in the industry, but it did shake up part of the previous traffic-share distribution, a sign that something can still be done.