January 2025 Google Search News: AI, core updates and SEO
New year, but some things don’t change (luckily or unluckily?) and SEO certainly never stops. And neither does Google, as we are reminded by the first episode of 2025 Search News, in which John Mueller welcomes us with his usual mix of humor and professionalism, taking us straight to the heart of the latest news from the world of Google Search. Between relevant updates to Search Console, reflections on the current state of SEO and advances in artificial intelligence-related technologies, the topics covered range widely, offering interesting insights for those who want to stay competitive in the new year. In short, it’s shaping up to be another dense and busy year for those who work in SEO and for those who want to decipher the trends that will mark the coming months!
Highlights from the January 2025 Google Search News
To usher in the new year, the first installment of Google Search News of 2025 gives us an overview of the changes and updates that are already marking the evolution of Google Search, focusing on several areas of interest to those working with SEO.
Highlights covered by John Mueller include updates to Search Console functionality, with fresher hourly data and extended notifications in the Performance Report, technical interventions on crawling systems, important changes for structured signals in the SERP, and a recap of the two core updates that affected the volatility of result rankings. The episode also delves into the current state of SEO, thanks to data collected in the Web Almanac that opens up interesting reflections, and into features such as faceted navigation.
There is no shortage of hints about what’s new in artificial intelligence, where Google is pushing further and further with innovative and ambitious tools, nor suggestions with useful resources from the SEO community to improve dashboards or optimize business strategies.
Core update and other news from Google Search
The first focus of the Search Advocate is the release of the two core updates that marked the end of 2024, surprising experts and specialists with the timing of the interventions.
Between the November 2024 Core Update and the December 2024 Core Update, in fact, only a few days passed – to be precise, there was an interval of just one week between the end of the first and the start of the second! – and both updates affected “several components of the search engine,” causing major impacts on SERPs, as we pointed out in our analysis of the December 2024 Core Update. As always, Mueller does not provide specific details about the changes, but stresses the importance of monitoring site performance in light of these updates.
In parallel, Google has also fully rolled out a new anti-spam update – Site Reputation Abuse – designed to counter manipulative behavior, and has introduced more details in the official documentation. The new guidance specifically addresses publisher misconduct and explains more clearly which practices may be considered abusive in the management of a site.
For those who manage online content, carefully following these guidelines is crucial to avoid manual actions. Mueller urges everyone to consult the updated documentation to better understand the changes and to stay in line with the rules set forth by Google, but also to follow developments through the Search Status Dashboard, where all updates are tracked transparently – or, we would add, in our Google Update section!
Since these updates can affect both organic traffic and rankings, it is crucial to monitor your site’s progress carefully, using reliable analytical tools to detect significant changes and intervene promptly if issues arise. What’s more, the start of a new year is always a good time to reevaluate the quality of your content and check your site’s compliance with SEO best practices.
Goodbye to the sitelinks search box
Another major change concerns the final removal of support for the sitelinks search box – an advanced SERP feature that allowed users to perform an internal site search directly from a result. At the same time, Google also stopped using the structured data related to this feature.
Although this change may seem significant, Mueller reassures publishers that they do not need to take action on their sites to remove the associated markup. Structured data will continue to remain on pages, but without triggering specific functionality in SERPs. This decision is a step toward simplifying the visual and functional impact of SERPs, but it also highlights how it is increasingly important to focus on valuable content and intuitive site architecture to aid navigation.
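For reference, the markup in question is the WebSite/SearchAction structured data that Google’s documentation previously recommended for the sitelinks search box. A typical snippet (with a placeholder domain) looks like this and, per Mueller’s reassurance, can safely be left in place:

```html
<!-- Sitelinks search box markup: it no longer triggers the SERP feature,
     but leaving it on the page causes no harm -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": {
      "@type": "EntryPoint",
      "urlTemplate": "https://www.example.com/search?q={search_term_string}"
    },
    "query-input": "required name=search_term_string"
  }
}
</script>
```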
What’s new for the Search Console
Among the most important updates covered in the episode are the new features introduced for Google Search Console, which aim to further simplify the monitoring and management of websites through improvements that are increasingly focused on usability and timeliness of information. From the new Performance Report with hourly data to the extension of the recommendations system, the changes are designed to give site managers even more targeted and useful tools to work effectively.
Perhaps the most important intervention is the introduction of hourly data in the Performance Report, a feature that allows monitoring of organic traffic trends limited to the last 24 hours. This temporal granularity is an important step forward for those who manage sites subject to sudden spikes in traffic, such as eCommerce during peak season or sites that publish news.
With this innovation, it becomes possible to quickly detect abnormal fluctuations in user behavior or site performance, enabling timely interventions. The almost “live” nature of the data addresses a need felt by many SEO professionals, who can now act with greater precision in understanding the immediate impact of events, campaigns or updates to content.
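As a rough illustration of what “detecting abnormal fluctuations” can mean in practice – our sketch, not something shown in the episode – a simple z-score check over 24 hourly click counts (for example, exported from the Performance Report) flags hours that deviate sharply from the day’s average:

```python
from statistics import mean, stdev

def flag_anomalies(hourly_clicks, threshold=3.0):
    """Return the indices of hours whose click count deviates from the
    mean by more than `threshold` standard deviations (a simple z-score).

    `hourly_clicks` is a list of click counts, one per hour, e.g. as
    exported from the Search Console Performance report.
    """
    mu = mean(hourly_clicks)
    sigma = stdev(hourly_clicks)
    if sigma == 0:
        return []  # perfectly flat traffic: nothing to flag
    return [
        hour for hour, clicks in enumerate(hourly_clicks)
        if abs(clicks - mu) / sigma > threshold
    ]

# Hypothetical day: roughly stable traffic with one sudden spike at hour 18
clicks = [120, 115, 118, 122, 119, 117, 121, 116, 123, 118,
          120, 119, 117, 122, 118, 121, 119, 120, 900, 118,
          121, 117, 119, 120]
print(flag_anomalies(clicks))  # → [18]
```

A real monitoring setup would of course account for normal daily seasonality, but the idea is the same: with hourly data available, this kind of check can run within hours of an incident rather than the next day.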
Evolving Recommendations in Search Console
In the video, John Mueller also announces the extension of the previously introduced recommendations system, which will now be available to all eligible sites. These notifications help site managers detect and resolve recurring issues proactively.
Other new features reported include a check for homepage indexing status, which requires domain verification at the property level, and a notification system that identifies crawl budget problems in case of site crawling difficulties. The mechanism is set to trigger only when at least 5 percent of the site has crawling-related errors, signaling situations that could be temporary, such as server-related disruptions.
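To make the 5 percent figure concrete, here is a hypothetical sketch – ours, not Google’s actual logic, and which status codes Google counts as crawl errors is not specified – that estimates the share of problematic responses served to Googlebot from server-log status codes:

```python
def crawl_error_share(responses):
    """Given a list of HTTP status codes returned to Googlebot
    (e.g. parsed from server access logs), return the share of
    responses suggesting crawl problems (here: 5xx and 429)."""
    if not responses:
        return 0.0
    errors = sum(1 for status in responses if status >= 500 or status == 429)
    return errors / len(responses)

# Hypothetical sample: 2 server errors out of 40 requests -> 5%
statuses = [200] * 38 + [503, 500]
share = crawl_error_share(statuses)
print(f"{share:.1%}")  # 5.0%
if share >= 0.05:
    print("Crawl error share at or above 5% – worth investigating")
```

Running a check like this on your own logs can confirm (or anticipate) what the new Search Console notification reports.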
This development provides concrete help in monitoring the technical health of a site without having to perform in-depth manual analysis, and proves particularly useful for those who do not have constant, structured monitoring in place.
Restyling of Search Console emails
As part of an overall improvement effort, Google is also updating the design and content of emails sent through Search Console. As Mueller explained, the goal is to make messages more modern and uniform, not only in graphics but also in language.
Notable among the minor changes – which, however, also tell the story of the industry’s evolution – is the final abandonment of the term “webmaster,” now considered obsolete, in favor of more current and professional vocabulary. Although these changes do not directly impact sites, they do help improve the interaction experience with the tool, offering clearer and more relevant communications for those involved in SEO and web content management.
Web Almanac: a look at SEO in 2025
Among the most thought-provoking points featured in this episode of Google Search News is the focus on the chapter devoted to SEO in Web Almanac, a detailed analysis that offers valuable insights into what the current state of SEO looks like in the global landscape. The Web Almanac, compiled by industry experts and Google engineers, is powered by data from the HTTP Archive, a vast repository that collects information from millions of public websites. The large amount of data analyzed makes it possible to draw a comprehensive and articulate picture of the technical and strategic elements that characterize search engine optimization.
Among the many statistics reported, an interesting fact emerges: almost 84% of the sites analyzed have a robots.txt file, an essential tool for defining the rules of crawler access to one’s site. This figure confirms how crucial crawling management is considered by a large majority of websites, especially to improve loading times and optimize crawl budget consumption. However, it should be noted that robots.txt files are not always configured properly, suggesting that there is still room to educate site owners on how to make the most of this strategic element.
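As a quick illustration of how those access rules work in practice, Python’s standard-library `urllib.robotparser` can evaluate a robots.txt file against specific crawlers and URLs (the file below is a made-up minimal example):

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt, similar to what ~84% of the sampled sites expose
robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /cart

User-agent: Googlebot
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot matches its own group, which allows everything...
print(parser.can_fetch("Googlebot", "https://example.com/search?q=shoes"))
# ...while a generic crawler falls under the "*" group's Disallow rules.
print(parser.can_fetch("SomeOtherBot", "https://example.com/search?q=shoes"))
```

Note the detail the example surfaces: a crawler obeys only the most specific group that matches it, which is exactly the kind of subtlety behind the “not always configured properly” observation.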
The Web Almanac analysis goes further, exploring multiple aspects affecting contemporary technical SEO, including the adoption and implementation of schema markup, the spread of HTTPS protocols, and site architecture. For those involved in SEO, diving into the reported data can be an opportunity to compare their strategies with global trends and identify numbers and metrics to work on to improve their project’s performance.
Mueller urges SEO professionals to use these analyses as a starting point for more focused insights: the Web Almanac is a useful and thought-provoking map of the state of the industry, and can also guide upcoming decisions in an increasingly technical and competitive landscape.
Crawling: clarifications in technical documentation
Technical SEO is one of the pillars of a successful digital project, and this installment of Google Search News could not miss an in-depth look at crawling. In fact, Mueller reports the publication of a series of technical posts by the Googlebot team, which delve into the processes and dynamics that govern the behavior of Google’s crawler. These are resources that go well beyond a basic introduction and offer a detailed view of several often overlooked, but fundamental, aspects of crawling.
One of the main topics explored concerns the role of HTTP caching. This technology allows Googlebot to reduce the load on the site’s servers by caching the responses of already visited pages. Thanks to this approach, the resources requested by crawlers can be managed more efficiently, avoiding excessive consumption of the crawl budget. This detail is particularly relevant for those managing large sites, where the frequency of crawls directly affects overall performance.
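In practice, this revalidation relies on standard HTTP caching headers such as `ETag`/`If-None-Match` (and, in the same way, `Last-Modified`/`If-Modified-Since`). A simplified exchange, with placeholder values, looks like this:

```http
HTTP/1.1 200 OK
Content-Type: text/html
ETag: "abc123"

GET /page HTTP/1.1
Host: www.example.com
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
```

On the first crawl the server returns the page together with a validator (`ETag`); on a later visit the crawler sends that validator back, and if the content is unchanged the server can answer with an empty `304 Not Modified` instead of retransmitting the whole page.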
Another point addressed in the posts concerns best practices for managing faceted navigation, a common feature on e-commerce sites or sites with a complex structure. Faceted navigation allows users to explore site content through filters and combinations (e.g., by color, price, or category), but it can easily generate crawling and indexing problems. Google confirms that these configurations require specific optimizations, such as using canonical tags or blocking non-significant URLs in robots.txt, to avoid wasting resources and creating duplicate content.
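By way of example (the parameter names below are hypothetical), the two techniques just mentioned can look like this. On a filtered listing such as `/shoes?color=red&sort=price`, a canonical tag points search engines at the unfiltered category page:

```html
<link rel="canonical" href="https://www.example.com/shoes" />
```

And for filter combinations that should not be crawled at all, robots.txt wildcard rules can block them outright:

```
# robots.txt – keep crawlers out of low-value filter combinations
User-agent: *
Disallow: /*?*sort=
Disallow: /*?*color=
```

The trade-off is worth noting: canonical tags let the pages be crawled but consolidate signals, while robots.txt rules save crawl budget entirely – which option fits depends on the specific facets involved.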
These suggestions not only describe problems, but also offer practical solutions and approaches to take when in doubt about particular technical scenarios. Mueller adds that the documentation is designed as a reference not only to solve occasional issues or edge cases, but also to improve one’s understanding of Googlebot’s advanced mechanisms.
Artificial intelligence: Gemini, Mariner and beyond
Still on the subject of can’t-miss topics, the Google Search News episode also hints at two generative AI tools recently launched by Google, Gemini and Mariner for Chrome, which promise to expand possibilities both in terms of individual productivity and advanced search.
The novelty of the Gemini project lies in its Deep Research mode, a mode that aims to significantly improve the way searches are conducted. With this technology, concepts and information can be explored in a deep and structured way, offering support beyond what traditional search engines are capable of. It is an example of artificial intelligence designed to accelerate learning and decision-making processes, ensuring useful results even for those managing complex content.
Mariner for Chrome, on the other hand, shifts the focus to productivity by introducing an advanced control system for the browser. Although Mueller did not go into details, this tool seems intended to simplify web interaction through automations that reduce the time needed to complete recurring tasks or manage content. Mariner’s capabilities are still in an early stage and are not available everywhere, but they represent a significant step forward in the way you can interact with your browser through AI.
Insight from the SEO community: machine learning and advanced dashboards
As always, Google Search News also provides space for the SEO community, highlighting valuable contributions shared by industry professionals. In this installment, Mueller mentions three resources that can expand expertise and open new horizons for those working with SEO.
The first contribution concerns an article by Lazarina Stoy dedicated to the application of machine learning for SEO tasks. This complex but well-structured study is also designed for those with no particular coding experience, proposing practical solutions to automate and improve specific tasks related to optimization or data analysis.
The second resource mentioned explores the potential of Looker Studio, with an in-depth discussion edited by Daria Chetvertak showing how to use these dashboards to visualize and analyze information from Search Console, Google Analytics and other data platforms. The precise and easily interpretable visualizations offered by Looker Studio are a great tool for monitoring a site’s performance, identifying trends and communicating results to stakeholders.
Finally, Mueller cited a strategic study conducted by Dan Taylor, focused on how to gain consensus and support for SEO decisions from senior managers within large companies. Managing complex business dynamics has always been a challenge for SEO professionals, and the study offers suggestions for presenting proposals in a compelling way and aligning SEO work with business goals.
These three contributions, while diverse, show how the community continues to be a source of inspiration and an engine for innovation. For those who wish to delve further, they represent valuable tools for enriching their approach and facing the daily challenges of SEO work with greater awareness.