SearchGPT: what it is and how SEO is changing


The news had been in the air for some time (our own Ivano, for example, discussed it in his “SEO for AI,” outlining the SEO of the future), but it still made a very strong impact, generating a mix of interest, hopes and concerns. OpenAI has finally announced the birth of the first search engine entirely based on artificial intelligence: it is called SearchGPT and is currently only a prototype, available in limited access to a sample of users. The new system can retrieve data and information in real time and leverages OpenAI’s language models in an attempt to challenge Google’s monopoly on search, which for decades has seemed unbreakable. OpenAI seeks to offer an alternative built on a different approach, one that will inevitably impact tomorrow’s SEO as well!

What is SearchGPT

SearchGPT is an experimental artificial intelligence-based search engine developed by OpenAI that combines the most advanced AI models with real-time access to information available on the web.


Built on OpenAI’s advanced artificial intelligence models, such as GPT-3.5 and GPT-4o, the system is designed to offer fast, accurate, and highly relevant answers to user queries, going beyond the operation of a simple traditional search engine and aiming to make online search a more conversational and intuitive experience.

How SearchGPT works: the AI-powered search engine

The new search engine is, for now, a prototype limited to a select group of 10,000 users and publishers, meant to gather valuable feedback, as clarified by the official announcement published in late July on the OpenAI website.

The preview images released, however, give us a glimpse of SearchGPT’s appearance and operation, which closely resembles the GPT-4o interface known as “navigation mode.”

The user lands on a page with a search box that asks, “What are you looking for?” After the query is entered, SearchGPT provides an answer that includes links to relevant sources within the text, with additional results displayed in a sidebar.

This is where the big difference from classic search engines lies: after the answer appears, the user can ask follow-up questions or click on the sidebar to open other relevant links to deepen and refine their search. There is also a feature called “visual answers,” but no information has been released yet on exactly how it works.

Some searches take the user’s location into account: in a supporting document, OpenAI writes that SearchGPT “collects and shares” general location information with third-party search providers to improve the accuracy of results (for example, showing a list of nearby restaurants or weather forecasts). SearchGPT also allows users to share more precise location information via an option in the settings.

SearchGPT’s distinctive and innovative features

Unlike conventional search engines, which return a long list of often irrelevant web links, SearchGPT is therefore distinguished by its ability to provide organic and well-organized answers. Using deep learning algorithms and real-time access to web information, SearchGPT is in fact able to understand the context of users’ queries, offering answers that go beyond a simple list of results.

For example, if one searches for “the best restaurants in Rome,” SearchGPT will not only provide a range of options, but also offer detailed descriptions, reviews, and other relevant information, all supported by direct links to sources.

Another distinctive aspect of SearchGPT is its ability to follow the thread of conversation. Users can ask follow-up questions without having to repeat the initial context, making search similar to a conversation with a personal assistant. This level of interaction is made possible through a combination of advanced language models and dynamic access to real-time data. This approach facilitates in-depth analysis and clarity of information, saving time and reducing the need to formulate multiple attempts with different keywords.
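
OpenAI has not published the internals of how SearchGPT keeps track of a conversation, but the general pattern behind follow-up questions is easy to illustrate: each new question is interpreted together with the dialogue so far. The Python sketch below is a minimal, hypothetical illustration of that idea; the ConversationalSearch class and the answer_with_web_context stub are inventions for this example, not SearchGPT’s actual API.

```python
# Minimal sketch of conversational search state, assuming (hypothetically)
# that each follow-up question is answered using the full dialogue history.
# `answer_with_web_context` is a placeholder: OpenAI has not documented
# SearchGPT's internals, so no real retrieval or generation happens here.

from dataclasses import dataclass, field


@dataclass
class ConversationalSearch:
    history: list = field(default_factory=list)  # alternating user/assistant turns

    def ask(self, question: str) -> str:
        # The new question is interpreted together with everything said so far,
        # so "which of them take place in August?" keeps the original topic.
        self.history.append({"role": "user", "content": question})
        answer = answer_with_web_context(self.history)
        self.history.append({"role": "assistant", "content": answer})
        return answer


def answer_with_web_context(history: list) -> str:
    # Placeholder: a real system would fetch fresh web results here and
    # generate an answer that cites them inline.
    last_question = history[-1]["content"]
    return f"[answer with cited sources for: {last_question!r}]"


if __name__ == "__main__":
    search = ConversationalSearch()
    print(search.ask("Best music festivals in Boone, North Carolina"))
    print(search.ask("Which of them take place in August?"))  # follow-up, no repeated context
```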

A different approach with publishers and sites

Another distinguishing feature is OpenAI’s approach in working with publishers and content creators. According to the release, SearchGPT was developed in close collaboration with several newspapers and content platforms, ensuring that the information highlighted is of high quality and reliable. Publishers also have the ability to manage how their content appears in search results, offering more control and transparency than many other search engines.

OpenAI thus seems to have learned from Google’s mistakes (or at least from the long-standing controversy over the relationship between publishers and the Mountain View company) and has taken a markedly different path. The blog post points out that SearchGPT has been developed in collaboration with various news partners, including the owners of The Wall Street Journal, The Associated Press, and Vox Media, the parent company of The Verge, which have provided valuable feedback and will continue to be consulted for new input.

OpenAI is also “launching a way for publishers to manage how they appear in SearchGPT, giving them more options.” Importantly, SearchGPT is about search and is not related to training OpenAI’s generative artificial intelligence models: sites can appear in search results even if they choose not to participate in generative AI training, the company clarifies.

The company has likewise avoided the issues that its rival Perplexity has become entangled in: in June 2024 Forbes accused Perplexity of using pieces of its content in AI summaries without permission or citation, and Wired later accused it of ignoring the instructions in the robots.txt files of the sites it crawled. That is why SearchGPT is being presented as a more responsible and measured deployment, and why OpenAI emphasizes that there will be prominent “citations and links” to publishers in searches, with “clear, inline and named attribution,” so that users know where the information came from and can quickly access more results in a sidebar with links to sources.
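
In practice, the distinction between “appearing in search” and “being used for model training” is typically handled at the crawler level via robots.txt. The snippet below is only an illustration of that mechanism using Python’s standard urllib.robotparser; the user-agent names (GPTBot for training, OAI-SearchBot for search) follow OpenAI’s public documentation, but treat them as an assumption and verify them before relying on them.

```python
# Hedged illustration: a publisher can block a training crawler while allowing
# a search crawler in robots.txt. The user-agent names below are based on
# OpenAI's public documentation (GPTBot = training, OAI-SearchBot = search)
# and are used here only as an example.

from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: OAI-SearchBot
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for agent in ("GPTBot", "OAI-SearchBot"):
    allowed = parser.can_fetch(agent, "https://example.com/article.html")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```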

As Nicholas Thompson, CEO of The Atlantic, said, “AI search will become one of the key ways people browse the Internet, and it is crucial in these early days that the technology is built in a way that values, respects and protects journalism and publishers.”

OpenAI’s words: a new way to search online

The article shared on the official website also gives us insight into how OpenAI is promoting SearchGPT, described as “a prototype for new search functions, using the power of our artificial intelligence models to provide quick answers with clear and relevant sources.”

SearchGPT is designed to provide an answer: getting relevant results on the web today can take a lot of effort and often requires multiple attempts, whereas by enhancing the conversational capabilities of AI models with real-time information from the web, “finding what you’re looking for can be faster and easier,” the company adds.

In addition, the user will be able to ask follow-up questions, just as in a conversation with a person, with the shared context being built with each query.

Looking ahead, the article adds, OpenAI plans to integrate the “best” parts of the new search features directly into ChatGPT.

Early tidbits about SearchGPT

This new product has been the subject of rumors for months, with some sites anticipating its development as early as February 2024 and Bloomberg giving more details in May, a month in which some sources also spoke of OpenAI’s attempt to aggressively poach Google employees for a search team.

OpenAI is slowly bringing ChatGPT closer to the real-time web: by the time GPT-3.5 was released, the AI model’s knowledge was already months out of date, and in September 2023 OpenAI released a way for ChatGPT to browse the Internet, called Browse with Bing, which appears much more rudimentary than SearchGPT.

OpenAI’s rapid progress has won ChatGPT millions of users, but the company’s costs are rising. The Information reported this week that OpenAI’s AI training and inference costs could reach $7 billion this year, with the millions of users on ChatGPT’s free tier further increasing computational costs. SearchGPT will be free during its initial launch and currently carries no advertising, so the company will clearly have to find a way to monetize soon; placing SEA ads in the search engine could be one solution.

Speaking of curiosities, the SearchGPT presentation was not free of bugs, much as happened with Google’s infamous Bard demo. As U.S. observers noted, in one of its demonstration videos SearchGPT answered a query about [music festivals in Boone, North Carolina, in August] by showing a list of festivals it claimed would be held precisely in Boone in the requested month. The first name mentioned is An Appalachian Summer Festival, which according to the tool takes place from July 29 to August 16 this year, but which actually began on June 29 and closed on July 27 with a final concert.

Another little tidbit: at the moment, googling [searchgpt] brings up, among the first results, https://searchgpt.net/index.html, which is not owned by OpenAI. It is a site promoting an AI-based extension for Chrome and Bing browsers, promising to integrate ChatGPT into Google (without any particular information about the developers, company, or other official references). Typing https://searchgpt.com/, instead, triggers a redirect to the ChatGPT login page (to be precise, to https://chatgpt.com/auth/login).

Search engines compared: SearchGPT vs. Google

With this in mind, SearchGPT ambitiously positions itself as an alternative to Google and other incumbent search engines, bringing a breath of innovation to the online search industry.

The promise of smoother and more natural interaction, coupled with the accuracy and relevance of the answers, marks a potential turning point in how users search for and interact with information on the web.

The differences between SearchGPT and Google manifest themselves primarily in the approach to search and the presentation of results. Google, the unchallenged ruler of the industry for years, relies on complex algorithms to index and rank billions of web pages, returning users a list of links sorted by relevance. This methodology, however, often implies the need to navigate through many results to find accurate and relevant information. In contrast, SearchGPT significantly reduces this complexity by offering direct and concise answers supported by verified sources.

To put it better: instead of returning a simple list of links, SearchGPT tries to organize them and make sense of them. In one of the examples provided by OpenAI, the search engine summarizes its results on music festivals and then presents brief descriptions of the events, followed by an attribution link.

A key aspect that distinguishes SearchGPT is its ability to support contextual conversations. When users interact with Google, a new query generally means a fresh start, as in the pogo-sticking phenomenon that often forces users to redefine their searches. In contrast, SearchGPT can remember the context of previous queries, enabling a continuous and more natural dialogue. This reduces the need to repeat or rephrase queries to obtain relevant answers, improving search efficiency and fluidity.

SearchGPT also incorporates an artificial intelligence-based personalization element that interprets user intentions accurately. With real-time access to information, SearchGPT can provide up-to-date and contextualized answers, an area in which Google has historically demonstrated shortcomings, especially on rapidly evolving topics. In addition, answers provided by SearchGPT come with clear attribution, with direct links to sources, offering transparency and ease of verification, an aspect that can sometimes be less evident in Google search results.

On the collaboration front with publishers, SearchGPT takes a proactive approach to ensure that the information provided is of high quality and respectful of authors’ work. OpenAI has worked closely with various publishers and content creators to ensure that their work is accurately represented and benefits from the visibility generated by searches. Publishers have the ability to manage how their content appears in search results, deciding whether they wish to participate in training artificial intelligence models or simply opt for organic visibility.

Google, while making strides in integrating information from publisher sources through features such as featured snippets, does not offer the same level of direct control to publishers. This sometimes leads to conflicts between the visibility of original content and its use by Google search tools, with accusations that Google sometimes cannibalizes traffic destined for publishers’ websites.

Google’s problem: the status quo and monetization

In short, given these premises, we may really be facing something that could become a significant threat to Google, which is itself trying to integrate AI features into its search engine, fearing that users will move to competing products.

However, as Ivano also explained in a recent interview, the Mountain View giant is stuck in a delicate position: on the one hand, it has advanced AI technology, and on the other, it is locked into a business model that depends heavily on ads and publisher-generated content. This model, based on advertisements and traditional SERPs, would be disrupted if AI were fully integrated into its search functions.

It therefore initially experimented with integrating AI into its SERPs with Search Generative Experience (SGE), later renamed AI Overviews, but this technology, however advanced, has not been fully rolled out. The reason is that doing so would undermine its core business: if Google started providing complete answers directly through AI, it would drastically reduce the need to click on sponsored links or visit publisher websites, effectively eliminating the main source of its advertising revenue.

Thus, the challenge for Google – on which its very future as a search engine also depends – lies in finding a business model that effectively integrates AI without sacrificing advertising revenue. If Google succeeds in this endeavor, it can continue to dominate the online search landscape. If, on the other hand, it fails to find a suitable solution, it may leave room for new players that are not constrained by the same economic and structural problems. These new players, thanks to their AI technologies, could become the search engines of the future, completely redefining the dynamics of the industry.

Google and SearchGPT: Giuseppe Liguori’s explanation

Our own Giuseppe Liguori, CTO and co-founder of SEOZoom, also provided more details, clearly and aptly explaining how current search engines work and what potential revolutions SearchGPT may bring.

Currently, search engines like Google work like this:

It all starts with spiders, systems that scan the web, find web pages and place them in an index; various calculations, known as ranking factors, are then applied to this index to determine which results will appear when a user types a query into the search bar. Google’s algorithm alone takes into account thousands of factors, often tied to business logic, and these factors change frequently, through both targeted algorithmic updates and more general broad core updates. This constant updating makes life complicated for SEO professionals, forcing them to stay up to date to avoid being penalized.
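
To make that classic pipeline concrete, here is a drastically simplified Python sketch: pages (assumed already crawled) are placed in an inverted index and then ranked against a query using a single toy signal, term frequency. Real engines combine thousands of ranking factors, and nothing here reflects Google’s actual implementation.

```python
# Drastically simplified sketch of the classic pipeline described above:
# pages are collected, placed in an inverted index, and ranked against a query.
# The only "ranking factor" here is raw term frequency, purely for illustration.

from collections import defaultdict

# Toy "crawled" pages (the spider step is assumed to have already run).
pages = {
    "page_a": "rome restaurants guide best restaurants in rome",
    "page_b": "weather forecast for rome this week",
    "page_c": "best pizza restaurants ranked by locals",
}

# Indexing: term -> {page: term frequency}
index = defaultdict(lambda: defaultdict(int))
for page_id, text in pages.items():
    for term in text.split():
        index[term][page_id] += 1

def rank(query: str) -> list:
    scores = defaultdict(int)
    for term in query.lower().split():
        for page_id, freq in index.get(term, {}).items():
            scores[page_id] += freq  # one toy ranking factor
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

print(rank("best restaurants in rome"))
# [('page_a', 6), ('page_c', 2), ('page_b', 1)]
```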

It is in this scenario that SearchGPT’s new approach fits. Instead of using a deterministic algorithm, SearchGPT could leverage its own proprietary spider to scan the web and feed the information into a so-called vector database. As Giuseppe describes it, this database is a geometric representation of information, where text is transformed into numerical vectors that populate a geometric space, a kind of graph with thousands of dimensions. When a user formulates a query, the search engine extracts the pieces of text that are geometrically and semantically close to the query, offering answers based on relevance and semantic similarity rather than predefined rules.

So there is no deterministic algorithm in which the engine itself sets the rules (and then changes them): everything is based on the relevance and similarity between the question we asked and the content sitting in this giant cloud of points.
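
A minimal sketch of that retrieval mechanism, under the assumptions Giuseppe describes: text is mapped to numerical vectors, and results are ordered purely by geometric closeness to the query vector, with no hand-written ranking rules. The embed() function below is a trivial bag-of-words stand-in; a real system would use a learned embedding model with thousands of dimensions.

```python
# Minimal sketch of vector-based retrieval: documents and queries become
# vectors, and ranking is nothing but similarity in that space.
# embed() is a toy stand-in; real embeddings capture meaning, so that
# "car" and "automobile" end up close even without shared words.

import math
from collections import Counter

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

documents = [
    "An Appalachian Summer Festival runs in Boone every summer.",
    "Rome has thousands of restaurants serving traditional cuisine.",
    "Music festivals in North Carolina draw large crowds in August.",
]

query_vec = embed("music festivals in boone north carolina in august")

# No predefined rules: results are ordered purely by similarity to the query.
ranked = sorted(documents, key=lambda d: cosine(embed(d), query_vec), reverse=True)
for doc in ranked:
    print(f"{cosine(embed(doc), query_vec):.2f}  {doc}")
```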

Differences between SearchGPT and existing AI search engines

But OpenAI’s new engine differs not only from classic search engines, but also from more advanced prototypes that already rely on artificial intelligence, such as Google’s AI Overview, Perplexity, and Bing.

Under the surface, in fact, these competitors still use traditional search mechanisms and proprietary algorithms, often limiting themselves to summarizing the top ten results from conventional search engines; Giuseppe calls them meta-engines that rely on external APIs. In contrast, SearchGPT could and should position itself as a solution that responds directly to user queries, selecting the most suitable content based on the written query. This approach bypasses the intermediation of classic search results, focuses solely on relevance, and truly provides a service to users.

If we think about it, in fact, says Giuseppe again, people “want answers to their needs, they don’t want to browse the web because they get annoyed and have little time.”

For its part, Google would have the ability to adopt an AI-based search model like SearchGPT’s, but as mentioned earlier it would have to give up, or at least jeopardize, its current (and well-paying) business model, which is based on advertising, banners and sponsored results. Alphabet, Google’s parent company, has everything to lose, while OpenAI is only now entering the market and has more room to take risks and adopt radical innovations without the same concerns about immediate monetization.

How far away is this search engine from general deployment?

Liguori, however, also points out other crucial aspects that will determine the success or failure of the SearchGPT project.

First, there is the immediate issue affecting content creators and their (eventual) remuneration: if users get the answer to their question right away, without needing to explore other web pages, what incentive will content creators have to write new content and produce new material?

The question is twofold: new content is needed not only to populate the web with up-to-date and relevant information, but also to continue training the artificial intelligences themselves. New remuneration mechanisms will therefore have to be developed to incentivize the production of quality content, preventing the web from becoming an information desert dominated by artificial intelligence, with AIs talking only to each other and recycling (and serving as answers) nothing but old information.

Then there is the time factor to consider: how long will it take for people to start adopting this new conversational system to search for things? How quickly a system like SearchGPT spreads will really depend on the agreements and integrations that OpenAI is able to make: if, for example, it were to be integrated with Siri on Apple devices, the mass of users would quickly become accustomed to this new way of searching, reducing their use of Google.

But there is also a question about the quality of the answers SearchGPT offers, which is closely related to the data sources used to train the model: users will use and appreciate the new system if (and only if) it actually turns out to be better than Google. Beyond the far-from-trivial error made in the presentation, one must also check whether the sources contain errors, outdated information, or algorithmic biases of a discriminatory nature, which the generative model is likely to perpetuate. This issue is further amplified in contexts where accuracy of information is vital, such as the medical and legal fields and YMYL topics in general.

In light of these considerations, then, and contrary to what many people online are writing, for Giuseppe Liguori “SEO is not dead, nor is it destined to die”: as long as there is an engine behind the scenes, there will always be a need to compete and optimize pages, trying to do something better than competitors in order to be chosen as a result.

With a system based on vector databases, SEO professionals will need to develop new strategies to create content that responds precisely to user queries, as hypothesized and described by the aforementioned “SEO for AI” by Ivano di Biasi.

SEO is not dead (and neither is Google)!

“How long until it replaces even Google for search?” wondered Ivano himself in the opening of his new book, and the answer seems to be “less and less!”

Yet Ivano himself is keen to reassure everyone about the future of SEO, which will certainly be revolutionized by the introduction of artificial intelligence but is not destined to disappear, at least not in the short term.

Just start with a factual consideration: at the moment, Google has over 3.5 billion users worldwide, while ChatGPT stops at around 300 million. To think that everyone will suddenly abandon Google and switch to SearchGPT is foolish and short-sighted, which is why we also need to keep doing traditional SEO in the meantime!

Moreover, SearchGPT and other AI search engines are still experimental: they need to be tested and refined, and entirely new business models have yet to be established. OpenAI has already indicated that it plans to monetize content and allow advertisements within its search engine, but there is a big difference between creating a new competitor to Google and actually knocking it off its throne.

Besides, Ivano also believes that SEO will continue to exist and evolve because competition for online visibility is a timeless phenomenon. As long as there are companies and content creators who want to stand out and compete, there will always be a need for strategies to optimize visibility in search engines. Regardless of the name or technical specifications, an optimization strategy will always be required to ensure that content reaches the desired audience.

Indeed, this news should give us a note of optimism rather than negativity: something is finally changing in the search monopoly! Although the impact may be limited at first, there is now a new competitor that may begin to take traffic away from Google, which, as we have seen, is far from infallible or perfect, and has lately been almost static, always placing the same sites at the top.

What the SEO of the future will look like

Rather than feeling sorry for ourselves, it’s time to roll up our sleeves and take up the challenge, continuing to innovate and study the best strategies to stay relevant on both Google and the new search engines powered by artificial intelligence.

What is needed is the creation of an SEO framework for AI, as anticipated in Ivano’s book: a crucial step in ensuring content visibility in an increasingly technologically advanced future.

His decision to anticipate an AI-specific SEO is not accidental, but a natural evolutionary step for those who, like SEOZoom, have been working in the field of search for years: this new era requires a deep understanding of the role of AI in search, and SEOZoom has committed itself to anticipating and adapting to these future needs through constant research and innovation in SEO.

For example, in the book Ivano performed practical tests and experiments by installing a search engine locally and manipulating the results to understand how to rank certain pages over others, demonstrating that it is possible to influence results in an AI context. However, this is just the beginning: the coming AI-based search engines will require new optimization strategies and techniques, and SEO will have to adapt to this new reality.

How SEO changes with AI

For Ivano, one of the main differences will concern the technical structure of websites, whose importance will be significantly reduced. In this new context, content will truly become the focus of SEO: AI-based search engines will place more weight on copywriting and the internal organization of content. AI algorithms will be able to understand the relationships between documents and place them in a complex vector space.


In this vector space, texts are transformed into numerical vectors that populate a multidimensional graph: when a user asks the search engine a question, the AI examines this vector space to find the most relevant content. It is a bit like fishing for information in a vast virtual space: having more in-depth content on a specific topic increases the likelihood that the answers will be exactly what is being sought. This approach makes the production of rich and detailed content essential, since occupying more of the vector space means more opportunities to be selected by AI algorithms.
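
A hedged sketch of why deeper coverage helps under this retrieval model: a detailed article breaks into more chunks, so more of its vectors can land near a given question. The sentence-level chunking, word-overlap score and threshold below are toy stand-ins for real embeddings and nearest-neighbour search, chosen only to make the idea tangible.

```python
# Toy illustration of "occupying more vector space": the article with more
# in-depth, on-topic chunks has more pieces that land close to the query.

def chunks(text: str) -> list:
    return [s.strip() for s in text.split(".") if s.strip()]

def overlap(a: str, b: str) -> float:
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

thin_article = "Boone hosts festivals. Visit in summer."
deep_article = (
    "Boone hosts several music festivals. An Appalachian Summer Festival features "
    "concerts and exhibitions. Many events in Boone take place in July and August. "
    "Visitors can also find folk music festivals across North Carolina."
)

query = "music festivals in Boone in August"
threshold = 0.15  # arbitrary cutoff, purely illustrative

for name, article in [("thin", thin_article), ("deep", deep_article)]:
    close = [c for c in chunks(article) if overlap(c, query) >= threshold]
    print(f"{name}: {len(close)} of {len(chunks(article))} chunks near the query")
```

Under these toy assumptions, the thin article contributes one chunk near the query while the deep one contributes three, which is the intuition behind producing rich, detailed content.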

Competition and the need to invent strategies to gain visibility will continue to drive innovation in the industry: with AI becoming more central, SEO professionals will need to adapt and leverage new technologies to create high-quality content optimized for the new search ecosystem.

Supporting tools are already here: for example, SEOZoom’s Question Explorer redefines the approach to keyword research, shifting the focus from traditional keywords to real user questions. This makes it possible to create targeted content that responds directly to users’ needs, thereby improving the odds of ranking, both now and even more so with SearchGPT and similar engines.

Using this tool, the other SEOZoom tools, and those yet to be developed not only allows us to stay competitive today, but also prepares us to optimize our content so that it is selected as a relevant answer by artificial intelligence, ensuring a high level of optimization in the SEO of the future.

Ultimately, we must strive to stay current and curious (“stay hungry, stay foolish,” to use a phrase as iconic as it is obvious) and continue to produce content and make it relevant in the ever-changing landscape of SEO.
