In 2024–26 the emergence of large language models and answer engines triggered headlines proclaiming the “death of SEO.” Generative AI systems such as ChatGPT, Google’s Search Generative Experience (SGE) and Perplexity can answer questions without sending users to the open web. Fear‑driven narratives assume that search demand has vanished and that optimising content is impossible because the algorithms are opaque. In reality, generative systems sit on top of the same information infrastructure that powers conventional search. They retrieve documents from the index, interpret those documents and then generate an answer. Understanding that pipeline is key to dispelling myths about SEO’s future.
Myth #1 – “AI Will Kill SEO”
Why search demand hasn’t disappeared
Publications claiming that generative search “kills” SEO often cite dramatic traffic drops. A large Graphite/Similarweb study of 40 000 major websites found organic search traffic declined just 2.5 % year‑over‑year when comparing February 2024–December 2024 with January 2025–November 2025. The same study observed that in August 2025 ChatGPT recorded 5.8 billion visits while Google saw 83.8 billion visits – a ratio of roughly 1 : 14; Google processes about 14 billion searches per day versus ChatGPT’s 66 million search‑like prompts. Search‑engine traffic overall grew 0.4 % in 2025 and Google’s traffic increased 0.8 % compared with 2024. Google still holds over 90 % global market share and saw only a 0.3 % decline in deduplicated visitor count. In other words, demand for search remains enormous.
Tony Wright’s 2025 essay, The Great SEO Panic, noted that AI‑powered search drives less than 1 % of web traffic and that people still click on search results because generative answers often prompt them to verify facts or transact. The ALM analysis also found that AI Overviews reduce click‑through rates by roughly 35 %, yet they appear in only about 30 % of queries and mostly for informational keywords. High‑intent commercial searches remain largely unaffected. These figures show that search traffic has not collapsed.
AI still needs sources, structure and signals
Generative systems do not conjure answers from thin air – they retrieve information from indexed web pages. GeoReport’s 2025 guide explains that Google’s SGE continues to rely on link graphs, crawlability and contextual relationships between domains, and OpenAI’s retrieval models (powering ChatGPT and Perplexity) still retrieve from index‑based link networks before generating responses. If a page is not crawled or indexed, the AI cannot use it.
Search didn’t die with mobile, voice or featured snippets – it adapted
Past “SEO is dead” cycles illustrate how search evolves. Search Engine Land notes that hype around social signals (Google+), mobile‑first indexing, voice search and Core Web Vitals created panic, yet these changes merely shifted best practices. Voice search never replaced typed queries, and mobile‑first indexing rolled out gradually. Similarly, featured snippets and rich results reduced some clicks but improved user experience. Each change required adaptation, not abandonment.
Reality: SEO is the input layer for AI
Generative search sits atop the classic SEO stack. LinkGraph’s 2026 guide emphasises that the retrieval phase of AI Overviews uses Google’s existing indexing and ranking infrastructure; being within the top 10–20 search results is a prerequisite for citation, although not a guarantee. GeoReport explains that the base layer (SEO) still manages the link graph and crawlability; the middle layer (GEO) forms knowledge‑graph connections; and the top layer (AEO) is where generative AI decides which entities to cite. If your content is not discoverable, AI cannot retrieve it. Thus, SEO remains the foundational input layer for AI answers.
Myth #2 – “You Can’t Optimise for AI Because It’s a Black Box”
Confusing opacity with being unoptimisable
Generative models are opaque, but observable patterns still exist. A Botify/DemandSphere study summarised by Search Engine Journal found that AI Overviews appear in roughly 47 % of Google searches and that approximately 75 % of their citations come from pages already ranking within the top 12 organic results. Ahrefs’ analysis of over one million SERPs reported that AI Overviews appear in 30 % of queries, mostly informational, and that the click‑through drop is modest. These findings show that strong organic rankings and clearly citable content correlate with AI visibility. Optimisation has always required inference; practitioners have worked with partial visibility since the era of PageRank.
What we can observe: citations, phrasing and repeated sources
When generative answers include footnote links, they reveal which pages are being retrieved. The Botify/DemandSphere study also showed that pages not crawled by search engines are never cited, even when they contain relevant information. Observing which sources are cited, how answers are phrased and which domains recur allows practitioners to reverse‑engineer patterns.
Reality: AI has preferences – and they’re learnable
Retrieval‑augmented generation (RAG) systems rank candidate documents based on relevance, authority, recency and structural quality. Frase’s 2025 guide notes that AI Overviews favour content that already ranks well and exhibits strong E‑E‑A‑T signals and structured data markup. It further explains that RAG systems prefer content that is semantically clear, well‑structured and factually dense. Research from Princeton University shows that once a model perceives a source as reliable, it exhibits source‑preference bias, repeatedly returning to the same source.
In practice this means:
- Clear structure beats vague prose: using headings, lists and schema markup helps AI extract facts.
- Factual, well‑sourced content beats opinion fluff: citations, evidence and author credentials feed E‑E‑A‑T guidelines.
- Trusted domains and entities are reused: building topical authority and consistent entity descriptions increases the chances of repeat citations.
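The retrieval preferences described above can be sketched as a toy scoring function. This is purely illustrative: the four signal families come from the discussion above, but the weights, field names and URLs below are assumptions for demonstration, not any engine’s real formula.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    url: str
    relevance: float   # query–document similarity, 0–1
    authority: float   # domain/entity trust signals, 0–1
    freshness: float   # recency of the content, 0–1
    structure: float   # headings, lists, schema density, 0–1

# Illustrative weights -- real systems learn these and do not publish them.
WEIGHTS = {"relevance": 0.4, "authority": 0.3, "freshness": 0.1, "structure": 0.2}

def score(c: Candidate) -> float:
    """Weighted blend of the four signal families named in the text."""
    return (WEIGHTS["relevance"] * c.relevance
            + WEIGHTS["authority"] * c.authority
            + WEIGHTS["freshness"] * c.freshness
            + WEIGHTS["structure"] * c.structure)

candidates = [
    Candidate("https://example.com/deep-guide", 0.9, 0.6, 0.8, 0.9),
    Candidate("https://example.com/thin-post", 0.9, 0.8, 0.3, 0.2),
]

# The structured, fresh page wins despite lower raw domain authority.
best = max(candidates, key=score)
print(best.url)
```

Even in this crude model, the page with clear structure and fresh facts outscores a higher‑authority but thinner page, which mirrors the pattern the studies above report.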
Myth #3 – “Traditional SEO Signals No Longer Matter”
Why this is provably false
Some commentators argue that generative models bypass links and page‑level signals. In reality, the infrastructure remains essential. GeoReport emphasises that SGE still relies on link graphs, crawlability and contextual relationships between domains. Search Engine Land points out that AI search builds on retrieval‑augmented generation; when an LLM needs current web content, it pulls information from search results (often Google or Bing). OpenAI’s documentation confirms that ChatGPT Search historically retrieved from Bing’s index, and there is evidence it now leverages Google results. Without being indexed and ranked, your content cannot be retrieved.
The SEO services market also contradicts the idea that signals no longer matter. The Business Research Company reports that the global SEO services market grew from US$79.45 billion in 2024 to US$92.74 billion in 2025 and is forecast to reach US$173.89 billion by 2029. Demand for SEO is increasing, not declining.
Reality: SEO signals now have two audiences
Traditional signals still help humans navigate pages, but they also enable machines to extract information. Structured hierarchies, internal linking, page speed and clean code help crawlers index content and help AI systems interpret it. Optimising for AI does not replace SEO; it adds a new audience – machines that transform pages into answers.
Myth #4 – “Ranking #1 Doesn’t Matter Anymore”
Ranking alone isn’t enough – but it still helps
In generative search the winner is not always the top‑ranking page, but ranking strongly remains vital. The Botify/DemandSphere study found that 75 % of AI Overview citations come from pages ranked within the top 12 organic results. LinkGraph’s guide notes that being among the top 10–20 search results is usually required for citation, though not sufficient. Meanwhile, SGE research shows that AI Overviews appear primarily for informational queries, while high‑intent commercial searches remain largely unaffected. Ranking highly therefore increases the probability of retrieval.
Reality: Ranking is a prerequisite, not a guarantee
Ranking provides the candidate set for retrieval, but AI will only cite pages that offer extractable facts and clear answers. Pages laden with jargon, thin content or confusing structures may rank but fail to be cited. Effective strategy therefore combines traditional ranking factors (backlinks, on‑page optimisation, relevance) with answer‑engine optimisation – ensuring that content is precise, structured and accessible to machines. A page outside the top results will rarely be cited; a page within them can still be overlooked if it fails to provide clear information.
Myth #5 – “Only Big Brands Can Win in AI Search”
Why niche expertise often beats generic authority
Critics claim that generative answers will favour famous brands. Tony Wright observes that small, niche websites continue to rank for local and specialised queries; the myth that only big brands can win is “nonsense”. GeoReport provides a case study showing that when users ask for the best running shoes under $150, chat‑based engines often recommend niche brands like Allbirds, Hoka and On Running, even though traditional search results still list giants like Nike and Adidas. These challenger brands succeed because their product pages use clean schema, answer real user questions and adopt an educational tone.
Reality: AI prefers clear explanations, not just famous logos
Generative engines evaluate semantic clarity, factual reliability and entity consistency. GeoReport explains that AI does not care about a website’s historic backlink profile as much as whether its brand, author and topic appear consistently in factual, verifiable contexts. Smaller brands can outperform large ones by being the best explainer in a narrow space. Structured data, transparent authorship and topical depth give machines confidence to cite a source. This is an opportunity rather than a threat: new entrants can build authority through clarity rather than sheer scale.
What Has Actually Changed
The rise of generative engines is not the end of SEO but an expansion of it. Key shifts include:
- From keyword matching to intent and entity understanding: LLMs interpret user intent and map queries to topics and entities rather than exact strings. Optimisation now involves aligning content with search intent, using descriptive headings and including relevant entities.
- From ranking pages to selecting sources: Classic SEO prioritises ranking; AI search selects and synthesises sources. Relevance, authority and structure remain important, but the goal is to be selected and cited rather than merely clicked.
- From traffic‑only metrics to influence and visibility metrics: Marketers must track AI citations, tone of mention, factual accuracy and presence across models. In 2025, 67 % of organisations worldwide adopted LLMs and 63 % of marketers prioritised generative search optimisation, signalling that GEO metrics are becoming mainstream.
What Optimisation Looks Like Now
To thrive in the AI era, practitioners should augment classic SEO with generative‑engine optimisation:
- Create answer‑first content: Lead with succinct, fact‑filled answers before exploring nuance. Use clear headings, lists and tables to make information extractable.
- Strengthen internal linking and topical depth: Build content clusters around entities and link related pages to demonstrate expertise and help machines navigate context.
- Enhance entity clarity and schema: Use schema.org markup, consistent naming and author bylines to help AI map facts to entities.
- Prioritise freshness, accuracy and trust signals: Keep information up‑to‑date and support claims with credible sources. E‑E‑A‑T (experience, expertise, authoritativeness, trustworthiness) matters both for human trust and machine interpretation.
- Monitor AI visibility across engines: Track how your brand appears in ChatGPT, Claude, Gemini and Perplexity. PR News advises treating GEO as a core discovery channel and monitoring visibility across multiple engines.
- Align link‑building with AI comprehension: Backlinks still provide discovery, but their value depends on whether the destination content is machine‑legible. Focus on acquiring links from relevant contexts and ensure the linked page is structured for AI.
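As one concrete example of the schema and entity‑clarity advice above, structured data is typically emitted as schema.org JSON‑LD embedded in the page. The sketch below builds a minimal Article object; every name, date and URL is a placeholder, and the property set is a small subset of what the Article type supports.

```python
import json

# Minimal schema.org Article markup; all names and URLs are placeholder values.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Best Running Shoes Under $150",
    "datePublished": "2025-06-01",
    "dateModified": "2025-11-15",
    "author": {
        "@type": "Person",
        "name": "Jane Doe",  # keep the byline consistent across the site
        "url": "https://example.com/authors/jane-doe",
    },
    "publisher": {
        "@type": "Organization",
        "name": "Example Running Co",
        "url": "https://example.com",
    },
}

# Embedded in the page head as a JSON-LD script tag.
prefix = '<script type="application/ld+json">'
snippet = prefix + json.dumps(article_jsonld) + "</script>"
print(snippet[:60])
```

Consistent author and publisher entities across every page are what let machines map facts to a stable identity; the markup itself is only useful if those values match the visible content.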
Why “Doing Nothing” Is the Riskiest Strategy
It is tempting to dismiss AI search as hype and continue business as usual, but passivity carries risks:
- Competitors will shape AI answers: If rivals provide structured, authoritative content while you rely on legacy SEO, AI engines will learn from them. Losing citation status cedes narrative control without an immediate traffic drop.
- Traffic decline can lag behind influence decline: Search traffic may appear stable even as AI engines stop citing your brand. By the time clicks fall, reputational damage may already be done.
- Visibility gaps compound over time: Princeton research on source‑preference bias shows that once an AI model deems a source reliable, it reuses it. Early adopters can lock in visibility, while laggards struggle to catch up.
The Right Mental Model Going Forward
SEO is not dying – it is expanding. Generative Engine Optimisation (GEO) and Answer Engine Optimisation (AEO) sit atop classic SEO foundations, as in the layered model described above. Optimisation did not disappear; it gained new surfaces. Think of SEO as ensuring your content is discoverable by search engines. GEO ensures your information is interpretable through structured data and entity clarity. AEO ensures your expertise is visible in AI answers. These layers reinforce one another; ignoring any one weakens the rest.
Conclusion
AI has not killed SEO – it has exposed weak SEO. Evidence shows that search demand remains high, that AI systems still depend on crawled and ranked pages, and that optimisation remains both possible and measurable. The winners will not be those who panic, but those who adapt early: publishing clear, factual content; structuring it for machine understanding; and measuring visibility across both search and generative engines. The discipline of optimisation endures – it simply evolved from being solely about ranking webpages to also being about influencing the answers that machines provide.