Journalism Enters a Post-Search Era Driven by Generative AI: VML
By Diego Valverde | Journalist & Industry Analyst
Wed, 02/04/2026 - 11:30
The integration of Generative AI and the algorithmic updates of late 2025 has shifted the traditional search model toward an ecosystem of information synthesis, forcing the media to restructure its model of visibility, authority, and journalistic production in the face of a global crisis of truthfulness.
According to VML’s recent report The Future 100: 2026, for 71% of the population AI is making it impossible to discern what is true in the world. Truth has become a threatened concept that demands a new truth literacy and the technical traceability of information.
The media ecosystem is undergoing a structural transformation driven by the convergence of Generative AI, the saturation of synthetic content, and an increasingly strict regulatory framework. In Mexico, this scenario is manifested through a transition from launch journalism to regulatory and technical journalism. The need to explain privacy, competition, and consumer rights regulations has raised the standard of analysis required in newsrooms.
The relevance of this change lies in the erosion of public trust. According to UNESCO data, journalists face a constant flow of misinformation powered by AI tools capable of generating hyper-realistic deepfakes and “information loops,” as in the case detected in 2024 against the France 24 network, in which a journalist was impersonated. This vulnerability is exacerbated by economic pressures: falling advertising revenues and the dominance of digital platforms have reduced editorial teams, facilitating the spread of unverified narratives.
"AI can process massive volumes of data quickly, but trust depends on clear standards and human oversight. Without this, credibility, a fundamental pillar of democracy, is at risk," warns a study by UNESCO on the future of journalism.
From SEO to GEO
The disruption of AI has redefined the rules of digital visibility. The traditional model of Search Engine Optimization (SEO), focused on positioning links in a list, is being replaced by Generative Engine Optimization (GEO), a set of techniques designed to improve the likelihood that content will be referenced in generative search engines. This discipline seeks to ensure that a media outlet's content is selected, cited, and synthesized by assistants such as ChatGPT, Gemini, or Perplexity.
Data from February 2025 indicates that 41% of the general population in markets such as Spain uses ChatGPT weekly, a figure that rises to 61% in the 18-24 age group. This change in habit reduces direct organic traffic and click-through rates, as generative engines offer complete answers without the user having to visit the original source.
According to Telefónica, an efficient GEO strategy for editorial teams consists of the following aspects:

- AI-friendly content: Use of precise technical language, verifiable statistics, and citations from high-authority sources.

- Semantic structure: Implementation of structured data, tables, and FAQs that facilitate indexing by intelligent agents.

- Presence in primary sources: AI prioritizes repositories with high data density. Semrush research, for example, shows that Reddit (40.1%) and Wikipedia (26.3%) are the sources most cited by language models, thanks to their collaborative formats involving millions of users, followed by patent repositories and digital libraries such as Scribd.
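The "semantic structure" point above typically translates into schema.org JSON-LD markup embedded in each article page. As a minimal sketch, the following Python helper (illustrative only, not taken from any particular CMS) assembles the NewsArticle and FAQPage blocks that generative engines can parse; all field names come from the public schema.org vocabulary.

```python
import json

def build_structured_data(headline, author, date_published, faqs):
    """Build schema.org JSON-LD blocks for an article plus an FAQ section.

    `faqs` is a list of (question, answer) string pairs. The helper is a
    hypothetical sketch; field names follow the schema.org vocabulary.
    """
    # Core article metadata as a NewsArticle object.
    article = {
        "@context": "https://schema.org",
        "@type": "NewsArticle",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
    }
    # Each FAQ becomes a Question with an acceptedAnswer, so intelligent
    # agents can lift question/answer pairs directly from the markup.
    faq = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in faqs
        ],
    }
    # In a real page, each block would sit inside its own
    # <script type="application/ld+json"> tag in the document head.
    return json.dumps(article, indent=2), json.dumps(faq, indent=2)

article_json, faq_json = build_structured_data(
    headline="Journalism Enters a Post-Search Era",
    author="Diego Valverde",
    date_published="2026-02-04",
    faqs=[("What is GEO?", "Optimization aimed at generative search engines.")],
)
print(article_json)
```

The design choice here mirrors the Telefónica recommendation: machine-readable structure lives alongside the prose rather than replacing it, so the same article serves both human readers and generative engines.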
The Convergence of the Digital and the Physical
Google ended 2025 with its most volatile core update to date. The fundamental change lies in how the algorithm evaluates the "Experience" signal within the E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) framework.
Following the rollout from Dec. 11 to 29, 15% of the pages that ranked in the Top 10 disappeared from the first 100 results. The relevance metric no longer rewards the mere accumulation of keywords, but rather the demonstration of real interaction with the topic. For example, in the analysis of products such as the Samsung S25 or the iPhone 8, Google now prioritizes content that details field tests, specific limitations, and unexpected findings over generic technical summaries.
Faced with the accelerated digitization of users and the automation of search engines, the industry has identified a strategic refuge: direct observation journalism. El País argues that "what AI cannot do is be out on the street, observe what is happening, talk to people, or describe an atmosphere."
The ability to obtain data that does not exist in Google's index or in the training datasets of these language models is the ultimate competitive advantage. While AI collects and formats existing information, journalists generate new information. This distinction can be critical to the survival of the media, as Google has improved its ability to detect and penalize mass-produced AI content with minimal human intervention.
In addition to these two key factors, the sustainability of newsrooms now depends largely on the creation of high-quality evergreen content. This type of content, which focuses on maintaining its relevance over time, responds to timeless and structural search intentions, and is best interpreted and summarized by AI-based search engines, according to IEBS Business School.
For an article to be considered a reference by AI, according to IEBS, it must have thematic depth that answers any question related to a specific topic, natural optimization through real questions and conversational language, and substantive updates whenever it is revised.
The future of journalism will be defined by "forced adaptation." The industry must transition to a model where AI acts as a support tool for low-density tasks (transcription, trend analysis, and summaries) while human capital focuses on research and verification.
The main challenges include:

- Erosion of the business model: The reduction in clicks due to direct AI responses (zero-click searches) has a direct impact on programmatic advertising revenue.

- Language sovereignty: Linguistic turns of phrase typical of language models are infiltrating journalism, diluting editorial identity and stylistic uniqueness, reports the Luca de Tena Foundation.

- Regulation and ethics: In Mexico, the evolution of regulations on biometric data and AI forces the media to act as translators of complex issues for the public.
Editorial teams are being urged to adopt standards such as those of the Coalition for Content Provenance and Authenticity (C2PA) to certify the human or assisted origin of each piece. They can also invest in specialization, as generalist journalists are the most vulnerable to automation; technical, legal, and economic specialists are the ones who provide value that AI cannot yet synthesize without primary sources. Finally, the media must educate their audiences in the analysis of visual and textual narratives to distinguish hyperreality from verified reality.