Posts tagged "Google Search & SEO"


Google is scrapping FAQ rich results: what it means for your website

May 15, 2026 Posted by Sean Walsh

Google confirmed last week that it will no longer support FAQ rich results in search. The expandable question-and-answer panels that appeared directly beneath certain search listings are being removed, along with the Search Console features that allowed webmasters to monitor their performance. For any business that invested time in implementing the FAQ schema, this is worth understanding clearly: what is changing, what it means in practice, and whether any of that investment needs to be redirected.

The short answer is that this change is less damaging than it might initially sound. But there are some useful lessons in it about how to think about structured data more broadly as Google continues to adjust what it surfaces and how.

What FAQ rich results actually were

FAQ rich results were the expandable accordions that appeared below a search listing, showing individual question-and-answer pairs pulled directly from a page’s structured data. When implemented correctly using FAQ schema markup, they could significantly expand a search listing’s visual footprint on the page, making it more prominent without requiring a higher ranking.

Google started restricting FAQ rich results in 2023, limiting them to government and health websites. The full removal confirmed last week completes that process. The associated Search Console report, which showed impressions and clicks from FAQ rich results, will be deprecated alongside the feature itself.

How much does this actually matter?

For most businesses, the honest answer is: less than it might appear. By 2023, Google had already restricted FAQ rich results to a narrow category of sites, which means the vast majority of businesses had not been benefiting from them in search results for some time. The visual expansion of a listing was a genuine competitive advantage when the feature was widely available, but that window closed a few years ago.

What is changing now is the formal retirement of a feature that was already largely inactive for commercial websites. The Search Console report being removed is a minor practical inconvenience if you were still tracking FAQ impressions, but the disappearance of the report does not reflect a loss of current traffic or visibility for most sites.

Should you remove your FAQ schema?

Not necessarily, and in many cases the answer is no. This is an important distinction. FAQ schema no longer produces rich result panels in standard Google Search, but structured data serves multiple purposes beyond generating visual enhancements in the search results page.

As we covered recently when looking at the tools worth using for AEO, structured data is one of the cleaner signals available for AI retrieval. FAQ schema specifically provides AI systems with a machine-readable layer of question-and-answer content that can inform how those systems cite and use your content in AI-generated answers, entirely separate from whether Google renders it as a rich result in traditional search.

The argument for keeping well-implemented FAQ schema in place is straightforward. It costs nothing to maintain, it provides clarity to search engines and AI systems about the purpose and structure of your content, and removing it does not guarantee any benefit. Unless your FAQ schema is technically broken or generating errors in Search Console, the case for removing it is weak.
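For readers unfamiliar with what that markup actually looks like, here is a minimal sketch of FAQPage JSON-LD built in Python. The question and answer strings are invented placeholders, not content from any real site.

```python
import json

# Build a minimal FAQPage JSON-LD object (schema.org vocabulary).
# The question/answer text below is a placeholder example.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "How long does treatment take?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Most treatments take between 30 and 60 minutes.",
            },
        }
    ],
}

# Serialise for embedding in a <script type="application/ld+json"> tag.
json_ld = json.dumps(faq_schema, indent=2)
print(json_ld)
```

The same machine-readable layer remains available to AI retrieval systems whether or not Google renders it as a rich result.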

What to redirect your structured data efforts towards

If this news prompts a review of how structured data is implemented on your site, that is a worthwhile exercise. The specific types of schema that continue to produce rich results in Google Search and that are increasingly relevant for AI retrieval are worth prioritising. The most consistently valuable include:

  • Review schema. Star ratings in search results remain one of the highest-impact visual enhancements available and are directly relevant to businesses in sectors such as healthcare, dental, aesthetics and professional services where social proof influences decisions.
  • Local business schema. For businesses with physical locations, accurate and complete local business schema reinforces the information shown in Google Business Profiles and supports consistent citation across AI search systems. This is particularly important for multi-location businesses and sectors where local search intent is high.
  • Product schema. For e-commerce businesses, product schema supporting price, availability and review information continues to produce rich results in both standard search and Shopping. This remains one of the most directly commercial schema types available.
  • Article and breadcrumb schema. These support how content pages are understood and indexed, contributing to cleaner crawling and more accurate representation in search results, and remain relevant for content-heavy sites.
  • How-to and event schema. Both continue to produce rich results for relevant content types and are worth implementing where the content justifies it.
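To make the list above concrete, here is a sketch of local business JSON-LD with review data attached, in the same style as the FAQ example. The business name, address, phone number and rating figures are all invented placeholders.

```python
import json

# Minimal LocalBusiness JSON-LD with an aggregate rating attached.
# All names, addresses and figures are invented placeholders.
local_business = {
    "@context": "https://schema.org",
    "@type": "Dentist",  # a LocalBusiness subtype from schema.org
    "name": "Example Dental Practice",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Leeds",
        "postalCode": "LS1 1AA",
        "addressCountry": "GB",
    },
    "telephone": "+44 113 000 0000",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "124",
    },
}

print(json.dumps(local_business, indent=2))
```

Using a specific subtype such as Dentist rather than the generic LocalBusiness type gives search engines and AI systems a more precise signal about what the business actually does.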

The broader pattern worth noting

The removal of FAQ rich results is part of a longer pattern. Google has been progressively reducing the variety of rich result types it surfaces in standard search as AI Overviews and AI Mode take up more of the results page. The shift towards AI-generated answers changes what it means to be visible in search. The visual real estate previously occupied by FAQ panels, knowledge panels and similar features is increasingly being consumed by AI-generated content instead.

This does not mean structured data is becoming less important. If anything, the opposite is true. As AI systems take a more active role in assembling answers from multiple sources, the clarity and accuracy of the signals you provide about your content become more valuable, not less. The mechanism by which those signals produce a visible result in search is changing. The underlying importance of giving search engines and AI systems well-structured, accurate, machine-readable information is not.

What to do now

A few practical steps are worth taking in light of this change.

  • Check your Search Console for any FAQ rich result errors or warnings. These will stop being reported when the feature is deprecated, but addressing any existing errors is good practice before the report disappears.
  • Audit your current structured data implementation more broadly. If FAQ schema was the only schema type in use on your site, now is a good time to review whether review, local business or product schema should be added where applicable.
  • Do not remove FAQ schema solely because of this change. If your FAQ schema is well-implemented and error-free, leave it in place. Its value in an AI retrieval context is not affected by Google’s decision to stop rendering it as a rich result.
  • Focus new structured data investment on the types that continue to produce visible results: review, local business, product and event schema, depending on what is relevant to your business.

The disappearance of FAQ rich results is worth knowing about, but it is not a reason to panic or make sweeping changes. It is a reminder that search is a changing environment and that the value of any single technical feature is always temporary. The underlying principle of making your content clear, accurate and well-structured remains constant regardless of which specific features Google chooses to support at any given time.


New Citation Opportunities in AI Overviews

May 8, 2026 Posted by Matthew Widdop

Google has announced that citations within AI Overviews and AI Mode will have new formats and displays intended to boost click-through rate, making AEO even more crucial for businesses and their digital presence. In this article we'll go through the five different ways Google has announced it will display links in 2026.

Further Exploration

Google has introduced a "further exploration" section to AI Overviews. It appears at the bottom of AI responses and encourages further reading on websites that are topical authorities on a specific subject. This is one of the first indications of AI Overviews being used to genuinely platform links rather than surface them subtly, as in previous iterations. Getting into these further exploration links could be crucial for businesses: they are the most clearly displayed links in AI Overviews to date, and if they appear often, businesses will be pushing to ramp up their AEO.

Explore New Angles

Subscription Links

Subscription linking allows readers who pay for your content to link their subscriptions to their Google account. Those subscriptions then appear in AI Overviews, as seen below, so people can see whether information has come from one of their trusted sources, improving the likelihood of a click-through. This will mainly benefit news publications that use subscription models to control access to their content.

Subscription Links

Community Advice

A lot of people searching online, especially with question-based searches, want advice from others who have shared similar experiences. Appending "reddit" to a search is a common tactic users employ to find exactly that.

Now AI Overviews will include previews of public perspectives and online discussion from communities for certain question-based searches. These new forms of links will help community-based sites and forums earn more links in AI Overviews.

Get Advice From People Who Have Been There

Expanded Links within Content

AI Overviews already include links throughout their content so you can dig further into sources as you're reading. The latest update keeps links exactly where you need them, but makes them more prevalent in AI searches.

Website Hover

When you hover over an inline link, the website the information is being pulled from will now appear with its icon, page title and website name attached. This feature builds on existing linking behaviour to make websites more obvious in AI Overviews and improve CTR.

Get More Context On Linked Websites

A lot of these new features in AI Overviews and AI Mode should improve click-through rate to websites, which means being seen and recognised as a topical authority by Google and appearing in AI Overviews will be more important than ever going forward.


Meta vs Google: What the Shift in Digital Advertising Means for Brands

May 8, 2026 Posted by Liam Walsh

For years, Google has dominated the digital advertising world. But according to recent forecasts from EMARKETER, 2026 could mark a major turning point: Meta is expected to surpass Google in digital ad revenue for the first time.

At Intelligency, we see this as more than just an industry headline. It reflects how consumer behaviour, AI, and advertising performance are evolving and why brands need to rethink where they invest their marketing budgets.

Why Meta Is Growing So Quickly

Meta’s ad revenue is forecast to reach $243.46 billion in 2026, slightly ahead of Google’s $239.54 billion.

Platforms like Facebook and Instagram are becoming increasingly effective at helping advertisers reach the right audiences at the right time. Meta’s AI-powered tools, including Advantage+ and automated creative optimisation, are making campaigns easier to manage while improving results. Reels has also become a major driver of engagement and ad growth.

In simple terms, advertisers are seeing stronger returns from Meta’s ecosystem, and they’re increasing spend accordingly.

What This Means for Businesses

This shift doesn’t mean Google advertising is disappearing. Search ads still play a critical role, especially when users are actively looking for products or services.

However, Meta’s rise highlights an important trend: discovery-based advertising is becoming just as valuable as intent-based advertising. Consumers are increasingly discovering brands through social content rather than traditional search alone.

For businesses, that means successful digital marketing strategies can no longer rely on one channel. Brands need a balanced approach that combines search, social, video, and AI-driven targeting.

The Bigger Picture

Another important takeaway is market consolidation. EMARKETER predicts Meta, Google, and Amazon will collectively control over 62% of global digital ad spend in 2026.

For marketers, this creates both opportunity and competition. The platforms are becoming more sophisticated, but standing out requires stronger creativity, smarter targeting, and continuous optimisation.

At Intelligency, we help brands navigate exactly this kind of change, turning industry shifts into practical growth strategies that deliver measurable results.


The AEO tools worth using right now

May 8, 2026 Posted by Sean Walsh

Agentic Engine Optimisation, or AEO, has become one of the more discussed topics in digital marketing in 2026. The coverage has generated a fair amount of heat but not always much light, partly because the tooling landscape is still developing and partly because a lot of what is being written treats AEO as something entirely new when, in practice, it shares most of its foundations with work that good SEO practitioners have been doing for years.

This article covers the tools that are actually useful right now, split into those with a clear proven use case and those worth testing as the category matures. But before getting into the tools, it is worth addressing the terminology question directly, because the shift in how AI systems retrieve and surface content has caused some confusion about what has actually changed and what has not.

AEO and SEO: different names, same foundation

Despite what some of the more excitable coverage suggests, AEO is not a replacement for SEO, and it does not require a completely different strategic approach. The two disciplines share the same core premise: make your content easy to find, easy to understand, and clearly relevant to what someone is trying to know or do.

The difference is in the audience being optimised for. Traditional SEO focuses on helping human searchers find and engage with your content. AEO extends that to cover AI systems that fetch, parse and use your content to formulate answers, often without a human ever clicking through to your site. The signals that matter are largely the same: well-structured pages, clear headings, specific and accurate information, strong authority and credible backlinks. What changes is the layer of intent you apply when thinking about how that content is read and used.

Research consistently shows that AI Overviews and similar systems draw heavily from pages that already rank well organically. Google has been clear for some time that content quality and audience relevance are the primary factors that determine whether a site performs. AEO does not change that message. It adds a layer of consideration: once your content is authoritative and well-structured, is it also legible and extractable for AI systems working within processing constraints?

In practical terms, we treat AEO and SEO as the same discipline. The same improvements that help AI systems cite your pages accurately also make those pages clearer and more useful for human readers. They are not in tension. The tools below reflect that: some are well-established SEO tools that remain highly relevant in an AEO context, and some are newer platforms built specifically to measure AI search visibility.

The tools with a clear, proven use case

1. LLM assistants used with a defined methodology

ChatGPT, Claude and Gemini are themselves useful AEO research tools when used intentionally rather than ad hoc. The most practical applications are competitive landscape research, content gap analysis, prompt testing to understand how AI platforms respond to queries in your category, and entity and topical coverage audits.

Asking an LLM what it knows about your brand, your competitors and your sector, and then interrogating where the gaps and inaccuracies are, is a fast and accessible way to understand your current AI search position. Most businesses have not done this basic audit, and a meaningful first pass takes less than an hour. It is also one of the few AEO-relevant activities that costs nothing beyond the time invested.

2. Google Search Console

Search Console remains one of the most important tools in any AEO workflow, primarily because it provides direct performance data from Google: the platform that produces the AI Overviews appearing above organic results for a significant proportion of searches. Understanding which of your pages are currently being surfaced in AI Overviews, and which are not, gives you a baseline from which to measure the impact of content changes.

It also helps identify the queries where AI Overviews are appearing for keywords you rank for, which is increasingly important as AI-generated answers above the fold reduce click-through rates on the organic listings below them. Knowing where you are losing clicks to AI answers on your own target keywords is the starting point for deciding where to focus content improvement efforts.

3. Google Trends

Google Trends serves a different purpose than Search Console, but it is equally valuable for AEO strategy. Where Search Console tells you how you are performing, Google Trends tells you where demand is heading. It does not give absolute search volume, but it gives relative momentum across topics and queries, which is often more strategically useful when trying to get ahead of emerging patterns rather than simply responding to existing ones.

For AEO specifically, rising query trends can signal emerging answer opportunities you can address before your competitors do. AI systems tend to favour content that is well-established and authoritative on a topic, which means the window for getting in early is narrow. Identifying rising demand trends through Google Trends and creating strong content quickly is one of the more practical ways to build AI citation presence in a new area before it becomes competitive.

4. SE Ranking and SE Visible

SE Ranking is the platform we use day-to-day for client SEO work, and its relevance to AEO has grown considerably over the past 12 months. The AI Overviews Tracker monitors how your keywords are performing within Google’s AI-generated results, including citation frequency, source analysis, and estimated traffic impact from AI Overviews. It also identifies which competitor domains are being cited in AI answers for keywords you are targeting, which is actionable competitive intelligence.

The AI Search Toolkit extends this further by tracking brand mentions and linked citations across AI Overviews, AI Mode, ChatGPT, Gemini and Perplexity. You can monitor how often your domain is cited, whether citations are linked or unlinked, and how this compares to named competitors over time.

SE Visible is a companion product that sits alongside SE Ranking and focuses specifically on brand AI visibility at a strategic level: how your brand is presented, ranked and perceived across AI systems. It provides a Brand Visibility Index that measures performance over time and competitive benchmarking across AI platforms. For agencies managing multiple client accounts, the combination of SE Ranking for tactical execution and SE Visible for strategic oversight is a coherent and cost-effective approach.

5. Semrush

Semrush has expanded its feature set to include AI Overviews tracking and visibility data, making it one of the more complete tools for monitoring how content performs across both traditional search and AI-generated results within a single platform. For teams already using Semrush for keyword research, position tracking and site auditing, the AI visibility layer adds meaningful value without requiring a separate tool or workflow.

The topic clustering and content gap analysis features are particularly relevant for AEO, helping identify where topical coverage is thin relative to what AI systems are pulling from competitors. Thin or fragmented coverage in a topic area is one of the more common reasons a site gets passed over in AI-generated answers in favour of a competitor with more comprehensive, well-organised content on the same subject.

6. Profound

Profound is purpose-built for AI search monitoring. It tracks how platforms including ChatGPT, Perplexity, Google AI Overviews and Claude discover, surface and cite your brand and content. It monitors brand mention frequency and sentiment, competitor share of voice, and the specific prompts that trigger your content to appear in AI-generated answers.

The most useful shift Profound enables is in the metric itself. Rather than asking where you rank in a search result, you can ask: when AI answers a question in your category, are you in the answer? The cross-platform view, covering multiple AI engines simultaneously rather than one in isolation, is its most distinctive feature and makes competitive benchmarking significantly more meaningful than single-platform tracking.

It is not a cheap tool and is better suited to businesses with an existing content and SEO foundation. For agencies managing multiple clients in competitive sectors, the monitoring and benchmarking functionality is particularly valuable.

7. Screaming Frog

Screaming Frog has been a technical SEO staple for years, and its relevance extends directly into AEO. Many of the technical issues that prevent AI agents from correctly parsing and using your content are exactly the issues Screaming Frog identifies: missing or misconfigured structured data, poorly structured heading hierarchies, thin or duplicated page content, and slow server response times.

Running a Screaming Frog audit with a focus on schema markup completeness, heading structure, and page-level content depth is one of the most practical first steps in any AEO improvement programme. The tool now integrates with Google Search Console and PageSpeed Insights, making it straightforward to cross-reference technical findings with actual performance data.

8. Google Rich Results Test and Schema Markup Validator

Structured data is one of the cleaner signals available for AI retrieval. Schema markup for FAQs, services, reviews, products and local business information gives AI systems a reliable, machine-readable layer of data to draw from, independent of how the surrounding content is written or formatted. Getting this right is a relatively contained piece of work that can have a disproportionate impact on how accurately your content is cited.

Both tools are free. The Rich Results Test checks whether your structured data is correctly implemented and eligible for enhanced display in search results. The Schema Markup Validator checks for errors and warnings at a more granular level. For businesses in sectors where FAQ, review or service schema are applicable, a structured data audit is one of the most immediately actionable AEO improvements available.
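Before running pages through either validator, a quick automated pre-check can catch JSON-LD blocks that do not even parse as JSON. A rough sketch using only the Python standard library; the HTML fragment is a made-up example:

```python
import json
import re

def extract_json_ld(html: str) -> list:
    """Return every JSON-LD object found in ld+json script tags.

    Blocks that fail to parse are recorded as None, since they
    would also fail in the Rich Results Test or Schema Validator.
    """
    pattern = re.compile(
        r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        re.DOTALL | re.IGNORECASE,
    )
    blocks = []
    for match in pattern.findall(html):
        try:
            blocks.append(json.loads(match))
        except json.JSONDecodeError:
            blocks.append(None)
    return blocks

# Made-up page fragment for illustration.
html = '''<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product", "name": "Example"}
</script>'''

parsed = extract_json_ld(html)
print(parsed)
```

This does not replace the validators, which check schema.org vocabulary and rich-result eligibility rather than just JSON syntax, but it scales easily across a full crawl.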

How to approach this practically

The AEO tools market has grown faster than the evidence base for what actually works. Many platforms are repackaging existing SEO or content analytics functionality under AEO branding without meaningfully changing what they measure. The most reliable signal for whether a tool is genuinely useful is whether it changes a specific decision you make about your content or your site.

A practical starting sequence looks like this. Use an LLM to audit your current brand position across AI platforms in your category. Use Google Search Console to understand which of your pages are appearing in AI Overviews and where the gaps are. Use Google Trends to identify rising demand patterns worth targeting early. Use Screaming Frog and the schema validation tools to fix any technical issues preventing your content from being correctly parsed. Then use SE Ranking, Semrush or Profound, depending on the depth of monitoring your situation requires, to track how your visibility is changing over time.

Starting with the fundamentals (well-structured content, strong authority signals, accurate structured data and a clear technical foundation) will deliver more impact sooner than any monitoring platform can on its own. The monitoring tools tell you whether the work is making a difference. They are not a substitute for doing the work.


Why your website content needs to work for AI agents, not just humans

May 1, 2026 Posted by Sean Walsh

Search behaviour is changing in a way that has direct consequences for how websites should be built and written. AI-powered tools are increasingly doing the searching on behalf of users. Google’s AI Overviews, ChatGPT search and Perplexity all fetch pages, read the content and formulate a response without the user ever clicking through to your site. For that process to produce an accurate, useful answer that includes your business, your content has to work for a very different kind of reader.

This is not entirely new territory. Google has been signalling for some time that content quality and clear audience focus are what separates sites that perform from those that do not. The shift now is that quality alone is not enough. The structure and economy of that content is becoming just as important as its substance.

Addy Osmani, Director of Engineering at Google Cloud AI, published a detailed framework this month on what he is calling Agentic Engine Optimisation, or AEO. The core argument is that websites need to be structured not just for human readers but for AI agents that fetch, parse and act on page content within strict processing limits. It is a practical discipline, not a theoretical one, and the implications for how pages are written and organised are significant.

What AI agents are actually doing when they visit your site

A human reader landing on a page will scroll, skim headings, jump to the section that interests them and ignore the rest. An AI agent operating within a limited context window has to make rapid decisions about what to read, what to skip and what to use. That context window is essentially a ceiling on how much the system can process in a single pass.

If a page is poorly structured, heavily padded or buries its key information deep in lengthy paragraphs, the AI agent may truncate it, skip sections or draw on incomplete information. The result is inaccurate or partial answers attributed to your business, or no citation at all. Neither outcome is useful.

The framework Osmani outlines treats token count as a practical constraint. Concretely, introductory or quick-start sections should sit at roughly 15,000 tokens or fewer. Longer conceptual content should be proportionate to what it genuinely needs to cover. Pages that pad out their word count with preamble, repetitive summaries and hedging caveats are less reliably processed and cited.
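There is no single way to count tokens without knowing which model will read the page, but a common rule of thumb for English text is roughly four characters per token. A sketch of a budget check built on that assumption; the sample text and the heuristic itself are illustrative, not exact:

```python
# Rough token estimate using the common ~4-characters-per-token
# heuristic for English text; real tokenisers will differ by model.
CHARS_PER_TOKEN = 4
INTRO_TOKEN_BUDGET = 15_000  # the ceiling suggested in Osmani's framework

def estimate_tokens(text: str) -> int:
    return len(text) // CHARS_PER_TOKEN

def within_intro_budget(text: str) -> bool:
    return estimate_tokens(text) <= INTRO_TOKEN_BUDGET

page_intro = "What does dental implant treatment cost? " * 100  # sample text
print(estimate_tokens(page_intro), within_intro_budget(page_intro))
```

A check like this is crude, but it is enough to flag the pages on a site whose opening sections are wildly over budget and worth editing first.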

Why does this change the case for shorter, tighter content

For years, the conventional wisdom in SEO was that longer content ranks better. That was never the full picture, but it did lead to a culture of padding that is now actively counterproductive. In an AI-driven search environment, a well-structured 900-word page will frequently outperform a sprawling 3,000-word equivalent that takes too long to get to the point.

The issue is compounded by the volume of generic, low-quality content that has flooded the web as AI writing tools have become more accessible. Content that follows a template, repeats itself and adds little original value is increasingly easy for both human readers and AI systems to identify and discount. The bar for what counts as genuinely useful content is rising, not falling.

Length should serve clarity. A page that answers a specific question concisely, uses clear headings and organises its information logically will perform better in AI retrieval than one that inflates its word count in the hope of appearing more authoritative.

What this looks like across different sectors

The practical implications vary by sector, but the pattern is consistent. Most website content has been written primarily for human browsers using a format that does not always translate well to AI retrieval.

  • Dental and healthcare practices. Treatment pages often carry a great deal of clinical reassurance and general background before they reach the specific details of cost, process and expected outcomes. That structure works for an anxious patient reading carefully. It is less effective for an AI agent trying to extract a direct answer to ‘what does dental implant treatment cost in London’. Moving key information closer to the top of the page benefits both audiences.
  • Aesthetics and beauty clinics. Sites in this sector frequently describe the same treatment across multiple pages with minor variations in wording. That creates ambiguity for AI systems trying to identify the most authoritative version of a page. Consolidating treatment content and making each page more definitive reduces that ambiguity and improves citation potential.
  • Education and training providers. Course pages that include structured information about learning outcomes, duration, entry requirements and course content are well suited to AI retrieval. Where they tend to fall down is in using internal terminology, acronyms or jargon that an AI system may not parse correctly.
  • Professional services and B2B businesses. Service pages that are vague about what is offered, who benefits and what the process involves are difficult to interpret for human readers and AI alike. Being specific and outcome-focused is a straightforward improvement that helps on both fronts.

Practical changes to prioritise

None of the following requires a site rebuild. These are targeted adjustments that improve how content is processed by AI systems without compromising the experience for human readers.

  • Front-load the key information. Answer the question being asked within the first two paragraphs. Extended scene-setting before reaching the substance increases the risk of truncation in AI-driven environments.
  • Use descriptive, specific headings. Headings are one of the primary signals AI agents use to understand what a section covers. ‘Our approach’ or ‘How we help’ are much harder to interpret than ‘What the treatment involves’ or ‘How long results typically last’.
  • Remove unnecessary repetition. Pages that repeat the same information across multiple sections, or that include lengthy disclaimers before any substantive content, dilute the signal-to-noise ratio for AI retrieval systems.
  • Make specific facts easy to extract. Prices, timings, eligibility criteria, qualifications and outcomes are the details AI systems actively try to pull. Burying them inside long paragraphs reduces the accuracy of what gets cited.
  • Review structured data. Schema markup for FAQs, services, reviews and local business information provides AI systems with a cleaner, more reliable layer of data to draw from. If this has not been reviewed recently, it is worth prioritising.
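Several of these checks can be automated. As one illustration, here is a sketch that flags skipped heading levels (an h2 jumping straight to an h4, for instance) using only the Python standard library; the HTML fragment is a made-up example:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect heading levels and flag skipped levels (e.g. h2 -> h4)."""

    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Record h1-h9 tags in document order.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

    def skipped_levels(self):
        # A jump of more than one level down breaks the hierarchy.
        return [
            (a, b)
            for a, b in zip(self.levels, self.levels[1:])
            if b > a + 1
        ]

# Made-up page fragment for illustration.
audit = HeadingAudit()
audit.feed("<h1>Title</h1><h2>Section</h2><h4>Detail</h4>")
print(audit.levels, audit.skipped_levels())
```

Clean heading hierarchies are one of the primary signals AI agents use to decide what a section covers, so a skipped level is worth fixing even when the page looks fine visually.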

AEO extends SEO; it does not replace it

Agentic Engine Optimisation is not a reason to abandon the fundamentals of technical SEO. Strong site architecture, well-structured content, authoritative backlinks and clear crawlability all remain important signals. Research consistently shows that AI Overviews and similar systems draw heavily from pages that already rank well organically. The work already done on search optimisation is not wasted.

What is changing is the standard applied to the content layer. A page that ranks well but communicates poorly in structured, concise terms may start to lose ground to a better-optimised competitor. That gap will widen as AI-driven search becomes more central to how people find information. As AI platforms continue to develop their search and advertising infrastructure, the stakes for being cited accurately and prominently will increase.

The question worth asking of any page on your site is not whether it reads well but whether an AI agent can extract what it needs quickly, accurately and without ambiguity. For many sites, the honest answer is that it cannot, and that is a gap worth closing sooner rather than later.


Google’s March 2026 core update was more volatile than December’s, but the warning signs were already there

April 17, 2026 Posted by Sean Walsh Round-Up 0 thoughts on “Google’s March 2026 core update was more volatile than December’s, but the warning signs were already there”

Google’s March 2026 core update appears to have caused a much more dramatic reshuffle than the December 2025 update, with significantly heavier movement across top rankings and far less stability in the results. According to SE Ranking data reported by Search Engine Land, 79.5% of top-three URLs changed position, while 24.1% of pages that were in the top 10 dropped out of the top 100 entirely. That is a much sharper level of disruption than we saw after the December update.

For anyone working in digital marketing, this matters because core updates can quickly affect visibility, traffic and lead generation. They are not niche SEO events. They can alter how discoverable your brand is in Google almost overnight. We have already covered how these kinds of changes can ripple into newer search experiences too, particularly in our article on how AI Overviews are affected by Google core updates.

What is a Google core update?

A Google core update is a broad change to the main systems Google uses to rank content in search. Rather than targeting one specific tactic or technical issue, these updates affect how Google evaluates the overall quality, relevance and usefulness of pages across a huge range of queries.

That is what makes them so important. A core update is not just about whether a site has done something wrong. More often, it is Google reassessing which pages deserve to rank most prominently.

In practical terms, core updates matter because they can:

  • shift rankings across entire sectors, not just individual websites
  • reduce traffic even when nothing on your site has changed
  • reward pages that Google now sees as more useful, relevant or trustworthy
  • affect leads, enquiries and revenue, not just SEO reports

If you need a simpler backgrounder for clients or colleagues, this fits with the same pattern we discussed in our piece on Google’s November 2024 core update and what digital marketers need to know. We also looked at a completed rollout in our coverage of the June 2025 core update.

Why this update matters

The March 2026 update looks more severe than December because the search results were far less stable at every major ranking tier. In the top three positions, only 20.5% of URLs held their exact place, down from 33.1% in December. In the top 10, that dropped to just 9.3%, compared with 16.9% previously. This was not a light reordering of similar results. It was a deeper reset.

That matters commercially too. When pages drop out of the top 10 or disappear from the top 100, the consequences are rarely limited to reporting dashboards. The knock-on effect is often felt in:

  • organic traffic
  • lead volume
  • enquiry quality
  • overall revenue performance

For brands that rely heavily on organic search, a volatile core update can quickly become a much wider business issue.

We were already expecting something like this

What makes this update interesting is that while the scale of disruption was large, it did not come as a complete surprise.

In the days before official confirmation, many marketers were already seeing unusual ranking turbulence. Search results were moving more than normal, visibility looked unsettled across multiple sectors, and there was a sense that Google’s results were wobbling before the formal announcement arrived. That kind of pre-update instability often points to broader recalibration already taking place.

The logic is fairly straightforward. Google rarely goes from total calm to full-scale disruption with no signs at all. More often, there is a period where results begin to fluctuate, some sectors become noisier than others, and tools start showing elevated volatility before Google confirms a rollout.

That is why this March update felt less like a bolt from the blue and more like a formal confirmation of what the SERPs were already hinting at. In practice, this is one reason it is worth watching turbulence patterns closely instead of relying only on Google’s official announcements.

A useful way to think about it is this:

  • isolated ranking drops can happen for all sorts of reasons
  • wider volatility across sectors is more meaningful
  • sustained turbulence often signals that Google is testing or preparing broader change
  • by the time an update is confirmed, the effects may already be underway

Why the timing made everything feel even noisier

Another reason this rollout felt especially messy is that it came immediately after Google’s March 2026 spam update. That spam update finished unusually quickly, in less than a day, and the core update began just after it. The overlap makes attribution more difficult because some of the disruption may have been amplified by the proximity of both changes.

We covered that separately in our article on Google’s March 2026 spam update, but the key point here is that when two major Google systems shift close together, the search landscape can look more chaotic than usual. That does not always mean every ranking swing should be taken at face value in the moment. It often makes more sense to step back and watch the wider direction of travel once the dust settles.

What marketers should take from this

For marketers, the biggest takeaway is that core updates are not abstract SEO events happening in the background. They shape who gets seen, who loses visibility and which sites Google currently trusts to answer users’ questions.

This also reinforces the importance of building content that offers real value, not just surface-level optimisation. Stronger brands, clearer expertise, original insight and more useful pages are generally better placed to weather this kind of turbulence. That broader shift also connects to what we explored in our piece on keyword research and search intent, because ranking well is no longer just about matching terms. It is increasingly about being the result Google believes best satisfies the intent behind the search.

There is also a wider search context to keep in mind. Google’s search experience is changing beyond the traditional blue links, which is why articles like our look at Google’s move towards AI Mode matter too.

For marketers, the practical takeaway is to focus on:

  • genuinely useful content
  • stronger signals of expertise and trust
  • clearer alignment with search intent
  • a broader view of visibility beyond simple rankings

Final thought

The March 2026 core update looks more volatile than December’s, but the early signs of turbulence suggested Google was already preparing to make bigger changes. In that sense, the update was dramatic, but not entirely surprising.

For digital marketers, the lesson is simple. Pay attention when the SERPs start to feel unusually unstable. Those moments often tell you that Google is preparing to make a broader judgement about what deserves to rank, and by the time the official announcement lands, that process may already be well underway.



Google’s March Core Update Complete

April 10, 2026 Posted by Matthew Widdop Round-Up 0 thoughts on “Google’s March Core Update Complete”

Google has finished rolling out its March core update, which followed the spam update that ran across 24-25 March. The core update started on 27 March and finished rolling out on 8 April, which means SEOs can now compare pre- and post-update organic performance to see how their sites have been affected.

What This Means for Marketers

As with any core update, rankings are likely to change for better or for worse. If your rankings have dipped, it doesn’t necessarily mean you have violated one of Google’s policies; core updates are Google reassessing how well content serves the end user. If pages have dropped in rankings and do not regain their positions within the next few weeks, enhancing those pages is the right approach moving forward.

Some in the SEO community have noted that this core update was not as powerful as some recent ones, such as the December 2025 core update, meaning severe ranking volatility is less likely. Keeping an eye on how your websites perform post-update for any positive or negative bounces is still key.

As Glenn Gabe writes, as reported by Search Engine Roundtable: “The March 2026 broad core update was a weird one. I’ve been documenting what I’ve been seeing via my “Core Update Notes” on X, and it just didn’t seem as powerful as some previous broad core updates. For example, the December 2025 broad core update was huge. We saw the update land quickly, and it was extremely powerful. The March 2026 broad core update didn’t land quickly and just didn’t seem to be as powerful. Sure, there were definitely sites that saw big surges or drops, but overall, the update seemed less powerful.”

Google didn’t release much information about the latest update, such as the goals it hoped to achieve, which can normally be found on its update companion blog. This also suggests the March core update was not as significant as some previous updates.

Google did announce in December that smaller core updates would roll out over the coming year, and the March core update appears to be what it was referring to.

Futureproofing against Core Updates

As with any core update, there are several steps you can take now to protect your rankings against large volatility in future.

Making sure your content is up to date and thorough is one of the most important factors. If your content is thin and new sites are emerging with more robust and informational content, your rankings are likely to slip when a new update rolls around.

Also, make sure you aren’t breaking any of Google’s policies, as violations can cause serious damage to rankings. Google publishes several policy documents that take you through how to create good, ethical content. One of the most widely discussed recent policies is Google treating low-value, mass-produced content, including content that relies heavily on AI writing, as spam.

Following Google’s policies while creating detailed, accurate content will allow you to future-proof yourself from any core updates coming up.


GA4 Using Projections to Estimate Data

March 27, 2026 Posted by Matthew Widdop Round-Up 0 thoughts on “GA4 Using Projections to Estimate Data”

Google Analytics 4 is the dominant analytics platform on the web, with over 32 million live websites currently using it, accounting for over 78% of websites that use traffic analysis tools. Over the years, as cookie restrictions and privacy consent have become more stringent, GA4 has faced serious issues with data collection, namely data being withheld due to a lack of user consent. Having access to accurate data is crucial for site owners to make informed decisions about how to improve their sites. In this article, we’ll discuss how GA4 has tried to combat incomplete data, and the alternative platforms you can use.

How Has GA4 Handled Missing Data?

Google has started using AI to model how users likely behave on your site, with metrics such as user counts and conversions projected from similar data gathered from users who have given cookie consent. Estimates suggest this can recover a significant portion of lost data, but there is no way to know how accurate the resulting datasets are. When reporting conversions, user counts and sessions to clients, data needs to be accurate.

For large datasets, GA4 also uses data sampling: it analyses only a subset of the data and extrapolates the results. This makes reporting more scalable, but once again precision is lost.
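To illustrate why sampling trades precision for scalability, here is a small, self-contained Python simulation. This is illustrative only, not GA4’s actual sampling algorithm: it estimates a conversion total from a 5% sample and compares it with the true figure.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Simulate 100,000 sessions, roughly 2% of which convert (illustrative figures).
sessions = [1 if random.random() < 0.02 else 0 for _ in range(100_000)]
true_conversions = sum(sessions)

# Take a 5% sample and extrapolate, as a sampling system might.
sample = random.sample(sessions, k=5_000)
estimated_conversions = sum(sample) * (len(sessions) / len(sample))

# The estimate is close, but not exact: that gap is the precision being traded away.
error_pct = abs(estimated_conversions - true_conversions) / true_conversions * 100
```

Run it a few times with different seeds and the error moves around, which is exactly the problem when small margins matter in reporting.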

Why Is Precision so Important in Data Reporting?

It may not seem detrimental for GA4 to be estimating data if it still gives you a fairly accurate picture of how your business is performing, but there are limitations to how far you can improve performance with estimated rather than accurate data.

  • Small Margins. If you’re working with tight margins and your data only changes incrementally, estimates may wipe out these margins altogether, or you may not feel you can rely on them enough as solid data to make decisions that can positively impact your business.
  • Budget Allocation. There are different ways in which to spend the budget to maximise performance. If you’re relying on imprecise data to determine which channels are your strongest performing, then you could be allocating budget to the wrong areas of your site.
  • Forecasting Implications. When projecting how your site will perform in the future, you often use historical data as a benchmark, for example to understand the peak times of year when your site is busiest. If that data is itself projected, forecasts built on it can be misguided.

What alternatives are available?

Here at Intelligency, we have been using Matomo as an alternative to GA4 for client reporting. Matomo describes itself as “a web & app analytics for teams who demand accuracy.” It is GDPR compliant and puts data ownership in your hands rather than Google’s. Matomo achieves greater accuracy by reporting only on data it directly observes, using minimal modelling compared with Google’s model-heavy approach, which leads to better insights and outcomes.


Google’s March 2026 spam update finished fast. Here’s what marketers should know

March 27, 2026 Posted by Sean Walsh Round-Up 0 thoughts on “Google’s March 2026 spam update finished fast. Here’s what marketers should know”

Google has completed its March 2026 spam update, and unusually, it rolled out in less than a day. The update began on 24 March 2026 and finished on 25 March 2026, making it one of the quickest confirmed Google updates in recent memory.

For anyone working in digital marketing, the main thing to know is that this was not presented as a broad quality update or a major rethink of how search works. Google described it as a standard spam update affecting all languages and all regions. In simple terms, it was designed to improve Google’s ability to detect and reduce spam in search results.

Why this matters

Whenever Google confirms an algorithm update, marketers understandably start looking for changes in traffic, rankings and visibility. That will happen here too. If a website sees a sudden drop or unusual volatility over the next few days, this update could be part of the reason.

That said, Google’s position is fairly clear. This update is meant to target sites and tactics that break its spam rules, not legitimate websites doing sensible SEO, publishing useful content, and earning visibility in a fair way.

So for most brands and marketing teams, this should not be a moment for panic. It is more of a reminder that shortcuts in search still carry risk, and that Google is continuing to tighten its systems.

What Google is actually saying

Google explains that its anti-spam systems run all the time, but from time to time it makes more notable improvements and publicly labels those as spam updates. It also points to SpamBrain, its AI-based spam prevention system, which is regularly updated to detect new forms of manipulation.

For marketers, the important takeaway is this: Google is not only looking for obvious spam. It is continually improving how it spots patterns that appear manipulative, low-quality or designed purely to game rankings.

That could include tactics such as low-value scaled content, manipulative links, doorway pages, hidden content, or other approaches that prioritise search engines over real users.

What to watch in your reporting

If you manage websites, campaigns or client reporting, keep an eye on organic performance over the next several days rather than reacting instantly to one bad day. Look for meaningful movement, not noise.

Pay particular attention to whether:

  • Traffic drops are isolated to a few pages or spread across the site
  • Declines are happening on pages that may have weaker content or questionable optimisation tactics
  • Ranking changes are matched by reduced clicks and conversions, not just position shifts
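One way to operationalise the checks above is to compare per-page sessions before and after the rollout and flag whether declines are isolated or site-wide. A rough Python sketch, where the threshold values and page data are placeholders for your own reporting figures:

```python
def classify_drop(before, after, drop_threshold=0.20, sitewide_share=0.5):
    """Flag pages whose sessions fell by more than drop_threshold, and
    classify the pattern as 'site-wide' or 'isolated'."""
    dropped = [
        page for page, baseline in before.items()
        if baseline > 0 and (baseline - after.get(page, 0)) / baseline > drop_threshold
    ]
    pattern = "site-wide" if len(dropped) / len(before) >= sitewide_share else "isolated"
    return dropped, pattern

# Illustrative session counts for the week before and the week after the update.
before = {"/services": 1200, "/blog/guide": 800, "/contact": 300}
after = {"/services": 1150, "/blog/guide": 450, "/contact": 290}

dropped, pattern = classify_drop(before, after)
```

An isolated pattern points towards reviewing specific pages; a site-wide one suggests watching the wider direction of travel before reacting.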

A short wobble does not necessarily mean a site has been hit. Search results often move around briefly after an update.

What this means for SEO strategy

This update is another sign that the safest long-term strategy remains the same. Brands should focus on content that is genuinely useful, pages built for users first, and search visibility earned through credibility rather than manipulation.

It also reinforces an important point for clients and stakeholders: not every ranking drop is caused by a competitor doing something clever, and not every recovery can be forced quickly. If Google believes a site has spam signals, improvements can take time to be recognised.

Google also notes that with link spam in particular, there is an added complication. If spammy links once boosted a site’s visibility, removing their effect does not restore those gains later. In other words, artificial wins can disappear permanently.

The practical takeaway

For most marketers, this update is not a call to action so much as a health check. If your SEO approach is grounded in useful content, clear site structure, good user experience and trustworthy promotion, there is little reason to overreact.

If, however, parts of a site rely on thin content, scaled landing pages, aggressive link tactics or other questionable shortcuts, now is a good time to review them before performance becomes a bigger issue.

The March 2026 spam update was quick, but its message is familiar. Google is still getting better at identifying tactics that try to manipulate search rather than serve users. For marketers, that is another reminder that sustainable SEO is not about tricks. It is about building something worth finding.


Where to Focus Your Technical SEO with Limited Resources

March 20, 2026 Posted by Matthew Widdop Round-Up 0 thoughts on “Where to Focus Your Technical SEO with Limited Resources”

One of the biggest wins in SEO is making sure your site is optimised from a technical perspective. This is essential for improving the user experience: fixing broken links, improving site speed and optimising titles all make your website easier to navigate and more valuable to search engines. However, when running a technical SEO audit, especially for the first time, the volume of tasks can be overwhelming, making it hard to know where to start.

You may also find that many issues flagged in a technical SEO audit are developer-led tasks that an SEO may struggle to fix alone. This can leave technical optimisation incomplete, or see developers tasked with fixing issues of little real importance to performance. Bruce Clay speaks about this in his latest article for Search Engine Land:

“One of the biggest hurdles for in-house SEO programs is the lack of resources to implement changes to the website.

  • Up to 67% of respondents cite non-SEO dev tasks as the biggest reason technical SEO changes can’t be made, according to Aira’s State of Technical SEO Report.”

Page Speed and User Experience

One of the first places to focus when improving technical performance is page speed. Conversion rates rise and bounce rates fall dramatically as pages get faster, and speed is a strong signal Google uses when ranking content. Google offers a free tool, PageSpeed Insights, which lets you run a URL and check its speed, feeding back key metrics and insights on how to improve performance.

Running a speed optimisation plugin on your site, such as WP Rocket, will let you address page speed issues including delaying JavaScript execution, reducing unused JavaScript and CSS, and lazy-loading images and videos. Installing one of these plugins is a quick win, but be aware that they can conflict with other plugins or cause issues when certain settings are enabled. Consult your web developer, and if problems arise you may need to look at different plugins or code-based fixes to improve page speed and user experience.
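PageSpeed Insights also exposes its checks programmatically via the public PageSpeed Insights API (v5), which is useful for monitoring speed across many URLs rather than checking them one at a time. A minimal Python sketch that builds a request URL you could then fetch; the example page is a placeholder, and an API key is only needed at higher request volumes:

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile", api_key=None):
    """Build a PageSpeed Insights API v5 request URL for a page.
    strategy is 'mobile' or 'desktop'."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

# Illustrative usage: fetch this URL with any HTTP client to get the JSON report.
url = psi_request_url("https://example.com/")
```

Scheduling a fetch of this URL for your key pages gives you a simple trend line of speed over time, rather than one-off manual checks.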

Crawling and Indexing

Crawling and indexing are how your site gains search traffic in the first place. Without being indexed in search engines such as Google, it is incredibly difficult for users to reach your site. Google uses crawlers to visit the pages on your site, understand your content and index it appropriately.

When prioritising technical tasks, making sure your main pages are indexed is one of the most important factors. You can use a technical audit tool on platforms such as SE Ranking or Semrush to see whether anything is blocking your pages from being indexed. It is also important to check Google Search Console: even if nothing is blocking a page from indexation, it may still not appear in search results if Google considers it thin or duplicate content. Search Console shows which of your pages Google has indexed and, where it has not, the reason why. If any important pages on your site are being blocked from Google Search, you need to address this.
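Alongside Search Console, a quick first check is whether a page itself carries a noindex robots meta tag, one of the most common accidental causes of de-indexation. A minimal Python sketch using only the standard library; the sample HTML is illustrative:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def is_blocked_from_index(html):
    """Return True if the page carries a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.directives)

# Illustrative page source: in practice this would be fetched from your site.
sample = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
```

Running a check like this across your key landing pages takes seconds and catches a class of problem that audit dashboards sometimes surface days later.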

Keeping on top of both your site speed performance, as well as any crawling and indexing issues, will allow you to maintain solid rankings while creating new and engaging high-quality content, which will produce improved rankings over time.
