As everyone already knows, Google modified the search results parameter last month in true Google fashion: quietly and discreetly. It was still hard to miss, because it wreaked havoc on SEO dashboards everywhere. A major drop in impressions in Google Search Console, tracking data lost beyond the first page in third-party rank trackers, and ranking terms that seemed to disappear are just a few of the dominoes toppled by this seemingly innocuous update.
If you’ve been tracking rankings or monitoring Google Search Console data and noticed something peculiar around mid-September 2025, you’re not alone. This change has fundamentally altered how SEO tools collect data and triggered a debate about the efficacy of the metrics we’ve been relying on.
What Is the &num=100 Parameter?
The &num=100 parameter is a URL parameter that many SEO tools used to retrieve the top 100 organic results on a SERP in a single request. It allowed tools to check site rankings and impressions on the second page and beyond, and made SERP scraping more efficient and reliable by requiring fewer queries. Since this change, a single request returns only the top 20 results (or fewer), sharply reducing the SERP depth available for analytics.
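To make the mechanics concrete, here's a minimal Python sketch, purely illustrative and not taken from any particular tool, of how a scraper typically assembled a SERP URL with this parameter:

```python
from urllib.parse import urlencode

def google_serp_url(query: str, num: int = 100, start: int = 0) -> str:
    """Build a Google SERP URL the way many scrapers did before September 12, 2025.

    The num parameter asked Google for up to `num` organic results on a single
    page; since the change it is no longer honored, and a request returns
    roughly the top 20 results or fewer.
    """
    params = {"q": query, "num": num, "start": start}
    return f"https://www.google.com/search?{urlencode(params)}"

# Before the change, one request like this could cover the full top 100:
print(google_serp_url("rank tracking tools"))
# https://www.google.com/search?q=rank+tracking+tools&num=100&start=0
```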
While the change hasn’t directly affected rankings or traffic, it has altered how we report and collect data. Rank-tracking tools like Ahrefs and Semrush needed to make multiple smaller requests to gather the same dataset, which took longer and cost more. Many SEOs have also noticed fluctuations in Google Search Console — including drops in impression counts and shifts in average position. These changes don’t necessarily indicate a loss in visibility; rather, they reflect the fact that impressions beyond positions 10 or 20 are no longer included in the data.
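To see why collection costs jumped, here's another illustrative sketch of the new request pattern: the same 100-result depth now has to be collected page by page via the start offset, at roughly ten organic results per page (real tools add proxies, throttling, and HTML parsing on top of this):

```python
from urllib.parse import urlencode

def paginated_serp_urls(query: str, depth: int = 100, page_size: int = 10) -> list[str]:
    """Cover the same SERP depth by paging with &start= instead of one &num=100 request."""
    return [
        f"https://www.google.com/search?{urlencode({'q': query, 'start': offset})}"
        for offset in range(0, depth, page_size)
    ]

urls = paginated_serp_urls("rank tracking tools")
print(len(urls))  # 10 requests to reach a depth that a single request used to cover
```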
Some industry professionals have marked mid-September 2025 as a clear “data break” in their dashboards to maintain reporting accuracy when comparing historical trends. The broader takeaway: for the time being, the focus lies squarely on rich results rather than on investing time and resources in tracking rankings buried deeper in the SERPs.
On September 12, 2025, Google removed this functionality entirely without announcing the change through official channels, but it didn’t go unnoticed. A week after the rollout, Google finally responded with a statement: “The use of this URL parameter is not something that we formally support.”
The Timeline of Events
If you’ve felt like the past year in search has been a bit of a rollercoaster, you’re not alone. Just as SEOs were trying to make sense of why impressions and clicks started moving in opposite directions, Google threw us another curveball. To really understand what happened, let’s rewind a bit.
2024: The AI Overviews Era Begins
When Google launched AI Overviews in 2024, SEOs immediately noticed something weird: impressions in Search Console were climbing, but clicks were falling. These two metrics had always moved hand in hand, so the sudden split didn’t make much sense. Many assumed it was a side effect of AI Overviews; after all, if Google was answering more questions right on the results page, users wouldn’t need to click through as often.
Early 2025: The Great Decoupling
By early 2025, that trend only got stronger. Impressions kept going up, while click-through rates continued to drop. The SEO community dubbed it “The Great Decoupling” — a sign, they thought, that AI was changing user behavior and reshaping the traditional search funnel.
September 12, 2025: The Parameter Dies
Then came another twist. On September 12, Google quietly dropped support for the &num=100 parameter, which allowed people (and SEO tools) to view 100 search results at a time. Within days, impressions in Search Console plummeted. Some sites saw desktop impressions drop by over 200,000 per day, and keyword rankings beyond the top 20 appeared to vanish.
The Plot Twist
That’s when SEO consultant Brodie Clark stepped in with a fresh perspective. He noticed that the sharp drop in impressions lined up perfectly with when rank-tracking tools started breaking. His takeaway? Maybe “The Great Decoupling” wasn’t just about AI Overviews after all — it might have been inflated by bot traffic from SEO tools scraping Google at a massive scale.
What Is the Bot Traffic Theory?
Here’s where things get uncomfortable for the SEO industry. Research following the parameter removal revealed shocking statistics:
- 87.7% of GSC properties experienced impression drops after September 12 (Tyler Gargula, LOCOMOTIVE analysis of 319 websites)
- ~15% median impression decline across the board, with large websites hit hardest (Paul Grillet, ThotSEO analysis of 1,334 websites)
- 25% aggregate drop in impressions during the week following the change (Serge Bezborodov, JetOctopus analysis of 1,000 websites)
- Average position declined significantly for positions 21+ immediately after the modification (Matthew Mellinger, 100+ website dataset)
 
A new theory is starting to make sense of it all, and it’s not what most people expected. As AI-powered search tools exploded in popularity and rank-tracking companies tried to keep up, Google may have been hit with a wave of automated scraping. Data from AI licensing startup Tollbit backs this up: at the start of 2025, only about 1 in every 200 visits came from AI bots. By mid-year, that number had jumped to 1 in 50 — a massive surge that likely caught Google’s systems off guard.
These bots generated impressions in Search Console whenever they queried Google, artificially inflating impression counts while human traffic remained flat or declined. The &num=100 parameter made this scraping efficient. Remove it, and suddenly those bot-driven impressions disappeared from GSC data.
How Did Popular SEO Tools Respond to the Google Search Parameter Modification?
The Google search parameter modification caught the industry off guard. Below are the initial responses from the major tools; many have since adjusted their systems and found reliable workarounds, so the data you see in most dashboards is accurate again. Here’s how they adapted:
Semrush reassured users that the top 10 and top 20 results, which power most actionable insights (such as Visibility and Share of Voice), remained fully intact. They emphasized their infrastructure was built to handle the change, though costs would inevitably increase.
Ahrefs acknowledged the disruption and began working on solutions to maintain data quality, even though gathering the data that one request previously delivered now takes 10 requests.
Sistrix initially couldn’t publish daily Visibility Index values and had to suspend some project keyword compilations while re-evaluating data collection depth and frequency.
Keyword Insights and others scrambled to adjust their scraping methodologies.
SerpApi eventually discovered a workaround using the “Google Fast Light API,” allowing 100 results per request again, though its cost and scalability remain to be seen.
Why Did Google Really Do This?
Several theories circulate about Google’s motivation for turning off the parameter:
Theory 1: Curbing Bot Abuse
The most plausible explanation is that SEO tools and AI scrapers were hammering Google’s infrastructure at unsustainable rates. ChatGPT’s reported use of SerpApi for search data may have been the final straw.
Theory 2: AI Overviews Damage Control
Google may want to obscure the actual impact of AI Overviews on click-through rates by muddying historical data comparisons. Without clean pre- and post-data, measuring AI Overview cannibalization becomes more difficult.
Theory 3: Monetization
By making large-scale data collection more expensive, Google indirectly pressures tools and agencies to use its official APIs and paid solutions.
Whatever the reason, the message is clear: the era of frictionless Google scraping is over.
Also read: What is Generative Engine Optimization?
How Can You Adapt Your SEO Strategy to the Change?
Is the removal of the &num=100 parameter the end of rank tracking or a nudge in a new direction? It’s definitely the latter, and those who want to stay ahead of the curve will have to adapt to the current SEO trends. Here are a few things to consider as you navigate the new search landscape:
For Internal SEO Teams
- Mark September 12, 2025, as a baseline reset in all analytics dashboards (see the sketch after this list).
- Focus keyword tracking on positions 1-30, where ROI is highest.
- Treat GSC impression data from before September 2025 with caution when comparing it against newer periods.
- Prioritize conversion tracking over vanity metrics.
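As a practical example of the first two points, here's a minimal pandas sketch. It assumes a hypothetical GSC performance export named gsc_performance_export.csv with date, query, clicks, impressions, and position columns, so adjust the column names to whatever your export actually uses:

```python
import pandas as pd

# Hypothetical GSC performance export; column names vary by export method.
df = pd.read_csv("gsc_performance_export.csv", parse_dates=["date"])

DATA_BREAK = pd.Timestamp("2025-09-12")

# Split reporting at the break so pre- and post-change periods are never
# compared as if they were measured the same way.
df["period"] = df["date"].apply(lambda d: "pre_break" if d < DATA_BREAK else "post_break")

# Keep keyword-level reporting to positions 1-30, where collection is still
# consistent and ROI is highest.
top_30 = df[df["position"] <= 30]

summary = (
    top_30.groupby("period")[["clicks", "impressions"]]
    .sum()
    .assign(ctr=lambda x: x["clicks"] / x["impressions"])
)
print(summary)
```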
 
For Agencies
- Educate clients about the data discontinuity so that declining impression numbers don’t raise false alarms.
- Adjust reporting to emphasize clicks, traffic, and conversions.
- Consider tools that have successfully adapted to the new paradigm.
- Build strategies around rich results and SERP features.
 
Tool Selection Criteria
When evaluating rank tracking tools post-modification, ask:
- How deep do they track by default (top 20, 50, or 100)?
- What’s their data collection frequency?
- Have they implemented workarounds for the parameter removal?
- How transparent are they about data limitations?
 
Perhaps the biggest revelation from this entire episode is that Google Search Console data isn’t the “source of truth” many SEO professionals treated it as; it’s more polluted than we realized. Bot impressions from legitimate SEO tools were inflating metrics across the industry, creating an illusion of visibility.
This doesn’t mean GSC is useless. Clicks remain reliable, and the tool still offers invaluable insights into actual user behavior. But impression data requires more nuanced interpretation, especially when comparing historical trends across the September 2025 divide.
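One way to apply that nuance is to sanity-check whether an impression drop reflects real lost visibility or just the measurement change: compare clicks, which remain reliable, against impressions on either side of the divide. A rough sketch, assuming a hypothetical daily-totals export named gsc_daily_totals.csv with date, clicks, and impressions columns:

```python
import pandas as pd

df = pd.read_csv("gsc_daily_totals.csv", parse_dates=["date"]).set_index("date")

# Average daily totals for the 30 days before and after the parameter removal.
pre = df.loc["2025-08-12":"2025-09-11", ["clicks", "impressions"]].mean()
post = df.loc["2025-09-13":"2025-10-12", ["clicks", "impressions"]].mean()

change_pct = ((post - pre) / pre * 100).round(1)
print(change_pct)
# Impressions sharply down while clicks stay roughly flat points to a reporting
# artifact of the parameter change, not a real loss of visibility.
```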
Also read: How to Rank a Blog on the First Page of Google
Is This the Dawn of a New Era of SEO Measurement?
The modification to the Google search parameter represents a paradigm shift in how we measure and understand search visibility. For SEO professionals, the way forward is clear. Focus on page-one results, prioritize user-centric metrics like clicks and conversions, and ditch vanity metrics.
The tools and strategies that thrive in this new landscape will have to align with Google’s new modus operandi – catering to humans, not bots.
As we close out 2025 and look toward 2026, it’s clear that the SEO industry’s adaptability will keep being tested. Those who stay focused on creating valuable content and earning legitimate visibility will power through.
The &num=100 parameter is gone, but SEO isn’t. If anything, this modification is forcing us to become better, more competent practitioners focused on outcomes rather than optics. And that might be the best outcome of all.