Why Is Google Adding SRSLTID To My URLs?
Sep 27, 2024 - Written by Joshua DeGrasse-Baumann - 8 min read
Over the last month or so, ecommerce sites have been dealing with an issue tied to a mysterious `srsltid` parameter that is being appended to their URLs in search. But where did this parameter come from and what does it do?
The `srsltid` parameter is typically seen in URLs when traffic is referred from Google Shopping Free Listing campaigns. It’s a tracking parameter that helps Google and advertisers understand the source of the click. Essentially, when a user clicks on a free product listing, the `srsltid` parameter is appended to the URL to provide insights into where that click originated. This is functionally the same as a `gclid` parameter that’s appended to ads. The key difference is… you don’t typically see the `gclid` parameter on the search results page—which is what’s started to happen with `srsltid`.
Now, that’s strange enough—but even more odd is that these parameters are being appended to many URLs for sites instead of being isolated to products in shopping feeds.
Given their new prominence, site owners and SEOs are asking… what are these parameters? And do they matter?
The answer is… complicated. Let’s explore this from a few angles.
Why Is Google Indexing `srsltid` URLs?
The short answer is: it isn’t. Though it may seem like a semantic difference, the link Google is showing for organic results with `srsltid` parameters does NOT reflect the “indexed” version of the page. It’s still the base URL of your page that is “indexed”—Google’s just adding additional information to it.
As a quick example, I did a search for `playing cards for sale` and then refreshed that search five times over the course of a minute. Here’s the “linked” URL for each of those searches:
- https://playingcarddecks.com/?srsltid=AfmBOoqSE-AmgwEVDycwF0wsagWXSLsxavTxtyDwwVOut_o8nVCl3zcC
- https://playingcarddecks.com/?srsltid=AfmBOopOoCGLJS4dFuzwwTxrqgJlvBtXIjely34tENXTJZEBLFKv3JTi
- https://playingcarddecks.com/?srsltid=AfmBOopcv_f60kUa-MJ3XxGt3Meg2Du5uQgipGWeKO14MIq2XQnH9Rqc
- https://playingcarddecks.com/?srsltid=AfmBOopftaFQ0-YbTp1I3SCnIOlrBJIkQyezkt7pWznbj48c9w-V1aqE
- https://playingcarddecks.com/?srsltid=AfmBOoryXGekZazEbi86fbUqW_eTUthZxeMutewuxs2ussVcX5Q77KTx
As you can see, the `srsltid` parameter is slightly different every time. Google hasn’t indexed each of these versions—it’s just showing me the originally indexed URL and appending a unique tracking number each time I view a result.
This is functionally the same as someone clicking a Paid Search ad and receiving a new `gclid` parameter—it’s not a “new” page, just a page with a tracking parameter on it.
Will Google Index `srsltid` URLs?
I wouldn’t definitively say that Google won’t index these URLs.
There are all sorts of edge cases where indexing might actually happen, but they’d typically be caused by a secondary issue, such as someone linking to a URL with the `srsltid` parameter that is not properly canonicalized to its base URL. Google might find that linked URL and mistakenly index it as a “unique” URL—but for most sites, this type of indexing shouldn’t happen at a scale they’d need to be concerned about.
Recommended Action: Double-check the canonical tags on your pages. If they point to the base URL (without any parameters), you should be fine in the vast majority of cases. If the canonical tag does NOT point to the base URL—or doesn’t exist—then implement a proper canonical system.
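If your platform doesn’t handle this for you, the underlying logic is simple enough to sketch. Here’s a minimal TypeScript example of the idea (the function name and parameter list are my own, for illustration): strip tracking parameters from the requested URL to produce the href your canonical tag should point to.

```typescript
// Tracking parameters that should never define a canonical URL.
// This list is illustrative, not exhaustive.
const TRACKING_PARAMS = ["srsltid", "gclid", "fbclid"];

// Given the URL a page was requested on, return the base URL that
// the page's <link rel="canonical"> tag should point to.
function canonicalHref(pageUrl: string): string {
  const url = new URL(pageUrl);
  for (const param of TRACKING_PARAMS) {
    url.searchParams.delete(param);
  }
  return url.toString(); // the "?" disappears once the query is empty
}

// canonicalHref("https://playingcarddecks.com/?srsltid=AfmBOoq...")
//   -> "https://playingcarddecks.com/"
```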
But Why Does `srsltid` Appear In Search Console?
In this case, Search Console is showing you the clicked URL—but it’s still NOT the same as the indexed URL.
Now, Search Console is supposed to report all data to the Google-selected canonical URL, but in some cases it just… doesn’t.
Is it great? Nope. But it’s mostly a reporting headache as opposed to an SEO issue.
Recommended Action: Honestly, not much you can do here, unfortunately. If you do reporting or analysis outside of Search Console, you can likely find a way to aggregate the unique `srsltid` parameters back into one URL.
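As a sketch of what that aggregation could look like: if you export Search Console performance data (say, pages and clicks), a few lines of TypeScript can collapse the parameterized variants back into their base URL. The row shape here is an assumption based on a typical export, not an official Search Console API type.

```typescript
interface PerformanceRow {
  page: string;   // URL as reported by Search Console (assumed shape)
  clicks: number;
}

// Strip srsltid so every variant of a page collapses into one key,
// then sum clicks per base URL.
function aggregateClicks(rows: PerformanceRow[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const row of rows) {
    const url = new URL(row.page);
    url.searchParams.delete("srsltid");
    const base = url.toString();
    totals.set(base, (totals.get(base) ?? 0) + row.clicks);
  }
  return totals;
}
```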
Why Does `srsltid` Appear in GA4, Semrush, or {Other Tool}?
For GA4, it’s reporting whatever URL is reported via the tracking code—which is most typically whatever appears in the address bar on page load. In this case, users clicking a result from search will get the `srsltid` parameter appended to the URL, which is then reported to GA4.
For Semrush and other rank tracking tools, they’re just reporting the URL as seen when they checked the rankings. They don’t inherently know that `srsltid` isn’t meant to be a unique URL. That said, some tools are starting to automatically filter this out of their reporting (at least Ahrefs seems to be doing that already).
Recommended Action: For GA4, you could make some modifications to how the page location is reported via the tracking code, but this requires some custom implementation, either to the GA4 code itself or to the setup in Tag Manager, if you installed it that way. For other tracking tools, there’s not likely much you can do other than request they implement a solution on their end.
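If you’re using gtag.js directly, one approach is to override the `page_location` field at config time with a cleaned URL. A rough sketch (the measurement ID is a placeholder, and you should verify this against your own GA4 setup before deploying):

```typescript
// gtag is provided by the GA4 snippet; declared here for completeness.
declare function gtag(...args: unknown[]): void;

// Build a cleaned page_location with srsltid removed.
const cleaned = new URL(window.location.href);
cleaned.searchParams.delete("srsltid");

// Report the cleaned URL to GA4 instead of the raw address-bar URL.
// "G-XXXXXXXXXX" is a placeholder measurement ID.
gtag("config", "G-XXXXXXXXXX", {
  page_location: cleaned.toString(),
});
```

In Tag Manager, the equivalent move is a custom JavaScript variable that returns the cleaned URL, assigned to the `page_location` field in your GA4 configuration tag.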
Does `srsltid` Create Duplicate Content?
Similar to the indexing question, the answer here is mostly no. As long as you have a proper canonical system in place (which is a fairly basic SEO requirement these days), you shouldn’t have any issues with duplicate content.
While Google can ignore canonical tags, I’ve yet to see any widespread content duplication issues across my clients—and nothing that suggests impacts to overall performance.
Recommended Action: As above, make sure your pages have proper canonical tags on them.
Does `srsltid` Impact Crawl Budget?
Again, the answer here is probably not. Googlebot is not actively crawling these URLs from search, so it’s not trying to crawl them on your site.
Recommended Action: Do NOT add this parameter to your robots.txt file. This is a common recommendation I’ve seen from other people, but I STRONGLY disagree. The robots.txt file should mostly be reserved for links that exist for non-SEO reasons. And in this case, there typically aren’t links pointing to these URLs. And as a reminder, the robots.txt file should NEVER be used to block indexing.
Does `srsltid` Impact Page Speed?
Now here we’ve got an interesting question. The `srsltid` parameter does not inherently have a meaningful impact on page speed—it would operate as any other URL with parameters in most cases. But where it CAN have an impact is on caching. If you’re using a system to show cached assets to users (and Google), the inclusion of a new parameter may force that caching to fail.
Think of your website’s caching system like a well-organized library where a URL is a book. When everything is in order, the librarian can quickly find any book you request. However, when a dynamic URL parameter like `srsltid` is added, it’s like constantly rearranging the library shelves—each new parameter creates a unique version of the same book, making it harder to find quickly.
Most caching systems treat each unique URL as a separate request. This means that every time a parameter like `srsltid` changes, the server has to generate the page from scratch instead of serving a cached version. This not only increases page load time but can also negatively impact your site’s Core Web Vitals metrics, like First Contentful Paint (FCP) and Largest Contentful Paint (LCP).
Recommended Action: Review your site’s caching configuration to ensure it can ignore specific parameters like `srsltid`. If not, consider implementing a solution that standardizes how these URLs are treated to maintain consistent caching behavior.
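The exact mechanism depends on your CDN or caching layer, but the core idea is the same everywhere: compute the cache key from a URL with `srsltid` stripped, so every variant hits the same cached entry. A generic sketch of that normalization (the function and parameter list here are illustrative, not any particular CDN’s API):

```typescript
// Parameters that should not fragment the cache. Illustrative list.
const IGNORED_PARAMS = ["srsltid", "gclid", "fbclid"];

// Normalize a request URL into a cache key: strip ignored parameters
// and sort the rest so equivalent URLs produce identical keys.
function cacheKey(requestUrl: string): string {
  const url = new URL(requestUrl);
  for (const param of IGNORED_PARAMS) {
    url.searchParams.delete(param);
  }
  url.searchParams.sort();
  return url.toString();
}

// Both resolve to the same cached entry:
// cacheKey("https://example.com/?srsltid=AfmBOoq...") -> "https://example.com/"
// cacheKey("https://example.com/?srsltid=AfmBOop...") -> "https://example.com/"
```

Many CDNs expose this as a configuration option (for example, custom cache keys or “ignore query string” settings), so check your provider’s documentation before writing code.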
Is `srsltid` a Major Concern?
For most sites, the `srsltid` parameter is not going to significantly impact SEO performance. It should not be impacting rankings, indexation, or crawl budget.
That said, there are a few potential issues this parameter can cause:
- Reporting headaches
- Page speed issues (primarily a UX concern, but that can impact Core Web Vitals)
- MINOR indexation issues if proper canonical tags aren’t in place
Frequently Asked Questions
How can I tell if Google is indexing `srsltid` URLs?
To check whether Google is indexing these URLs, look at the “Coverage” and “Performance” reports in Google Search Console. If you see parameterized URLs listed there, it means Google has discovered them. However, appearing in these reports does not necessarily mean they are fully indexed.
Should I block `srsltid` URLs using robots.txt?
No. Blocking URLs via robots.txt DOES NOT prevent indexation—and can actually create problems with indexation in the future. A better approach is to use canonical tags to indicate the preferred version of the page to Google, or to consider a noindex tag where appropriate.
How do `srsltid` parameters affect page speed and caching?
URL parameters can disrupt caching mechanisms because each unique URL is treated as a different page. This can result in slower page load times, which negatively impacts user experience and key metrics like FCP and LCP.