SEO Pages with Parameters

benperry
Visitor

Hello,

 

A well-known problem on Shopify: the ?pr_prod_strat=collection_fallback&pr_rec_id parameters create a bunch of URLs that are of no interest to Google.

 

In addition to consuming crawl budget for nothing (thanks, Shopify), this can cause serious problems:
- Google choosing a canonical URL different from the one declared
- Google indexing the same page under two different parameterized URLs, creating duplicate content...

 

This article got a lot of attention: https://ed.codes/blog/weird-shopify-seo-issue-millions-of-indexed-pages
It recommends disallowing crawling of all URLs that contain parameters.
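
For reference, on Shopify that would be done through the robots.txt.liquid template. A minimal sketch following Shopify's documented customization pattern (the pr_prod_strat rule itself is my assumption, based on the parameters above):

{% comment %} templates/robots.txt.liquid: keep Shopify's default rules, add one Disallow {% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?pr_prod_strat=*' }}
  {%- endif -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}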


However, since these URLs (with parameters) are the ones used in the recommendations sections etc., they are the ones carrying the internal linking.

So if I disallow them, will Google stop taking internal linking into account for all the pages on my site?
Or will it pass the internal linking juice back to the declared canonical URLs?

 

Presumably, if a URL is disallowed, Google doesn't crawl the page, so it never even sees the declared canonical URL...
In that case, what's the solution to avoid wasting crawl budget (millions of parameter combinations) while also avoiding having the same page indexed twice?
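
For reference, the declared canonical here is just the standard tag Shopify themes output in layout/theme.liquid, where canonical_url is (as far as I know) the page URL with the parameters already stripped:

{% comment %} layout/theme.liquid: a crawler blocked by robots.txt never fetches the HTML, so it never sees this line {% endcomment %}
<link rel="canonical" href="{{ canonical_url }}">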

 

I've seen this solution passed around the Shopify community: https://community.shopify.com/c/shopify-discussions/removing-url-parameters-recommended-products-sec...
It would solve the problem for crawlers... but apparently it would break Shopify's tracking?
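
If I understand that thread correctly, the idea is to strip the query string where the theme renders the recommendation links. A sketch only (it assumes the theme's related-products section loops over recommendations.products, and this stripping is exactly what apparently breaks the tracking):

{% comment %}
  In the theme's product recommendations section: keep only the part
  of the URL before the '?', dropping the pr_prod_strat / pr_rec_id
  tracking parameters.
{% endcomment %}
{% for recommended_product in recommendations.products %}
  <a href="{{ recommended_product.url | split: '?' | first }}">
    {{ recommended_product.title }}
  </a>
{% endfor %}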

 

Very simply, what I'm looking for is:
1) disallow URLs with parameters (to save crawl budget)
2) while keeping the internal linking visible to crawlers
