Hey everyone,
We recently hit a pretty strange issue: robots.txt appears to have stopped blocking pages almost entirely for a short period.
On 6/19 it was blocking 5,758 pages of our site.
On 6/20 it was only blocking 2.
On 7/1 it went back up to 6,389.
Proof is attached.
Our robots.txt has never been modified, and seeing it stop working is extremely concerning given how important Shopify's documentation says it is:
"Incorrect use of the feature [robots.txt editing] can result in loss of all traffic."
Has anyone else seen a similar drop? It's under Google Search Console > Indexing > Pages > Blocked by robots.txt
Thanks!
What URLs are being blocked? The report lists them right underneath. That's what matters.
Shopify has been having problems over the past few months with /collections/vendor spam that we have little control over. Those pages can be blocked in the robots.txt file; a rough sketch is below.
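For anyone who wants to try that: on Shopify you can't edit robots.txt as a plain file; rules are added through a templates/robots.txt.liquid theme template. This is only a minimal sketch that keeps Shopify's default rules and appends one extra Disallow. I'm assuming the spam URLs follow the /collections/vendors?q=... pattern, which may not match your store, so check the URLs GSC actually lists before copying it.

```liquid
{%- comment -%}
  templates/robots.txt.liquid — Shopify renders this into /robots.txt.
  Looping over robots.default_groups keeps all of Shopify's stock rules;
  we only append one extra Disallow to the catch-all (*) group.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}
  {%- if group.user_agent.value == '*' %}
    {{- 'Disallow: /collections/vendors?*q=*' }}
  {%- endif %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Whatever rule you add, load /robots.txt in the browser afterwards and confirm the rendered output looks right before trusting it.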
The URLs being blocked are the proper ones; there's no issue with what Shopify is blocking now or was blocking in the past. The problem is that for ten days they weren't blocking anything at all, at least according to Google Search Console.
Shopify has told me elsewhere that they had absolutely nothing to do with robots.txt failing, even though it's a file they're in complete control of. Since posting, two other store owners have come to me showing the same drop in pages blocked by robots.txt during those same dates.
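In the meantime, if you want to double-check what your live robots.txt actually blocks without waiting on Search Console's reporting lag, a quick script like this sketch works. The domain and test URLs are placeholders; swap in your own. One caveat: Python's parser uses simple prefix matching and doesn't understand Google-style * wildcards, so treat the result as a rough check rather than exactly what Googlebot sees.

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain for illustration; use your own store's URL.
robots_url = "https://example-store.com/robots.txt"

parser = RobotFileParser(robots_url)
parser.read()  # fetches and parses the live robots.txt

# Sample URLs to test; adjust to match the pages GSC reports on.
test_urls = [
    "https://example-store.com/search?q=test",
    "https://example-store.com/collections/vendors?q=spam-term",
]

for url in test_urls:
    verdict = "BLOCKED" if not parser.can_fetch("Googlebot", url) else "ALLOWED"
    print(verdict, url)
```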