kbarcant
Shopify Partner
Status: Offline
Last Activity: 08-20-2023 10:22 AM
Topics Started: 1
Solutions: 0
Posts: 50
Likes: 80

3/14 - Google indexed 3,000+ pages on one particular site I'm working with. 3/15 - Traffic reduced from ~1,000 to ~750 overnight. 3/16 - Traffic reduced from ~750 to ~500. This was addressed i...

User Activity

Hopefully this helps. I wrote this up this morning for all of the store owners, like my clients, who just need to understand what's going on. I did my best to put it in plain English, since Shopify is directed at those who aren't necessarily professional ...
We went from <10 404s to nearly 10,000, along with roughly 10,000 other errors. We also went from 1,000 daily users to 250. We've been repeatedly told by Shopify that it's unrelated and that no additional fixes are being rolled out as there's no is...
The URLs being blocked are the proper ones; there's no issue with what Shopify is blocking now or what they were blocking in the past. The problem is that for ten days they weren't blocking anything, at least according to Google Search Console. Shopi...
Hey everyone, we recently got a pretty strange error - robots.txt appears to have stopped blocking pages almost entirely for a short period. On 6/19 it was blocking 5,758 pages of our site. On 6/20 it was only blocking 2. On 7/1 it went back up to 6,38...
I'm fairly certain this all stems from the mass robots.txt deletion and is actually barely related to the issues that caused the WPM errors previously. No robots.txt = no robots.txt blocking = new errors. When robots.txt went down we had 5,000 new pag...
From what I can see on my side, it looks like websites were served to Google without robots.txt present (or at least usable) for a few days. You can verify that this is why you're affected by going to your "Blocked by robots.txt" section of errors and...
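If you'd rather sanity-check outside Search Console, here's a rough Python sketch (not an official Shopify or Google tool; the store domain and sample paths are placeholders) that fetches your live robots.txt and asks whether Googlebot would be allowed on a few paths the default Shopify robots.txt normally disallows:

```python
# Rough sketch, not an official tool: the domain and sample paths below are
# placeholders - swap in your own store and the URLs you care about.
from urllib import robotparser

STORE = "https://example-store.myshopify.com"
SAMPLE_PATHS = ["/admin", "/cart", "/checkout", "/search"]  # typically disallowed by Shopify's default robots.txt

rp = robotparser.RobotFileParser()
rp.set_url(f"{STORE}/robots.txt")
rp.read()  # fetches and parses the live robots.txt; a missing or blank file means nothing gets blocked

for path in SAMPLE_PATHS:
    allowed = rp.can_fetch("Googlebot", f"{STORE}{path}")
    print(f"{path}: {'ALLOWED (unexpected)' if allowed else 'blocked (expected)'}")
```

If everything comes back ALLOWED, your robots.txt is probably missing or near-blank, which would match what's described above.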
Google detecting 4,200 new 404 pages is not expected. The issue isn't that they're 404s; the issue is that Shopify is providing links directly to 404 pages. These are new pages. Please stop making them.
UPDATE: Looks like robots.txt was either offline or served a nearly blank file for a few days. Glad to see that there's absolute turmoil on the back end. "Incorrect use of the [robots.txt] feature can result in loss of all traffic." - Shopify
Did your "Blocked by robots.txt" drop to zero?
It looks like robots.txt was blocking nothing for a few days. Are you seeing something similar?  
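If you want to check the raw file directly rather than waiting on Search Console, a quick sketch like this works - the domain is a placeholder, and the expectation of "a few dozen" Disallow lines is just what I'd expect from a default Shopify robots.txt, not an official figure:

```python
# Quick-and-dirty check of whether robots.txt is being served at all and how
# many Disallow rules it currently contains. Domain below is a placeholder.
import urllib.request

URL = "https://example-store.myshopify.com/robots.txt"

with urllib.request.urlopen(URL) as resp:
    status = resp.status
    body = resp.read().decode("utf-8", errors="replace")

disallow_rules = [line for line in body.splitlines()
                  if line.strip().lower().startswith("disallow:")]

print(f"HTTP status: {status}")
print(f"Disallow rules: {len(disallow_rules)}")
# Zero or near-zero rules here would line up with the "blocking nothing" window above.
```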
Hi Vicky, your steps seem perfect for your situation. Our "Indexed, though blocked by robots.txt" count has also gone to nearly zero, which is as expected.
I'm not holding my breath; my 404 count just doubled. We didn't even get that many new 404s on 3/13-3/15. Today is, on paper, worse than the worst day of this problem to date. Time to sit back and watch the traffic drop coincidentally and completely ...
Now I don't feel like I'm going crazy, thank you. This kills me because we've been fighting 404s forever with very little progress, and now we have 1.5x the number of 404s we had when this issue started. Pre-WPM we had <10 404s, WPM wave 1 got us ...
From 7-1 to 7-3 we got 4,200 new web-pixels-manager 404s on a website with <300 products. These newly discovered 404s are listing the referring URLs as their corresponding real pages, so the error is generated by Shopify. Why is Search Console tellin...
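For anyone wanting to confirm what Googlebot is reporting, here's a rough sketch that walks a Search Console URL export and records the live status code for each page. The CSV filename and the "URL" column name are assumptions about the export format, so match them to whatever your download actually contains:

```python
# Rough sketch: spot-check live status codes for URLs exported from Search Console.
# The CSV filename and "URL" column name are assumptions - match them to your export.
import csv
import urllib.error
import urllib.request

EXPORT_FILE = "search-console-404-export.csv"

with open(EXPORT_FILE, newline="") as f:
    urls = [row["URL"] for row in csv.DictReader(f)]

for url in urls[:20]:  # the first 20 is usually enough to see the pattern
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            print(resp.status, url)
    except urllib.error.HTTPError as e:
        print(e.code, url)  # genuine 404s land here
    except urllib.error.URLError as e:
        print("error", url, e.reason)
```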
Bad news: something might have just changed for the worse. Since Shopify fixed the discovery issue (script in a script) we haven't seen any new errors. On July 1st, Google found 4,200 new WPM 404s on the website. Per Search Console: Google is current...