Shopify has a new bug - they are getting tons of useless pages indexed in Google again. Is there any way to stop this from happening? The code seems to be editable only by Shopify, since it's injected via content_for_header.
What you said has nothing to do with the problem. These pages need to be deindexed, not blocked from crawling, which is all that editing a robots.txt file would do. Also, this has nothing to do with backlinks, so there is nothing to disavow. Thanks, but not helpful.
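For anyone else reading: blocking and deindexing are different mechanisms, which is why robots.txt can't fix this. A `Disallow` rule only stops Googlebot from fetching a URL; anything already in the index stays there, and a blocked page can't even serve a `noindex` signal, because Google never fetches it to see the tag. A quick illustration (the path pattern here is hypothetical, not Shopify's actual URL format):

```
# robots.txt — prevents future crawling, does NOT remove already-indexed URLs
User-agent: *
Disallow: /pixel-sandbox/      # hypothetical path pattern

<!-- meta robots — removes a page from the index, but only works if
     Googlebot is still ALLOWED to crawl the page and see this tag -->
<meta name="robots" content="noindex">
```

Combining the two is actually counterproductive: if you disallow the URL, Google can never see the noindex.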
Pretty amazing how aggressive Googlebot has been with indexing these pages. In one week, Peets.com has grown from 10 to 374. Total indexed went from ~1 million up to 10 million as of tonight.
I contacted Shopify, so they are aware of this. To fix it, they will need to have this script serve up ONLY the JavaScript, instead of what it does now, where it serves up the JavaScript plus `<html>`, `<head>`, and `<body>` tags. Additionally, to deindex all the existing pages, they will need to move this JavaScript file to a new URL and ideally return a 410 error for the old URL format so it gets out of the index as quickly as possible, though they could also 301 redirect it to the new URL.
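Only Shopify can implement this server-side, but to make the proposed fix concrete, here is a minimal sketch of the routing behavior described above. The URL paths are hypothetical placeholders, not Shopify's real routes:

```python
# Simplified sketch of the server-side fix described above.
# The paths below are assumptions for illustration -- Shopify's real
# URL formats for the pixel sandbox differ.

OLD_PIXEL_PATH = "/pixel-sandbox-old"     # hypothetical old URL format
NEW_PIXEL_PATH = "/pixel-sandbox-new.js"  # hypothetical new URL

def handle_request(path):
    """Return (status, headers, body) the way the fixed endpoint should."""
    if path == OLD_PIXEL_PATH:
        # 410 Gone signals permanent removal, which tends to drop a URL
        # from Google's index faster than a plain 404.
        return 410, {}, ""
    if path == NEW_PIXEL_PATH:
        # Serve ONLY JavaScript with a script content type -- no
        # <html>/<head>/<body> wrapper for Google to index as a page.
        return 200, {"Content-Type": "application/javascript"}, "/* pixel js */"
    return 404, {}, ""
```

The key point is the content type: a response served as `application/javascript` with no HTML wrapper gives Google nothing to treat as an indexable page in the first place.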
Seeing the same here in terms of huge new levels of indexation.
I also contacted Shopify to inform them this is happening to basically all merchants on the Shopify platform.
I advised that this could practically destroy some store’s SEO and organic rankings if it’s not fixed quickly.
Unfortunately, support doesn't really seem to understand the issue and/or doesn't seem to be acting very fast on this. There is also no way for them to put me in touch with an SEO specialist at Shopify so I could explain the bug they have created directly. Everything has to go via general support, who don't really understand the urgency.
Same here. I contacted them and insisted that this is a real emergency that needs to be fixed ASAP, but it seems they don't understand, and don't really try to understand, the issue.
This issue impacting the indexing results of Web Pixel Sandbox on Google has now been resolved on our end. As Google controls their own indexing, it may take some time to see the result.
If you continue to experience any issues, please try clearing your cache, or let us know by posting a new topic with as much detail as possible.
Shopify - it’s pretty ■■■■■■ that you marked this as “resolved”. I DID NOT accept your solution.
More importantly, the way you “fixed” this issue is not truly a fix. You simply added “noindex, nofollow” to the pages but did not address the root reason Googlebot is spending any time crawling these URLs in the first place. Because of this, Googlebot will continue to waste our crawl budget on these URLs whether your noindex/nofollow directive “fix” is there or not. As I mentioned above, you need to remove the `<body>`, `<html>`, and `<head>` tags, 404/410 the old URL format, and use a new one.
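To spell out the crawl-budget point, here is a deliberately simplified model of how a crawler treats the different responses. This is my own illustrative sketch of the general behavior, not Google's actual logic:

```python
# Toy model: does a URL keep consuming crawl budget, and does it stay
# indexed? An illustrative simplification, not Googlebot's real algorithm.

def crawler_outcome(status, noindex=False):
    if status in (404, 410):
        # Fetched, seen to be gone, dropped from the index, and
        # eventually scheduled out of the crawl queue: budget is freed.
        return {"indexed": False, "keeps_consuming_budget": False}
    if status == 200 and noindex:
        # The crawler must still fetch the page every time just to SEE
        # the noindex tag, so the URL is deindexed but budget is still
        # spent on it.
        return {"indexed": False, "keeps_consuming_budget": True}
    # Ordinary 200 response: crawled and indexed.
    return {"indexed": True, "keeps_consuming_budget": True}
```

That middle case is exactly what Shopify's “fix” produces: the pages drop out of the index, but Googlebot keeps burning crawl budget fetching them.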
Hi! Google last crawled my site on Jan 29, which was after you claim to have fixed it, and that was the first time I got the issue. Can you explain which cache I should clear?
I’m getting just a blank page when I open my [email removed] listings in Google. Is that what Shopify intends to happen now with the fix in place? @gregbernhardt mentioned in another thread that Shopify fixed it so the pixel pages are now throwing 404s.
Clearing the cache is not an acceptable solution. The code still exists within the site, and now Google Ads is flagging it as malicious code and blocking ads.
Can you please provide a better solution, such as rewritten code for the pixels manager and instructions on how to insert it in place of the original?