Shopify has a new bug: they are indexing tons of useless pages in Google again. Is there any way to stop this from happening? The code seems to be editable only by Shopify, since it's injected through content_for_header.
Do a Google search for:
inurl:/web-pixels-manager@
(https://www.google.com/search?q=inurl%3A%2Fweb-pixels-manager%40)
Each site has multiple versions of this page indexed; Peets.com, for example, has 20.
The pages look blank, but their source includes something like this:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Web Pixels Manager Sandbox</title>
<script id="web-pixels-manager-setup">(function e(e,n,a,o,t){e&&(window.Shopify=window.Shopify||{},window.Shopify.analytics=window.Shopify.analytics||{},window.Shopify.analytics.replayQueue=[],window.Shopify.analytics.publish=function(e,n,a){window.Shopify.analytics.replayQueue.push([e,n,a])});var r,i,s,l,d,c,p,u,f=a+"/"+o+"."+function(){var e="legacy",n="unknown",a=null,o=navigator.userAgent.match(/(Firefox|Chrome)\/(\d+)/i),t=navigator.userAgent.match(/(Edg)\/(\d+)/i),r=navigator.userAgent.match(/(Version)\/(\d+)(.+)(Safari)\/(\d+)/i);r?(n="safari",a=parseInt(r[2],10)):t?(n="edge",a=parseInt(t[2],10)):o&&(n=o[1].toLocaleLowerCase(),a=parseInt(o[2],10));var i={chrome:60,firefox:55,safari:11,edge:80}[n];return void 0!==i&&null!==a&&i<=a&&(e="modern"),e}()+".js";r={src:f,async:!0,onload:function(){if(e){var a=window.webPixelsManager.init(e);n(a),window.Shopify.analytics.replayQueue.forEach((function(e){a.publishCustomEvent(e[0],e[1],e[2])})),window.Shopify.analytics.replayQueue=[],window.Shopify.analytics.publish=a.publishCustomEvent}},onerror:function(){var n=(e.storefrontBaseUrl?e.storefrontBaseUrl.replace(/\/$/,""):self.location.origin)+"/.well-known/shopify/monorail/unstable/produce_batch",a=JSON.stringify({metadata:{event_sent_at_ms:(new Date).getTime()},events:[{schema_id:"web_pixels_manager_load/2.0",payload:{version:t||"latest",page_url:self.location.href,status:"failed",error_msg:f+" has failed to load"},metadata:{event_created_at_ms:(new Date).getTime()}}]});try{if(self.navigator.sendBeacon.bind(self.navigator)(n,a))return!0}catch(e){}const o=new XMLHttpRequest;try{return o.open("POST",n,!0),o.setRequestHeader("Content-Type","text/plain"),o.send(a),!0}catch(e){console&&console.warn&&console.warn("[Web Pixels Manager] Got an unhandled error while logging a load error.")}return!1}},i=document.createElement("script"),s=r.src,l=r.async||!0,d=r.onload,c=r.onerror,p=document.head,u=document.body,i.async=l,i.src=s,d&&i.addEventListener("load",d),c&&i.addEventListener("error",c),p?p.appendChild(i):u?u.appendChild(i):console.error("Did not find a head or body element to append the script")})(null,null,"https://cdn.shopify.com/shopifycloud/web-pixels-manager/0.0.186","sandbox","0.0.186");</script>
</head>
<body></body>
</html>
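A stopgap some merchants can try while waiting on Shopify (a sketch, not an official fix): Online Store 2.0 shops can override the generated robots.txt by adding a robots.txt.liquid template in the theme code editor. The sketch below keeps Shopify's default rules and appends extra Disallow lines; the /wpm@* and /*?q=* patterns are assumptions based on the URLs reported in this thread, not rules from Shopify documentation.

{% comment %} templates/robots.txt.liquid - emit Shopify's default rules, then append custom Disallow lines (assumed patterns) to the general bot group {% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /wpm@*' }}
    {{ 'Disallow: /*?q=*' }}
  {%- endif -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}

One caveat on this approach: a Disallow only stops new crawling, and it also prevents Google from re-crawling a page to see any noindex served there, so already-indexed sandbox URLs may take longer to drop out.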
This is an accepted solution.
Hello everyone.
Thank you for your patience while we worked towards a fix for this issue and built out an FAQ that addresses the primary concerns and questions from this thread. You can find our FAQ on the Community Blog.
We welcome you to continue the conversation in the comments section of the blog post with any valid questions that aren't already answered by the FAQ. We will monitor the comments for feedback on how this change may impact Shopify stores and will actively remove or edit any comments that spread misinformation or speculation on the issue.
On that note, this thread contains some misinformation and speculation about the issue and how it may impact one's online store. For this reason, we will no longer be monitoring this thread, but we want to keep it open and available for historical purposes.
We greatly appreciate all of you for bringing this to our attention and collaborating with us while we worked towards a solution. @Greg-Bernhardt deserves a special shoutout, as they have championed this thread and this issue internally.
I'll be marking this as the solution, since the issue has been resolved; that also helps surface this reply for anyone new to the thread.
Trevor | Community Moderator @ Shopify
This is an accepted solution.
Hello,
Thank you for your continued feedback on the web pixels issue. More recent replies have been sharing misinformation or misattributing the issue. For this reason, we've chosen to close this thread.
If you believe you have a new issue that isn't answered by our FAQ, then we'd encourage you to create a new thread in our Technical Q&A board with as much detail as possible.
Thank you.
Trevor | Community Moderator @ Shopify
THIS HAS NOT BEEN RESOLVED AT ALL!!
I have hundreds of new 404s as of today that look like this: wpm@0.0.239@1bfdbe36waf26f8b1p2c0f542dm9f61a9b0/sandbox/it-ca/products/
I have been following this thread and the previous thread that was marked as "solved". It has not been solved in the slightest. I have set up redirects to try to get these errors to drop off. Stop coming back with your baffling "solution" when clearly there is nothing in place!!!
Trevor - please reply to my post of 06-27-2023 10:53 AM .... this issue is not going away until you arrange for Google to clear our Google Search Consoles of all the junk.
Thanks in advance for your reply.
This problem of 404s won't go away until Shopify stops the pages from being 404s - and they're still all 404s. I watched two sites go out of business and one lose 70% of its sales and be forced to downsize over these bugs. Expect no changes; it's clear we're on our own.
On top of that, the vendors?q= spam is back for me?
When will this damn nightmare end!
I haven't seen any improvement on my end either, but it seems everyone has gone silent since last month. I suspect they have already resolved their issues.
Hey @jasonwill21
Mine is not resolved. I'm not being silent on purpose; I've just given up after asking them to fix this for the past six months.
That's a sad case indeed. Additionally, some SEO experts told me that having a lot of 404 pages will drag our SEO traffic down. Here's what I have, and 90% of them belong to "WPM" links.
What can we all do now? I've got the same problem. Can we create a social media group to discuss this? I think more people out there need to know, or nothing will change!
I would be very interested in this.
I think it's best to call Customer Support and escalate to their management - there's no use posting here .... it just gets ignored, and makes us frustrated.
Bad news - something might have just changed for the worse. Since Shopify fixed the discovery issue (a script within a script), we hadn't seen any new errors. Then on July 1st, Google found 4,200 new WPM 404s on the website.
Per Search Console: Google is currently being referred to these 404 pages directly from the real pages they correspond to.
Here we go again! Hopefully this time the issue actually gets fixed.
Yep, I just noticed that too: a jump in my 404s from 1,777 to 3,877 on July 1st. My noindex count also went up from 69k to 83k.
I also got lots of new spam vendors?q= pages. I thought that issue was fixed?
Will we ever get a break this cursed year!
I'm not holding my breath; my 404 count just doubled. We didn't even get that many new 404s on 3/13-3/15. Today is, on paper, worse than the worst day of this problem to date. Time to sit back and watch the traffic drop, coincidentally and completely unrelated to this.
I think this is expected, especially since many are seeing the same thing. From the start we communicated that the noindexed pages would convert to 404s over time as the WPM pixel URLs changed for versioning (each release changes the version string in the URL, so old sandbox URLs stop resolving - compare the 0.0.186 and 0.0.239 URLs earlier in this thread). This will not affect your performance. The 404s will clear over time. What we don't want to see is an increase in WPM noindex reports.
What about the vendors?q= spam pages issue? I thought that was resolved, and suddenly it's back?
From what I can see on my side, it looks like websites were served to Google without a robots.txt present (or at least a usable one) for a few days. You can verify that this is why you're affected by going to the "Blocked by robots.txt" section of errors and looking for a gap of a couple of days where the count flatlines. An error like this - the complete deletion of robots.txt - has the potential to completely wipe a site off the map, and that's according to Shopify's own documentation.
To be clear, robots.txt carries the rules that tell Google to hide everything it shouldn't be serving. The blocked pages include theme previews, search results, checkout, admin, order status pages, sorted collections, and the customer account section. There are other methods for hiding things, but robots.txt is the front-line worker, and it was missing for days on end.
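For reference, the file Shopify generates normally looks roughly like this (an abbreviated, approximate excerpt - the exact rules vary by store and have changed over time, and your-store.example is a placeholder):

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /orders
Disallow: /account
Disallow: /search
Disallow: /collections/*sort_by*
Sitemap: https://your-store.example/sitemap.xml

If Google crawled during a window where this file came back empty or unreachable, every one of those paths becomes crawlable, which lines up with the error spikes reported above.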
Google detecting 4,200 new 404 pages is not expected. The issue isn't that they're 404s; the issue is that Shopify is providing links directly to 404 pages. These are new pages. Please stop making them.
Can you please provide evidence for your claim that these 404s are being linked to?
Greg,
I have tried to keep calm and balanced on this topic, which has been going on for eight months or so.
My "Not found (404)" and "Crawled - currently not indexed" both doubled with the last GCS report:
I am tired of wasting hours trying different things to clean up my GSC .... NOTHING works. My ask that you get Google to do a reset of all the Shopify urls so that the buffer was cleared was ignored.
Now I am back to seeing lots and lots of junk in my GSC - again.
I really wish I hadn't migrated to Shopify (which I did around Oct 2022 - talk about bad timing; I had thought they would look after me as a small fish!) and had stayed with another provider / WordPress.
Shopify has decided NOT to ask Google to clear the buffer, so we just have to wait (for how long ..... who knows; no number of months is given .... it might be 10,000 months for all we know).
I am soooooo annoyed. I will now actively tell all my startup colleagues NOT to use Shopify.
We did ask Google about clearing GSC, and they said they couldn't for technical reasons.
Does your "Crawled - currently not indexed" report contain WPM URLs?
How high did you escalate this within Google ..... CEO to CEO? Of course Google doesn't "want" to do this - it's a negotiation!!!
Yes, there are about 10 WPM URLs and a few other nasties in there.
I am soooo tired of spending time on this issue.
I am just about to launch, and it looks like I need to find another provider and ditch Shopify ..... sheezzz. That is a lot of work for me.
Greg,
Have you considered that the problem is coming in through http:// (and not https://)?
I ran the following reports in GSC under "URL inspection" for http://www.XYZ.com and http://XYZ.com (as examples), and as you can see, there are referrers that are not mine.
Why / who are these referrers? Since it's http://, perhaps hackers are exploiting it .... how about just shutting down http:// and leaving only https:// operational?
Thoughts?
Greg ..... netorginfo looks like a hacker ..... it sets off my virus protection alarms - no one should type it in. Whoismind is a "search any IP address" provider .... it seems between the two of them they might be attacking us.
Greg - PLEASE look into this - I noticed it about 6 months ago, but thought someone else would better understand whether it is causing problems .... and didn't know who to report it to, or how.
It looks like robots.txt was blocking nothing for a few days. Are you seeing something similar?
Yup …. spam back too
Now I don't feel like I'm going crazy, thank you. This kills me because we've been fighting 404s forever now with very little progress, and now we have 1.5x the number of 404s we had when this issue started.
Pre-WPM we had fewer than 10 404s; WPM wave 1 got us to 4k, and now we're closing in on 8k. It has basically doubled, and had we not cleared out old products, there's no doubt in my mind it would have fully doubled.
Did your "Blocked by robots.txt" drop to zero?
The devs looked into these dates and nothing was done on our end regarding robots.txt files.
Hi Vicky,
Your steps seem right for your situation. Our "Indexed, though blocked by robots.txt" count has also gone to nearly zero, which is as expected.
UPDATE:
Looks like robots.txt was either offline or served a nearly blank file for a few days. Glad to see that there's absolute turmoil on the back end.
"Incorrect use of the [robots.txt] feature can result in loss of all traffic."
We're also seeing the same thing reported by other stores since July 1st: a sizable increase in 404s and "Excluded by 'noindex' tag" pages. It's all the same wpm stuff and vendors spam as before. I can also confirm the same weird GSC "Blocked by robots.txt" data the poster above is reporting.
In general, there appeared to be a period in June where the noindex pages topped out and even began to drop slightly, but as of July 1 they look to be on the rise again. 😞
Thanks
I'm fairly certain this all stems from the mass robots.txt deletion and is actually hardly related to the issues that caused the WPM errors previously.
No robots.txt = no robots.txt blocking = new errors
When robots.txt went down we had 5,000 new pages get added to "Crawled - currently not indexed" and 4,000 new 404 pages.
This is turning into a whitepaper I'll be calling "The Worst Technical SEO Nightmares and How To Make Them Worse" that I release when this is all over.