Shopify has a new bug: it is getting tons of useless pages indexed in Google again. Is there any way to stop this from happening? The code seems to be editable only by Shopify, since it's injected through content_for_header.
Do a Google search for:
inurl:/web-pixels-manager@
(https://www.google.com/search?q=inurl%3A%2Fweb-pixels-manager%40)
Each site has multiple versions of this page indexed. Peets.com for example has 20.
The pages look blank, but include something like this in the code:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Web Pixels Manager Sandbox</title>
<script id="web-pixels-manager-setup">(function e(e,n,a,o,t){e&&(window.Shopify=window.Shopify||{},window.Shopify.analytics=window.Shopify.analytics||{},window.Shopify.analytics.replayQueue=[],window.Shopify.analytics.publish=function(e,n,a){window.Shopify.analytics.replayQueue.push([e,n,a])});var r,i,s,l,d,c,p,u,f=a+"/"+o+"."+function(){var e="legacy",n="unknown",a=null,o=navigator.userAgent.match(/(Firefox|Chrome)\/(\d+)/i),t=navigator.userAgent.match(/(Edg)\/(\d+)/i),r=navigator.userAgent.match(/(Version)\/(\d+)(.+)(Safari)\/(\d+)/i);r?(n="safari",a=parseInt(r[2],10)):t?(n="edge",a=parseInt(t[2],10)):o&&(n=o[1].toLocaleLowerCase(),a=parseInt(o[2],10));var i={chrome:60,firefox:55,safari:11,edge:80}[n];return void 0!==i&&null!==a&&i<=a&&(e="modern"),e}()+".js";r={src:f,async:!0,onload:function(){if(e){var a=window.webPixelsManager.init(e);n(a),window.Shopify.analytics.replayQueue.forEach((function(e){a.publishCustomEvent(e[0],e[1],e[2])})),window.Shopify.analytics.replayQueue=[],window.Shopify.analytics.publish=a.publishCustomEvent}},onerror:function(){var n=(e.storefrontBaseUrl?e.storefrontBaseUrl.replace(/\/$/,""):self.location.origin)+"/.well-known/shopify/monorail/unstable/produce_batch",a=JSON.stringify({metadata:{event_sent_at_ms:(new Date).getTime()},events:[{schema_id:"web_pixels_manager_load/2.0",payload:{version:t||"latest",page_url:self.location.href,status:"failed",error_msg:f+" has failed to load"},metadata:{event_created_at_ms:(new Date).getTime()}}]});try{if(self.navigator.sendBeacon.bind(self.navigator)(n,a))return!0}catch(e){}const o=new XMLHttpRequest;try{return o.open("POST",n,!0),o.setRequestHeader("Content-Type","text/plain"),o.send(a),!0}catch(e){console&&console.warn&&console.warn("[Web Pixels Manager] Got an unhandled error while logging a load error.")}return!1}},i=document.createElement("script"),s=r.src,l=r.async||!0,d=r.onload,c=r.onerror,p=document.head,u=document.body,i.async=l,i.src=s,d&&i.addEventListener("load",d),c&&i.addEventListener("error",c),p?p.appendChild(i):u?u.appendChild(i):console.error("Did not find a head or body element to append the script")})(null,null,"https://cdn.shopify.com/shopifycloud/web-pixels-manager/0.0.186","sandbox","0.0.186");</script>
</head>
<body></body>
</html>
This is an accepted solution.
Hello everyone.
Thank you for your patience while we worked towards providing a fix to this issue and building out an FAQ that addresses the primary concerns and questions from this thread. You can find our FAQ on the Community Blog.
We welcome you to continue the conversation in the comments section of the blog post with any valid questions that aren't already answered by the FAQ. We will monitor the comments section for any valid feedback on how this change may impact Shopify stores and will actively remove or edit any comments that spread misinformation or speculation on the issue.
On that note, this thread contains some misinformation and speculation about the issue and how it may impact one's online store. For this reason, we will no longer be monitoring this thread, but we want to keep it open and available for historical purposes.
We greatly appreciate all of you for bringing this to our awareness and collaborating with us while we worked towards a solution. @Greg-Bernhardt deserves a special shoutout as they have championed this thread and this issue internally.
I'll be marking this as the solution as this has been resolved and helps surface this reply for anyone who may be new to the thread.
Trevor | Community Moderator @ Shopify
This is an accepted solution.
Hello,
Thank you for your continued feedback on the web pixels issue. More recent replies have been sharing misinformation or misattributing the issue. For this reason, we've chosen to close this thread.
If you believe you have a new issue that isn't answered by our FAQ, then we'd encourage you to create a new thread in our Technical Q&A board with as much detail as possible.
Thank you.
Trevor | Community Moderator @ Shopify
@Greg-Bernhardt wrote: Our goal is for the document to answer all the common questions we're seeing here.
I'd love to hear what actions Shopify is taking to actively get these 404s removed from these domains. Can I expect to see that in the document?
We're already experiencing a legitimate disruption in our performance, but I appreciate the response. Our focus is on getting this resolved. We will be awaiting the support document and instructions on how to proceed.
Hi All,
We've spent the time to put together documentation on this issue, including the origin, the evolution, and the solution.
Definitive SEO Guide to Shopify's Web Pixel Manager Problem
The fix we implemented (initially posted here) has kept these pages out of the index, and our rankings have been steady. The fix today is different than it was then.
I'd be happy to answer specific questions and add them to the page's Frequently Asked Questions.
I found some relevant reading from a page titled "The Worst SEO Nightmares and How To End Them". The speaker quoted below is Paul Shapiro. It's a 5-hour SEMRush stream, but quotes are available here: Source
Shopify should have taken that advice. If only they had this dude on their payroll, that would have been great, huh?
If your GSC 404 and noindex reports have been flooded with pixel URLs, please DM me a screenshot of those reports in GSC showing the URLs and dates. Don't crop out the report name; I need to see which report each screenshot is from. Also copy and paste a few URLs as text links in the message so that I can visit them. Thanks!
@Greg-Bernhardt what are we supposed to do? Wait for you to fix this at Shopify, or am I better off employing an SEO expert to fix it on our end?
I don't want issues where an SEO expert fixes it, you then implement your own fix, and the two clash and cause more problems.
Can you give me a straight answer, do I:
1) Leave it and Shopify will fix this for all merchants across the board
OR
2) Employ someone to fix this issue on my website/theme alone
We can't leave it like this.
I was the second person to report this issue in this post, back in 2022. I can't believe it's taking this long not just to get a fix, but even to get answers that make some kind of sense, or any legitimate assurance beyond "a document will follow soon" with no specific timeline mentioned.
My impressions keep dropping and I still have no idea how to fix it. Since the middle of March, my organic traffic has dropped significantly. Please advise if you have any idea how to solve this.
Noindex tag: 700 → 11,000 since mid-March
Canonical tag: 5,000 → 13,000 since mid-March
@George_Greenhil you don't need to hire anyone. Please wait for our official document to be released very soon.
See how your site is doing and when/if this trouble started: https://feinternational.com/website-penalty-indicator/
Hi Greg,
Can you explain why I'm getting new web-pixels-manager 404s?
As in, not from March; from April, as recently as the 13th.
This problem has not been solved. At all. The issues keep arising. Your comment referencing the quote is misleading at best. While a single 404 may only take a day or two longer to drop out than a 410, we're dealing with tens of thousands of them. If Shopify had caused us a couple hundred 404s, nobody would be in this thread right now. It has caused thousands, and that's the problem.
Greg, can you PLEASE have Shopify prove they know what they're doing? Every time the issue is "fixed," nothing happens; we just continue to receive 404 errors. Shopify has added 4,058 404s to the main site I work with. Every single action Shopify has taken so far has increased the number of 404s. I have three store owners wanting me to build sites on other platforms, and I'd really rather stick with Shopify.
I need a date, a roadmap, or something; anything other than a dodged question or a misleading quote. If you can't provide an accurate answer, then please escalate this.
To Store Owners:
As a side note to everyone else reading this while wondering how they're going to make up for their loss of sales, I've been hearing rumors of a class action coming, so keep your eyes peeled.
It's enough that we have to deal with the recession we're in. No one is saying it out loud, but the whole world is in a recession right now. People are not spending money like they used to, so sales are already down for the majority of business owners. Everyone is feeling the pinch, and on top of that we have to deal with this nonsense!
Thank you Kbarcant, well said.
Google is supposed to "figure out" that the pages are no longer endorsed by our website when it sees that they're no longer linked to. Shopify could accelerate this process with 410s, but they say it's "not worth their time".
Right now these pages are no different from any other non-existent page on your site; Google only thinks they existed because of internal links from product pages, and those links have been fixed. Now that the internal links have been removed, Google will no longer see those pages as pages you claim are part of your website. "Eventually" they will go away. I advised Shopify that returning a 410 error for those pages would fix the problem more quickly (per one of Google's senior search devs); they said it wasn't worth the effort.
Maybe if they owned the shops that were affected by a problem they created, they'd have more motivation to fix things.
@kbarcant wrote: Right now these pages are no different from any other non-existent page on your site; Google only thinks they existed because of internal links from product pages, and those links have been fixed. Now that the internal links have been removed, Google will no longer see those pages as pages you claim are part of your website.
Checked today. These links are being picked up from the iframe:
Yes, that is exactly the link that caused the indexation. So it is still internally linked, but not visible in the frontend. Google still sees those links. They are just set to noindex, so Google won't index these pages. They are still available to Google and will be as long as they exist; Google will just crawl these URLs less often.
Were you able to remove that iframe and prevent further issues? I can't seem to find it on my site, and the only mentions of wpm in the source code are in scripts, not links.
@Greg-Bernhardt Once Google updates GSC, the indexed-page counts will update. The noindex tag on these wpm@ pages has had no effect for Google. Why couldn't Shopify stop the code from producing such pages?
@jackzhu please DM me one of those links marked as Mar 28
Hello Greg,
I have not posted in this thread previously, but we are experiencing the same issue.
In Google Search Console we have around 25 "web-pixels-manager@..." 404 Not Found errors, with Google last-crawl dates ranging from March 11 to March 25.
I am unable to send you a DM (my account does not have a way to send one) and do not want to post the links publicly here. If you send a message to the email on record for this account, I would be happy to forward the URLs to you.
Thank you in advance for your help.
Best Regards,
Susan
Hi Greg, I just DMed you one of those links. Please check.
1) Block Indexing of These Files on Your Site
In your Shopify admin, go to Online Store -> Themes, click the three dots beside your theme, and choose Edit code.
In the code editor, open robots.txt.liquid and add the following to the bottom (a sketch of the resulting file follows these steps):
Disallow: /web-pixels-manager
2) Remove Indexed Pages from Google
Log into Google Webmasters (Search Console) and from the menu choose Indexing -> Removals.
Click the NEW REQUEST button.
Enter the following URL but replace example.com with your URL:
https://example.com/web-pixels-manager
Click "Remove all URLs with this prefix" and continue to submit the request.
Be careful not to mistype the URL, or you could end up deindexing all or portions of your site.
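For reference, here's a rough sketch of what the finished robots.txt.liquid from step 1 could look like. It assumes you're starting from the standard template Shopify generates (the Liquid loop that reproduces the default rules) and simply appends the extra rule after the loop; treat it as an illustration rather than an official Shopify snippet.
# we use Shopify as our ecommerce platform
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
# Appended rule: keep the web pixels sandbox pages out of the crawl
Disallow: /web-pixels-manager
Note that anything placed after the loop lands below the last user-agent group in the generated robots.txt; if you want the rule attached explicitly to the catch-all (*) group instead, see the scoped variant sketched later in the thread.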
Hey @p1Commerce
Thanks for this. Are we definitely okay to do this, or might it conflict with something if Shopify makes changes on their end (even though it doesn't look like they're going to)?
Thanks,
George
Customizing your robots.txt file won't cause any conflicts as your file would replace the default.
The second step in Google Webmasters should be done regardless and will have no impact on Shopify.
Hello,
Thank you for trying to help.
However, on my Edit code page there is no file named robots.txt.liquid, or anything starting with "robots"!
I am using the Dawn theme.
If your theme doesn't already contain the robots.txt.liquid template, you can add it in the code editor: under Templates, choose to add a new template and select robots.txt.
Your directions were easy to follow - thanks so much!
Remember, if you already have indexed pages, wait for them to drop out of the index before implementing the robots.txt rule, or Google will not be able to see the noindex and your pages will remain indexed.
Step 2 forces the removal from Google so you don't have to wait.
That doesn't remove the pages from the index; it only hides them from results temporarily, I think for six months.
When "temporarily" removing it from Google via step 2 is done in conjunction with robots.txt in step 1, it will remove it permanently, since Google has no way to re-add it to the index; I've done this hundreds of times and it works every time.
Hey Dededus. You will need to add the robots.txt.liquid template in order to customize it. Go to "Add a new template" and then select robots.txt to add it.
Hi,
Under URL Removal > Submitted requests, my request shows:
URL: starts with https://dededus.net/web-pixels-manager
Type: Temporarily remove URL
Status: Temporarily removed
That's how it looks. I hope everything is fine.
Hi Dededus,
I suggest checking and removing:
site:dededus.net/wpm
About 3,200 of your URLs are indexed in Google.
Hello Splodge1,
Thank you for your interest. How can I see and control such problems?
By the way, I upgraded to Dawn 8.0. The robots.txt.liquid file was deleted, so I added it again as follows; the file is removed when you upgrade!
I requested removal of the prefixes in Google Search Console.
Thank you for the warning.
# we use Shopify as our ecommerce platform
{%- comment -%}
# Caution! Please read https://help.shopify.com/en/manual/promoting-marketing/seo/editing-robots-txt before proceeding to make changes to this file.
{% endcomment %}
{% for group in robots.default_groups %}
{{- group.user_agent -}}
{% for rule in group.rules %}
{{- rule -}}
{% endfor %}
{%- if group.sitemap != blank -%}
{{ group.sitemap }}
{%- endif -%}
{% endfor %}
Disallow: /web-pixels-manager
Disallow: /wpm
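One variation on the file above, for anyone adapting it: Shopify's robots.txt.liquid documentation also shows how to attach extra rules to a specific default group rather than appending them after the loop, which keeps the new Disallow lines under the catch-all (*) user agent. A sketch of that variant, assuming the documented group.user_agent.value check and the same two rules:
# we use Shopify as our ecommerce platform
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- comment -%} Extra rules scoped to the catch-all (*) group only {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /web-pixels-manager' }}
    {{ 'Disallow: /wpm' }}
  {%- endif -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
The scoped version makes it explicit that the rules belong to the * group rather than to whichever group happens to be rendered last.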
Thanks so much for these fixes!
Thanks..
That's exactly how we tried to solve it in the beginning. The problem is that Google indexes these URLs even though they're blocked by robots.txt. As we didn't find a way to add noindex/nofollow to this handle, that was the only option for us.
Now that noindex/nofollow has been added by the Shopify team,
Disallow: /web-pixels-manager needs to be deleted from robots.txt, as the noindex can't be read by Google otherwise.
@Denny Step 2 above ("Remove Indexed Pages from Google") will remove the pages from Google's index.
Yes, I did that as well, because it was the only thing I could do. Google still lists them as "Indexed, though blocked by robots.txt", and if I do a site: query in Google, those URLs are still counted as indexed (even if they are not shown). Sending a request to Google would make it crawl them again, but as robots.txt blocks the URLs, they are not going to be kicked out.
That's why I am not sure whether Google really treats those pages as "not indexed" or whether it still values and counts them as part of your indexed pages, which I definitely want to avoid.
And that is the reason why I chose the safe route with Shopify's new noindex, nofollow implementation. I deleted the statement from robots.txt so that the noindex/nofollow directives can be read by Google. My experience is that noindexed pages (without robots.txt) are not going to be crawled much either, so it doesn't waste crawl budget. That's why I don't see any advantage of robots.txt over the new noindex tag.
@Denny10 I followed the two steps I posted above for a client; it took a few days for the index to update, but it's showing removed:
If your pages are still showing as "Indexed, though blocked by robots.txt", that to me does indicate they are in the index, and that perhaps the removal tool wasn't used or wasn't used correctly. You seem very knowledgeable, so I'm not sure what the issue would be there.
The temporary removal tool should remove them per the screenshot above.
After they are removed from the index "temporarily", either a noindex directive as you're suggesting, or a block in robots.txt as I've suggested above, would do the trick to keep them out of the index permanently.
Regarding crawl budget, Google checks the URL pattern against robots.txt and avoids crawling altogether if a disallow match is found, so robots.txt disallows do save crawl budget.
It was previously thought by many SEOs that noindexed pages did use crawl budget because pages needed to be crawled in order to read the noindex directive, but John Mueller and Gary Illyes have stated that there's no amount of noindexed pages that will negatively impact a website's crawl budget (Reference: https://www.searchenginejournal.com/google-noindexed-pages-do-not-impact-crawl-budget/472870/).
Either way then, should do the trick.
I have a few clients with millions of pages, so I do prefer robots.txt: I feel it makes crawling faster for Google, since it can avoid reading bad pages altogether, and I like to do everything I can to make Google's job easier.
I also haven't seen any native functionality in Shopify that makes noindexing these types of pages easy.
Thank you for your help.
You are so much clearer than Shopify itself.
Did anyone try contacting Google and asking them what to do about this issue? That's what Shopify told me to do.
Good question Shaindel...
For our site, the "Web Pixel" pages are no longer being indexed by Google. However, each week we're seeing a substantial increase in the number of "Excluded by 'noindex' tag" pages.
Does this affect SEO in any meaningful way, or is it just a nuisance? Clearly it would be a more complete solution if these pages were not generated in the first place.
@Italia-Straps John Mueller previously stated that the number of noindexed pages is not a concern:
https://www.searchenginejournal.com/google-noindexed-pages-do-not-impact-crawl-budget/472870/
That said, if these pages are no longer indexed, you can prevent these reports in Search Console by blocking the URLs via robots.txt.
I'd agree, tons of noindexed pages are no problem at all.
I guess there is nothing more you can do than wait until all the pages are crawled again so that they'll be deindexed. For me, no more pages have been indexed since the fix 🙂 Seems like most of you are experiencing the same.
Crawl logs also look pretty normal, so crawl budget isn't a problem for our shop.