Shopify has a new bug: tons of useless pages are being indexed in Google again. Is there any way to stop this from happening? The code seems to be editable only by Shopify, since it's injected through content_for_header.
Do a Google search for:
inurl:/web-pixels-manager@
(https://www.google.com/search?q=inurl%3A%2Fweb-pixels-manager%40)
Each site has multiple versions of this page indexed; Peets.com, for example, has 20.
The pages look blank, but include something like this in the code:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Web Pixels Manager Sandbox</title>
<script id="web-pixels-manager-setup">(function e(e,n,a,o,t){e&&(window.Shopify=window.Shopify||{},window.Shopify.analytics=window.Shopify.analytics||{},window.Shopify.analytics.replayQueue=[],window.Shopify.analytics.publish=function(e,n,a){window.Shopify.analytics.replayQueue.push([e,n,a])});var r,i,s,l,d,c,p,u,f=a+"/"+o+"."+function(){var e="legacy",n="unknown",a=null,o=navigator.userAgent.match(/(Firefox|Chrome)\/(\d+)/i),t=navigator.userAgent.match(/(Edg)\/(\d+)/i),r=navigator.userAgent.match(/(Version)\/(\d+)(.+)(Safari)\/(\d+)/i);r?(n="safari",a=parseInt(r[2],10)):t?(n="edge",a=parseInt(t[2],10)):o&&(n=o[1].toLocaleLowerCase(),a=parseInt(o[2],10));var i={chrome:60,firefox:55,safari:11,edge:80}[n];return void 0!==i&&null!==a&&i<=a&&(e="modern"),e}()+".js";r={src:f,async:!0,onload:function(){if(e){var a=window.webPixelsManager.init(e);n(a),window.Shopify.analytics.replayQueue.forEach((function(e){a.publishCustomEvent(e[0],e[1],e[2])})),window.Shopify.analytics.replayQueue=[],window.Shopify.analytics.publish=a.publishCustomEvent}},onerror:function(){var n=(e.storefrontBaseUrl?e.storefrontBaseUrl.replace(/\/$/,""):self.location.origin)+"/.well-known/shopify/monorail/unstable/produce_batch",a=JSON.stringify({metadata:{event_sent_at_ms:(new Date).getTime()},events:[{schema_id:"web_pixels_manager_load/2.0",payload:{version:t||"latest",page_url:self.location.href,status:"failed",error_msg:f+" has failed to load"},metadata:{event_created_at_ms:(new Date).getTime()}}]});try{if(self.navigator.sendBeacon.bind(self.navigator)(n,a))return!0}catch(e){}const o=new XMLHttpRequest;try{return o.open("POST",n,!0),o.setRequestHeader("Content-Type","text/plain"),o.send(a),!0}catch(e){console&&console.warn&&console.warn("[Web Pixels Manager] Got an unhandled error while logging a load error.")}return!1}},i=document.createElement("script"),s=r.src,l=r.async||!0,d=r.onload,c=r.onerror,p=document.head,u=document.body,i.async=l,i.src=s,d&&i.addEventListener("load",d),c&&i.addEventListener("error",c),p?p.appendChild(i):u?u.appendChild(i):console.error("Did not find a head or body element to append the script")})(null,null,"https://cdn.shopify.com/shopifycloud/web-pixels-manager/0.0.186","sandbox","0.0.186");</script>
</head>
<body></body>
</html>
This is an accepted solution.
Hello everyone.
Thank you for your patience while we worked towards providing a fix for this issue and building out an FAQ that addresses the primary concerns and questions from this thread. You can find our FAQ on the Community Blog.
We welcome you to continue the conversation in the comments section of the blog post with any questions that aren't already answered by the FAQ. We will monitor the comments for feedback on how this change may impact Shopify stores and will actively remove or edit any comments that spread misinformation or speculation on the issue.
On that note, this thread contains some misinformation and speculation about the issue and how it may impact one's online store. For this reason, we will no longer be monitoring this thread, but we want to keep it open and available for historical purposes.
We greatly appreciate all of you for bringing this to our awareness and collaborating with us while we worked towards a solution. @Greg-Bernhardt deserves a special shoutout as they have championed this thread and this issue internally.
I'll be marking this as the solution as this has been resolved and helps surface this reply for anyone who may be new to the thread.
Trevor | Community Moderator @ Shopify
This is an accepted solution.
Hello,
Thank you for your continued feedback on the web pixels issue. More recent replies have been sharing misinformation or misattributing the issue. For this reason, we've chosen to close this thread.
If you believe you have a new issue that isn't answered by our FAQ, then we'd encourage you to create a new thread in our Technical Q&A board with as much detail as possible.
Thank you.
Trevor | Community Moderator @ Shopify
My Google rank has dropped significantly.
Hey Michael, it has affected my SEO score.
My DR rating went down by 3 points after Google started detecting the errors.
Experiencing the same issue as you.
Did you manage to see a decrease in errors or a fix for your issue?
So Shopify dumped another 250 URLs into my Google search results, this time with /wpm.
I have two questions:
- This has been going on since January; why is it not fixed?
- What is the function of these pixels, and how do they improve the performance of my website?
Answers from someone at Shopify, please.
They've added a 'noindex,nofollow' X-Robots-Tag, which is fine, but if you have any of these URLs already indexed, you can request removal from the index in Google Search Console to speed up the process.
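If you want to verify that header yourself rather than rely on a checker site, here is a minimal sketch, assuming Node 18+ (for the built-in fetch); the URL below is a made-up example, so substitute a /web-pixels-manager@ or /wpm@ URL from your own Search Console report:

// check-x-robots.mjs -- a sketch that prints the X-Robots-Tag response
// header for one of these sandbox pages (run with: node check-x-robots.mjs,
// Node 18+ for the built-in fetch).
// The URL below is a made-up example; substitute a /web-pixels-manager@
// or /wpm@ URL from your own Search Console report.
const url = "https://example.com/web-pixels-manager@0.0.186/sandbox";

const res = await fetch(url, { method: "HEAD" });
console.log("status:", res.status);
console.log("x-robots-tag:", res.headers.get("x-robots-tag"));
// After Shopify's fix you should see: x-robots-tag: noindex, nofollow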
Today, while doing a site audit, we noticed what appears to be a variation of the Web Pixels Manager bug described in this thread.
Instead of "web-pixels-manager@" in the URLs, it's now "wpm@".
We discovered it using the “site:example.com” command in Google.
These pages are getting indexed since there isn't a noindex directive. Like before, the pages appear to be blank, with no content.
Is anyone else out there seeing this? If so, hopefully Shopify can respond quickly with a noindex solution like before.
Screenshot attached...
It's happening to me as well, but it's slightly different in my case.
robots.txt has blocked the crawling of those pages:
Disallow: /wpm@*
Also, noindex,nofollow is used.
Since robots.txt doesn't allow crawling, the noindex,nofollow can't be read by Google. Google has also just indexed one of those pages, and more might follow. The best way to deindex pages that are indexed even though their crawling should be blocked is to remove the robots.txt statement and rely on noindex.
You can additionally request a removal (for URLs starting with domain.com/wpm@) in Search Console and, after deleting the robots.txt statement, request a recheck of the pages listed under "Indexed, though blocked by robots.txt".
Yes, there are two URL patterns now. Use the robots.txt.liquid template on Shopify and add the following (see the sketch after this reply):
Disallow: /web-pixels-manager@
Disallow: /wpm@
Shopify should fix this, as it's becoming an ongoing issue.
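For anyone unsure where those lines actually go: robots.txt.liquid is created under Online Store > Themes > Edit code > Add a new template > robots. Below is a minimal sketch following Shopify's documented robots.txt.liquid customization pattern; note that later replies in this thread recommend removing these rules again so Google can read the noindex header.

{% comment %}
  templates/robots.txt.liquid -- renders Shopify's default rules, then
  appends the two Disallow lines to the user-agent * group.
{% endcomment %}
{% for group in robots.default %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /web-pixels-manager@' }}
    {{ 'Disallow: /wpm@' }}
  {%- endif %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}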
My Google Search Console is full of errors...
Why is this still going on? I thought Shopify fixed this.
All of a sudden my indexed pages jumped by 400 with this nonsense. As a business operator and owner I have tons of problems to deal with, and I really don't need this.
SHOPIFY, YOU NEED TO FIX THIS NOW.
It was fixed, but Shopify has changed the URL pattern. The problem is that they have blocked those URLs in robots.txt and Google still indexes them, even though there are no backlinks.
You or Shopify need to delete the new Disallow: /wpm@* statement from robots.txt in order to fix it.
How does one delete these from the robots.txt? I don't even think I have access to that on my Shopify plan.
Same here; we're seeing an increasing number of 404 errors for URLs with /wpm@ in Google Search Console.
Our robots.txt has Disallow: /wpm@*. Does this help or make the problem worse?
We're seeing hundreds of pages with /wpm@ URLs indexed in Google.
Agree that the problem is not fixed!
The 404s of the old URLs are a good thing.
Hi Greg, yes, agreed, but we are seeing a large number (hundreds) of these wpm URLs being indexed in Google. I've requested deletion in Google Search Console.
Yep, and we're looking into that 🙂
OK, thanks for the update; good to see it's being worked on 🙂
We're seeing similar issues here…
A large spike overnight in indexed "wpm@" pages, as well as a large spike of "wpm@" pages GSC identifies as "Indexed, though blocked by robots.txt".
We confirmed that our store's robots.txt has the Disallow: /wpm@* rule added recently by Shopify.
Thus, it's confusing why the pages are not simply blocked by robots.txt.
Hopefully a fix will be forthcoming from Shopify, including a solution that halts generation of these blank wpm@ pages in the first place.
@Italia-Straps that is exactly what we're investigating: why the new URL is not being accurately blocked even with robots.txt.
More when I have it.
It's also unclear to me why Google still indexes those pages. Usually Google does this when there are backlinks pointing to a page, but that is not the case here; right now there is just internal link juice being passed.
That said, robots.txt and noindex exist at the same time, which doesn't work properly, as the noindex is never read because of the robots.txt block.
The way to fix it for most people is to delete the robots.txt rule so Google can read the noindex.
Two things still unclear:
I'm facing a new problem now 😞
When I follow a link from the keywords I searched on Google, an extension starting with "?constraint=..." gets appended to the existing link, and an "Item not found. Use fewer filters or remove all" screen appears.
Clicking "remove all" fixes it, but which customer is going to deal with this? 😞
https://dededus.net/collections/sanacryl-kuvet-ve-jakuziler?constraint=-ay-bebek-jakuzi
We're currently pushing a fix to remove the block in robots.txt so Google can see the noindex. This means Google will be crawling these pages again, but they should not be indexed.
I hope it's the same fix they did with the Web Pixels Manager a few months ago. That one worked right away.
When you remove the block in the robots.txt, I should remove the code I added, right?
When, approximately, will the Shopify engineers be able to fix this?
I discovered this issue today, and the bug is really affecting my DR rating, as we're 100% reliant on SEO search.
The bug is also very aggressive, creating over 89,000 URLs with 404s and causing other GSC issues too.
March 19th update: We are seeing similar issues to other stores.
Another spike in indexed "wpm@" pages in GSC.
For our store, we can confirm that Shopify has removed the Disallow: /wpm@* rule from robots.txt.
Some wpm@ pages are now starting to appear in the "Excluded by 'noindex' tag" section in GSC. Hopefully that is good news.
Maybe Shopify or @Greg-Bernhardt can confirm that a platform-level noindex solution is in place?
Thanks
Can confirm the same for our store as well: more /wpm@ URLs in the index (over 1,500 now), the disallow rule removed from robots.txt, and those pages marked with a noindex tag, so hopefully everything is going to be OK soon. But on top of those /wpm@ URLs, all variations of the /cart URL are in the index now too -__-
There's also another bug that is probably somehow connected to this issue: a giant analytics discrepancy (unique visitors/sessions) that started in about the same timeframe as this bug. Do you see anything like that with your stores?
About 13 of my products have this page-indexing error (Disallow: /wpm@*) when crawled by Google. I guess we wait for Shopify to resolve it.
What's the status of this?
In our case it appears to be clearing out of the Google Search Console reports, but ranking has dropped considerably since December.
But the pages are blank for wpm@.
Indexed pages are spiking.
The spikes should stop the day the fix is implemented, but Google Search Console usually lags a few days behind. This is why it seems like more and more pages are being indexed.
You can check whether the fix is working: go to yourdomain.com/robots.txt and check whether Disallow: /wpm@ exists. If it no longer exists, everything is correct.
Then go into Search Console, Index -> Pages -> "Indexed, though blocked by robots.txt", and start a validation there. You'll probably need several runs to clear all the pages, but this is the fastest way to get them out of the index. Alternatively, you can wait for Google to crawl these pages again, but this might take some time.
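If you would rather script that robots.txt check than eyeball it, here is a minimal sketch, again assuming Node 18+ and with example.com standing in for your store's domain:

// check-robots.mjs -- fetch robots.txt and report whether the /wpm@
// Disallow rule is still present (run with: node check-robots.mjs,
// Node 18+ for the built-in fetch).
const res = await fetch("https://example.com/robots.txt"); // your store's domain here
const body = await res.text();
const blocked = body
  .split("\n")
  .some((line) => line.trim().startsWith("Disallow: /wpm@"));
console.log(blocked
  ? "Disallow: /wpm@ still present, so Google cannot read the noindex"
  : "Rule removed, so Google can crawl the pages and read the noindex header");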
Yes, it has been removed from robots.txt.
But I'm not sure if the noindex tag has been implemented, as these are getting indexed with every run, and secondly, no 404 page appears.
noindex is usually checked in the Dev Tools (F12), but in this case noindex is added via the X-Robots-Tag header. You can use this website to check: https://indexcheckr.com/noindex-checker
It says
@ahsonmkhan can you post that URL?
@ahsonmkhan that is your homepage and you want that indexable. Do you have an example of a pixel URL that is reporting as indexable?
Hi Greg,
I have one for you:
@SirMiha it is being served via the x-robots-tag response header, not a meta tag:
https://www.beta-wellness.com/wpm@0.0.239@1bfdbe36waf26f8b1p2c0f542dm9f61a9b0/sandbox/products/spaboosterseat
alt-svc: h3=":443"; ma=86400, h3-29=":443"; ma=86400
cf-cache-status: DYNAMIC
cf-ray: 7aaed3503d9c033f-ORD
content-encoding: gzip
content-language: de
content-security-policy: block-all-mixed-content; frame-ancestors 'self'; upgrade-insecure-requests;
content-type: text/html; charset=utf-8
date: Mon, 20 Mar 2023 15:00:33 GMT
etag: cacheable:c86788f50d0a3c07ccea75e061c4ed39
link: <https://cdn.shopify.com>; rel="preconnect", <https://cdn.shopify.com>; rel="preconnect"; crossorigin
nel: {"success_fraction":0.01,"report_to":"cf-nel","max_age":604800}
report-to: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=BknMdUVV6AcI2H96Kj2WC8pHqfbMEK9LBFmhlYgcuTUvlQgGqRWU%2FZsKx%2Bi2%2Ffyn2Kjzw%2BbEz2ARjyPIT%2BMocwI7oEddO%2FcPOtAwHennu%2BGdIdKqW1IfXcykNXmqOOv%2FlsvNUfJzMw%3D%3D"}],"group":"cf-nel","max_age":604800}
server: cloudflare
server-timing: processing;dur=16, db;dur=7, asn;desc="10796", edge;desc="ORD", country;desc="US", theme;desc="beta-store/AIRTABLE-LIVE", cfRequestDuration;dur=13137.000084
strict-transport-security: max-age=7889238
vary: Accept
x-alternate-cache-key: cacheable:4f074f8e018dc4a7161bbd860ecc9e2a
x-cache: miss
x-content-type-options: nosniff
x-dc: gcp-us-central1,gcp-us-central1,gcp-us-central1
x-download-options: noopen
x-frame-options: SAMEORIGIN
x-permitted-cross-domain-policies: none
x-request-id: 524029fa-3cba-4d8c-968a-0f61842a5b12
x-robots-tag: noindex, nofollow
x-shardid: 199
x-shopid: 53371273416
x-shopify-stage: production
x-sorting-hat-podid: 199
x-sorting-hat-shopid: 53371273416
x-storefront-renderer-rendered: 1
x-xss-protection: 1; mode=block
HTTP status: 200
Thank you very much for the quick reply... it works!
BG,
Miha
Hi @Greg-Bernhardt ,
Google Search Console still does not pick up the noindex tag. I have submitted the pages for a recheck, but GSC says the problem still persists... one example is this page:
https://www.beta-wellness.com/wpm@0.0.255@5dd7309bw0a4825d1pb4247666mb80b6589/sandbox/
I think our robots.txt (https://www.beta-wellness.com/robots.txt) is not the problem, because I have deleted the "wpm@" part from it. Could you maybe add the noindex tag as a meta tag as well? Please?
BG,
Miha
What I have learned from @Greg-Bernhardt is to remove our added Disallow: /wpm@* so that the noindex tag can work efficiently. It is slowly starting to work for us.
You mean from the robots.txt, right? I have done that 🙂 (beta-wellness.com/robots.txt), but it's still not working properly...
I think it's working, but it took some time.
What is the second screenshot showing? Still pages affected...
I just checked one of the pixel pages and it's not indexable. So hopefully this nightmare is over now. It will take a while for GSC to update. Hopefully no more issues.
It's also being served via the x-robots-tag response header. Check here:
https://site-analyzer.pro/services-seo/check-server-request/