We’ve been trying to troubleshoot some issues with Google Merchant Center over the past few days: product mobile and desktop pages are not crawlable by Googlebot or Googlebot-Image. This results in products being disapproved in Google Merchant Center, which means we cannot advertise them (see the example screenshot below).
I know we can’t edit the robots.txt file, though I would hope the internal team at Shopify keeps the file updated in line with Google Merchant Center’s guidelines.
Is there anything we can do on our side to fix this issue of mobile and desktop pages not being crawlable by Googlebot or Googlebot-Image?
The products don’t seem to become active again on their own after a few days, either. A consistent ~10% of our products sit in a disapproved state, and this is affecting a substantial volume in our Shopping ads campaigns.
I have reached out to Shopify support, but after 3 emails and 2 chat conversations, all they have been able to tell me is that the robots.txt file cannot be edited (which I already knew).
I’ve just sent them an email explaining the issue of product pages not being crawlable, and asked for the issue to be investigated.
Is there any way you know of to get issues escalated to Tier 2 support at Shopify?
Just a quick update about the robots.txt file that was referenced in this post. As of today, June 21st, 2021, we have launched the ability to edit the robots.txt file to give merchants more control over the information that is crawled by search engines. You can learn more about how to edit your robots.txt file through our community post here.
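To give a rough idea of what editing looks like (the community post linked above is the authoritative guide, so treat this as a sketch), the file is managed through a templates/robots.txt.liquid theme template that exposes a robots Liquid object. A template that simply reproduces the stock file looks something like this:

{% comment %} templates/robots.txt.liquid: re-renders Shopify's default rules {% endcomment %}
{% for group in robots.default.groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}

Customizations are made by adjusting this loop rather than replacing the whole file, which keeps the defaults intact as they evolve.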
If you have any questions about the new feature, please do not hesitate to create a new post under our “Technical QA” board.
We updated our robots.txt file in line with Google’s documentation here and here.
This resolved our Merchant Center issues, and also resolved our Search Console issues (our domain was not being crawled by Google; in Google Search Console we had the warning “Indexed, though blocked by robots.txt”).
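In case it saves someone else some digging: Shopify’s docs describe a pattern for dropping one of the default rules from templates/robots.txt.liquid using an unless guard inside the rules loop. A sketch follows (the /example-blocked-path value is a placeholder; which rule, if any, needs removing will differ per store):

{% for group in robots.default.groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {% comment %} Skip one specific default Disallow (placeholder path) {% endcomment %}
    {%- unless rule.directive == 'Disallow' and rule.value == '/example-blocked-path' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}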
I’m reaching out for some advice regarding an issue we’re facing with our Shopify store in Google Merchant Center. We have been asked to update our robots.txt file so that the “Googlebot” and “Googlebot-Image” user-agents can crawl our site. However, on reviewing our robots.txt file, we don’t appear to be blocking either of these user-agents.
Has anyone else encountered a similar issue? I’m trying to figure out if there’s a default setting in Shopify that might be causing this problem or if I’m missing something in the robots.txt configuration. Any insights or experiences you can share would be extremely helpful.
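For context, Shopify’s stock robots.txt applies its Disallow rules under User-agent: *, with no Googlebot-specific group at all, which is why the file can look innocent when you search it for “Googlebot”. From memory it resembles the following (treat this as illustrative; the exact entries vary by store and over time):

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /orders
Disallow: /search
Sitemap: https://your-store.example/sitemap.xml

Because there is no explicit Googlebot or Googlebot-Image group, those crawlers fall back to the * group and inherit its Disallows, none of which should touch normal product pages.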
We’ve received a suggestion from Google Ads to modify our robots.txt for a full-site crawl, as follows:
User-agent: Googlebot
Disallow:
User-agent: Googlebot-Image
Disallow:
This differs from your previous advice. Your thoughts on this change would be valuable.
Thank you for your suggestion to block the admin URL. I agree with this approach for enhanced security and will also consider adding Storebot-Google as you recommended. I’ll ensure these changes are implemented promptly.
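For anyone following along, here is a sketch of how those additions might look in templates/robots.txt.liquid: keep the default loop intact and append the extra groups as literal text (the group names follow Google’s published crawler list; the exact rules below, such as blocking Storebot-Google only from /admin, are placeholder choices rather than a recommendation):

{% for group in robots.default.groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}

User-agent: Googlebot
Disallow:

User-agent: Googlebot-Image
Disallow:

User-agent: Storebot-Google
Disallow: /admin

One caveat: a crawler obeys only the most specific group that matches it, so an explicit Googlebot group with an empty Disallow overrides every default * rule for that bot, including Disallow: /admin. If you want Googlebot kept out of the admin URL as well, repeat that rule inside its own group.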