Hello, I've been dealing with Google Merchant Center suspensions for a while now. I fixed my Google misrepresentation violation, but a week later, after fixing my "website needs approval" violation, the misrepresentation violation came back.
Google's poor support once again can't pinpoint the exact problem, but they mentioned it could have to do with my robots.txt file blocking Google from crawling my site. They advised me to delete the following:
"Also, in order for us to access your whole site, ensure that your robots.txt file allows both user-agents ‘Googlebot’ (used for landing pages) and ‘Googlebot-image’ (used for images) to crawl your full site.
Please remove this so that the crawlers may access your entire site."
# Google adsbot ignores robots.txt unless specifically named!
User-agent: adsbot-google
Disallow: /checkouts/
Disallow: /checkout
Disallow: /carts
Disallow: /orders
Disallow: /61037412572/checkouts
Disallow: /61037412572/orders
Disallow: /*?*oseid=*
Disallow: /*preview_theme_id*
Disallow: /*preview_script_id*
Disallow: /cdn/wpm/*.js
Is this accurate? I've read elsewhere that I shouldn't delete the default robots.txt rules that Shopify generates. They also advised me to do the following:
"To comply with the requirements, please set up your robots.txt exactly as in the template provided below:"
User-agent: Googlebot
Disallow:
User-agent: Googlebot-image
Disallow:
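For what it's worth, the adsbot-google block and an explicit allow for Googlebot can coexist in one file, because each crawler only obeys the user-agent group that matches it most specifically, so deleting the Shopify defaults shouldn't be necessary. A quick sketch that checks this with Python's standard urllib.robotparser (the store domain and paths here are made-up placeholders, and the embedded robots.txt is a shortened combination of the Shopify rules above plus the template Google's support asked for):

```python
from urllib.robotparser import RobotFileParser

# A merged robots.txt: Shopify's adsbot-google restrictions kept as-is,
# with the Googlebot / Googlebot-image groups from Google's template added.
ROBOTS_TXT = """\
User-agent: adsbot-google
Disallow: /checkouts/
Disallow: /checkout

User-agent: Googlebot
Disallow:

User-agent: Googlebot-image
Disallow:
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot is allowed to fetch a (hypothetical) product landing page...
print(parser.can_fetch("Googlebot", "https://example.com/products/widget"))        # True

# ...Googlebot-image is allowed to fetch a (hypothetical) image URL...
print(parser.can_fetch("Googlebot-image", "https://example.com/cdn/shop/img.jpg"))  # True

# ...while AdsBot is still kept out of checkout by its own group.
print(parser.can_fetch("adsbot-google", "https://example.com/checkout"))            # False
```

On Shopify specifically, the generated robots.txt can be customized through a robots.txt.liquid theme template rather than deleted outright, which is how you would keep the default adsbot-google rules while adding the Googlebot groups Google asked for.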