Hi, my Google Ads ad was disapproved for the following reason: “Destination not working”. They emailed me back and asked me to edit robots.txt.
This is their email:
As per checking, we found that your landing page https://www.shapelyne.com is still not working because of the robots.txt file on your web server. The robots.txt file controls whether Google's crawlers are allowed to access your site, and it can usually be found in the root directory of the web server (e.g. http://www.example.com/robots.txt).
Furthermore, in order for us to access the site as a whole, the robots.txt file must allow the user agents ‘Googlebot’ (used for pages) and ‘Googlebot-image’ (used for images) to crawl your site. You can do this by making the following changes to the robots.txt file:
User-agent: Googlebot
Disallow:
User-agent: Googlebot-image
Disallow:
So I added it at the end of my robots.txt.
But it’s still rejected. Did I do it right? We use Shopify as our ecommerce platform, so the file I edited is the robots.txt.liquid template. This is what it looks like now:
{%- comment -%}
Caution! Please read https://help.shopify.com/en/manual/promoting-marketing/seo/editing-robots-txt before proceeding to make changes to this file.
{% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent -}}
  {% for rule in group.rules %}
    {{- rule -}}
  {% endfor %}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
User-agent: Googlebot
Disallow:
User-agent: Googlebot-image
Disallow:
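If I understand the Liquid template correctly, the live file at https://www.shapelyne.com/robots.txt should now end with exactly the lines Google asked for, rendered right after Shopify’s default groups. Roughly something like this (the default rules shown here are only an example of what Shopify generates, not copied from my store):

# ...Shopify’s default groups render above this point, for example:
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Sitemap: https://www.shapelyne.com/sitemap.xml

User-agent: Googlebot
Disallow:

User-agent: Googlebot-image
Disallow:

Is that what I should be seeing when I open the URL, or did I miss something?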