Some of the pages I was ranking keywords for are being blocked, and every day more and more pages are being deindexed and reported under Excluded by ‘noindex’ tag.
I haven’t changed anything, but this started 15 days ago and I’m struggling to find the robots.txt in my theme.
Greetings! I am Gina from the flareAI app, helping Shopify merchants get $6 million+ in sales from Google Search, on autopilot.
It sounds like your website is being affected by the robots.txt file, which tells search engine robots which pages or sections of a website they may or may not crawl. If your pages are being blocked by the robots.txt file, it can result in a drop in traffic and rankings.
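If it helps, here is a minimal sketch (the store domain and paths are placeholders, adjust them for your shop) that uses Python’s urllib.robotparser to check whether specific URLs are blocked by your live robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Placeholder store domain - replace with your own shop's URL.
STORE = "https://your-store.myshopify.com"

parser = RobotFileParser()
parser.set_url(f"{STORE}/robots.txt")
parser.read()  # fetches and parses the live robots.txt

# Check a few URLs, e.g. ones Search Console reports as blocked.
for path in ["/", "/collections/all", "/cart", "/search?q=test"]:
    allowed = parser.can_fetch("Googlebot", f"{STORE}{path}")
    print(f"{path}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```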
If you are unable to find the robots.txt file in your theme, it’s possible that it was created or modified by a plugin or app installed on your website. You may want to check whether any SEO apps you have installed created or modified the file.
If you are still unable to locate the robots.txt file, you can try creating a new one in the root directory of your website. This can be done by creating a plain text file and naming it “robots.txt”. You can then add the necessary instructions for search engine robots to crawl or not crawl specific pages or sections of your website.
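For illustration only, a simple robots.txt could look like the following, where the disallowed paths are placeholders you would replace with the sections you actually want to keep out of crawling:

```
User-agent: *
Disallow: /private-section/
Disallow: /internal-search/

Sitemap: https://www.example.com/sitemap.xml
```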
It’s also important to check if any pages on your website have a “noindex” tag, which can also cause pages to be excluded from search engine results. You may want to review your website’s meta tags and ensure that none of them contain the “noindex” tag.
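As a rough sketch (the product URL is a placeholder and the pattern matching is deliberately simple), you could fetch one of the affected pages and look for a noindex directive in both the robots meta tag and the X-Robots-Tag response header:

```python
import re
import urllib.request

# Placeholder URL - replace with one of the affected pages from Search Console.
url = "https://your-store.myshopify.com/products/example-product"

request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(request) as response:
    x_robots = response.headers.get("X-Robots-Tag", "")
    html = response.read().decode("utf-8", errors="replace")

# Look for <meta name="robots" content="... noindex ..."> in the HTML.
meta_noindex = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    html,
    re.IGNORECASE,
)

print("X-Robots-Tag header:", x_robots or "(not set)")
print("noindex meta tag found:", bool(meta_noindex))
```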
Unless you have changed the robots.txt file, there is something else going on or you’re misinterpreting the Google Search Console report (I presume that’s where you’re getting your info from).
Hi @PrintNStuff You don’t need to do anything here, and there is nothing to fix, because this is not an error.
These URLs don’t need to be indexed in Google Search, and indexing them is bad practice, so by default Shopify excludes these unnecessary URL patterns in its robots.txt file. That way your main URLs get indexed in Google Search instead of many duplicate variants.
When Google Search Console inspects and crawls the whole site, it finds lots of URLs that are not meant to be indexed in Google Search, so it lists those URLs in the affected pages section. You don’t need to worry, your website is fully Google and SEO compatible.
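If you want to see exactly which paths your store’s generated robots.txt excludes, you can open https://your-store-domain/robots.txt in a browser, or fetch it with a few lines of Python (the domain below is a placeholder):

```python
import urllib.request

# Placeholder domain - replace with your own store's primary domain.
url = "https://your-store.myshopify.com/robots.txt"

# Print the generated robots.txt so you can see which paths are disallowed.
with urllib.request.urlopen(url) as response:
    print(response.read().decode("utf-8"))
```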