Blocked by robots.txt

ahmed2334
Visitor

[Attached screenshot: Screenshot 2024-10-09 at 11.27.48 PM.png]

The two red-marked lines are blocking most of the pages of my Shopify store. I have done a lot of research on robots.txt, and most sources say it is better to leave the file at its default rather than edit it. But in Google Search Console I can see almost 1,600 pages being blocked by robots.txt, and those pages are important for the website. I really want them to be indexed.
When I click on any of the blocked pages, I see it falls under one of these two red-marked lines:

Disallow: /collections/*sort_by*

Disallow: /*/collections/*sort_by*


Now what can I do to solve this issue?
Would it be a good idea to create a robots.txt template file in the Shopify code editor and allow these two patterns while keeping the rest of the robots.txt file disallowed as it is?

Or is there any other way to solve this?


Reply 1 (1)
ahmed2334

Hello Mayor,

Thanks for your suggestions.

 

First of all, I would like to let you know that I have created a custom robots.txt.liquid template. Rather than taking the current robots.txt file and removing the two red-marked Disallow lines, I kept Shopify's default template intact and added rules to Allow the specific pages I identified. This way I still follow Shopify's default rules while keeping those pages crawlable.

For example, in the Shopify custom robots.txt template, keeping the default rules intact, I added rules like:

Allow: /collections/*sort_by*=price-ascending


I have seen that most of the blocked pages fall into this category; the others use created-ascending, title-ascending, or manual as the sort_by value.
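For anyone wanting to sanity-check which URLs end up allowed once an Allow rule is layered over the default Disallow lines, here is a minimal Python sketch of Google's precedence rule (the most specific, i.e. longest, matching pattern wins, and Allow wins ties). The `rule_matches` and `is_allowed` helpers are illustrative only, not a Shopify or Google API; note that Python's built-in `urllib.robotparser` does not understand `*` wildcards, so it cannot be used for this check directly.

```python
import re

def rule_matches(pattern, path):
    # Translate robots.txt wildcards: '*' matches any sequence of
    # characters, and a trailing '$' anchors the end of the URL path.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

def is_allowed(path, rules):
    # rules: list of ("Allow" | "Disallow", pattern) pairs.
    # Googlebot applies the longest matching pattern; on a length tie,
    # the Allow rule wins. No match at all means the URL is allowed.
    best = ("Allow", "")
    for directive, pattern in rules:
        if rule_matches(pattern, path):
            if len(pattern) > len(best[1]) or (
                len(pattern) == len(best[1]) and directive == "Allow"
            ):
                best = (directive, pattern)
    return best[0] == "Allow"

# The two default Disallow lines from the thread, plus the added Allow rule.
rules = [
    ("Disallow", "/collections/*sort_by*"),
    ("Disallow", "/*/collections/*sort_by*"),
    ("Allow", "/collections/*sort_by*=price-ascending"),
]

print(is_allowed("/collections/all?sort_by=price-ascending", rules))  # True
print(is_allowed("/collections/all?sort_by=best-selling", rules))     # False
```

The Allow pattern is longer (more specific) than either Disallow pattern, so it wins for the price-ascending URLs while everything else with sort_by stays blocked. The robots.txt Tester in Google Search Console performs this same check against the live file, which is the authoritative way to confirm it.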


After changing the rules, I resubmitted the sitemap to Google.


Let’s see how it goes and what it brings for the website. I will keep a close eye on the site’s performance.


Please let me know if I am doing the right thing, or if there is anything else I can do.


Thanks