Which pages should be disallowed in robots.txt? Do we have to manually create robots.txt in Shopify?

Does anybody here maintain a robots.txt? In Shopify, do we have to manually submit robots.txt through Google Search Console, or is it done automatically by Shopify? Should I disallow my policy pages in robots.txt, and how do I disallow the admin login page? Also, which other internal files should we disallow, and what about the cart, checkout, and thank-you pages?

Instead of mentioning them in robots.txt, we can exclude them individually by adding a noindex tag to each specific page.
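For reference, a noindex directive is a robots meta tag placed in the page's `<head>` (in a Shopify theme this would typically go through a template such as theme.liquid, possibly wrapped in a condition for the pages you want excluded):

```html
<!-- Tells search engine crawlers not to include this page in their index -->
<meta name="robots" content="noindex">
```

Note that for noindex to work, the page must remain crawlable: if robots.txt blocks the URL, the crawler never sees the meta tag.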


I understand that, but I don't know how to disallow the admin pages, and I didn't know which other pages I should disallow.

If you have to ask which pages to disallow, it's best not to touch your robots.txt file at all.
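For context, Shopify generates a robots.txt file automatically, and its default rules already block the sensitive paths asked about above (admin, cart, checkout, account pages). A rough excerpt looks like the following; this is illustrative, and the exact defaults can vary by store and change over time:

```text
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /account
```

So in most cases there is nothing to create or submit manually.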

I only disallowed my policy pages, because I thought: why should these be indexed if we don't want them to appear in search results?
After that, I checked my robots.txt file on the web, saved it, and also submitted it in Google Search Console. The problem now is that Search Console notifies me that my product pages are not indexed, and it shows the URLs without www; i.e., if a page is www.mysite.com/product1, it reports that mysite.com/product1 is not indexed and has a redirect. What seems to be the problem here? Can you tell me about it?

This is why playing with the robots.txt file is dangerous. If you can share the file showing the rules, someone might be able to help.
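While waiting for someone to review the file, you can also sanity-check your own rules locally with Python's standard-library robots.txt parser. The rules and URLs below are made up for illustration, not taken from any real store:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring what a store's robots.txt might contain
rules = """
User-agent: *
Disallow: /admin
Disallow: /cart
"""

rp = RobotFileParser()
# parse() accepts the file's lines directly, so no network fetch is needed
rp.parse(rules.splitlines())

# A product page should be crawlable; admin and cart should not be
print(rp.can_fetch("*", "https://example.com/products/widget"))  # True
print(rp.can_fetch("*", "https://example.com/admin"))            # False
```

Running this against your actual rules is a quick way to confirm you haven't accidentally blocked product pages before resubmitting anything in Search Console. (The www vs. non-www redirect notice is a separate, usually harmless canonicalization issue: Search Console is reporting that the non-www variant redirects to your preferred www domain.)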


I agree with you!