Couple questions about Shopify

New Member
2 0 0

Why am I not able to:

1. Edit robots.txt

The current one is not SEO-friendly (please don't answer that I can exclude some subpages by setting their robots meta tag to "noindex", because those subpages are still in the sitemap).

Additionally, I can't exclude some spammy traffic or add more disallows.


2. Edit sitemap.xml

I want the sitemap to contain only collections and products; other entries are useless for my SEO.


3. Change anything in the checkout process.

Your fraud analysis is weak! The apps you recommended didn't work. I can't add any JS code or verify customers on platforms that tackle this problem using AI and machine learning.


To be honest... at this moment I feel that choosing Shopify was my worst decision ever.

I can't even filter traffic from specific IPs to block scammers, fraudsters, and botnets.

From my analysis, I'm spending a lot of money on remarketing to bots - sic!


Great job!

Shopify Partner
248 9 63

You can't edit robots.txt, nor do you need to.

You can't edit sitemap.xml, though you can cause items to be ignored using a metafield. Again, you don't really need to.

If you think SEO depends on either of these two things, you are mistaken about SEO. Google doesn't even need a sitemap to find your pages. Your content is far, far more important than what's in your sitemap file.

New Member
2 0 0

Well... based on my knowledge of SEO, I totally disagree with you.


1. robots.txt.

  • I need to de-index a lot of already-indexed subpages (e.g. because of the flawed Shopify collection page construction, where the canonical is wrong [?page=2 points to itself, a typical mistake], search engines index pages 2, 3, 4, 5 with the same title, main content, and meta description). Yes, you can indicate in Google Search Console that ?page= is a pagination parameter, but that takes months...
  • I need to de-index all /pages/ because, by using them as additional tabs on product pages, I'm duplicating the same content (creating a lot of weak subpages)
  • I get a bunch of spammy traffic and can't stop it because I can't add more disallows (the shop is slow and unstable, and I get a lot of fraudulent orders)

As for setting meta robots to noindex: it doesn't work as efficiently or as quickly as robots.txt.


2. sitemap.xml

  • I need to de-index all /pages/ because, by using them as additional tabs on product pages, I'm duplicating the same content (weak subpages),
  • I can't choose which products will be indexed (e.g. I don't want to index variants or very similar products - duplicate content).

Finally, it's a big mistake, and totally wrong from an SEO perspective, to mark subpages with a meta robots noindex tag while still including them in the sitemap.

It's not logical to me.


My shop is specific: I can't write a long, unique product description for each product, so I need to focus mostly on SEO and site structure.

19 3 3

Shopify limits the ability to edit the `robots.txt` file.


Although you can't edit the content of your store's robots.txt file, you should be aware of the content that it blocks from search engines. For example, the shopping cart page is blocked from search engines because you want customers to find your product pages before the cart page.
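For illustration, Shopify's default robots.txt includes rules along these lines (abbreviated; the exact rule set may vary by store and over time):

```text
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Sitemap: https://your-store.myshopify.com/sitemap.xml
```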


You will have to take it up with Shopify. The community cannot tell you how to access and edit it.

Shopify Partner
248 9 63

I can tell you from years of experience that it does not matter.

Google is smart enough. Any page where they find a meta noindex tag in the HTML, they will omit from the index, eventually. Yes, it's not immediate, and it can take a long time for pages to fade out, but it's also not going to go any faster by modifying robots.txt or sitemap.xml. At this point Google is way beyond using robots.txt and sitemap.xml to figure out where things are and what to do with them.

What I did on my site is, for pages that I don't want indexed, I put a metafield and then pull from the metafield in the theme and add the meta noindex tag. We do this for all page=2, page=3, etc. pages as well as a large number of 'filtered' collection pages. But of course it requires some theme modifications to add some logic to decide when to apply the noindex.
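As a sketch of the metafield approach described above, a snippet like the following could go in the theme's head. The metafield namespace/key (seo.hidden) and the exact condition are assumptions; your own theme's metafield naming and pagination handling may differ.

```liquid
{%- comment -%}
  Emit a noindex tag when the page's seo.hidden metafield is set,
  or on paginated collection pages (page 2 and beyond).
  Namespace/key "seo.hidden" is an assumed convention.
{%- endcomment -%}
{%- if page.metafields.seo.hidden == 1 or current_page > 1 -%}
  <meta name="robots" content="noindex, follow">
{%- endif -%}
```

For collection or product templates, the lookup would use `collection.metafields` or `product.metafields` instead of `page.metafields`.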

That aside, I can also tell you that even if you get your site all nice and clean, with only the pages you want indexed actually in the index, there is still a significant cap on how much that will influence your rankings. It's more important to have a good, clear link structure where the site is hierarchically divided up, and where specific pages are very focused on key topics and related keywords, not overlapping with each other where possible. It's important to have some good SEO text on collection pages, and it helps with products too. And it's also important to have some lengthy, high-quality, valuable content in your blogs/articles pages, which link to relevant subcategories.

You have to realize that Google really is only looking for one thing: value. The highest value of all pages on the web. And one particular page of yours is going to be competing with ALL other topically relevant pages on the web to vie for ranking position. HOW they go about measuring the value of your page is the entire purpose of their algorithm and the hundreds of metrics that they look at. Having good, high-quality, linked-to, shared content is much more valuable than whether or not some pages are showing up in your sitemap. They are looking for signs of authority, expertise, and trustworthiness. And they are going to be looking significantly BEYOND your website to find those signs, such as other websites mentioning and linking to you and sharing your stuff. So if all you focus on is the on-page SEO of the website itself, it's simply not enough. We tackled SEO in the same way for years and got nowhere, with constant traffic declines. It was not until we started churning out quality article pages and structuring the site better that we started to go up in rankings.

Check out the articles by Brian Dean; you'll find them very helpful.

Community Moderator
2570 337 524


As of today, June 21st, 2021, we have launched the ability to edit the robots.txt file to give merchants more control over the information that is crawled by search engines. You can learn more about how to edit your robots.txt file through our community post here
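The new feature works by adding a robots.txt.liquid template to the theme. A minimal sketch that keeps Shopify's default groups and appends one extra rule to the main crawler group might look like the following; the extra Disallow path is an example placeholder, not a recommendation.

```liquid
{%- comment -%} robots.txt.liquid: reuse the default groups, add one rule {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {% for rule in group.rules %}
    {{- rule }}
  {% endfor %}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /example-path/' }}
  {%- endif -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```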

Due to the age of the topic, I will be locking this thread. If you have any questions about the new feature, please do not hesitate to create a new post under our "Technical Q&A" board.

Trevor | Community Moderator @ Shopify 
 - Was my reply helpful? Click Like to let me know! 
 - Was your question answered? Mark it as an Accepted Solution
 - To learn more visit the Shopify Help Center or the Shopify Blog