How can I edit robots.txt

New Member
1 0 1

I need to edit my robots.txt file because the ""  page is not getting any link juice. How do I modify the robots file?


Replies 92 (92)
New Member
1 0 1

Don's reply is unbelievable. Don't listen to him; this guy has literally zero knowledge when it comes to optimizing for SEO. A sitemap tells Google which pages are important on your site. Believe it or not, it's very expensive for Google to crawl every page on the web, so a sitemap helps with crawl budget on 1,000+ page websites. A sitemap DOES NOT STOP Google from crawling pages on your site. Eventually, it will still crawl and index pages you don't want Google to see.
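For anyone following along, a sitemap is just an XML list of the pages you want Google to prioritise. A minimal sketch (all URLs here are placeholders, not from any real store):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch: lists pages you consider important.
     It hints at crawl priority; it does NOT block anything. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/collections/best-sellers</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/products/flagship-product</loc>
  </url>
</urlset>
```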

28 0 3

Still an issue, still no solutions or suggestions, still nothing but confrontational responses from Shopify experts, when there is any response at all of course.
New Member
1 0 0

This is my website; half of my pages have been crawled but not yet indexed. I don't know what the problem is. I've tried everything:

Changed meta tags
Submitted my website to search engine submission sites
Shortened URLs

This problem has been going on for more than a month.

8 0 5

No problem - Just pay Shopify $50 extra each month for a 'Premium' account. FFS, scamsville.

8 0 5

PM me for how

7 0 2

Hey Grumble - I am desperate for a solution to this. If you have any insights, could you please hit me up at  Thanks!

New Member
2 0 5

I felt compelled to add an update to this post because it backs up a post I have previously submitted to this thread.

For clarity, I work in ecommerce and specialise in commercial optimisation and growth strategies. I have been an ecommerce consultant for over 25 years and support the delivery of an estimated £100m per annum across a number of verticals. In case you hadn't realised, I think I know a bit about SEO too.

What I am about to share will astound many reading this post. It acutely demonstrates Shopify's ignorance, total lack of understanding and downright negligence when it comes to Advanced SEO provision on their platform.

Yes, I appreciate that is a damning indictment of Shopify, one which may even get this post removed and me banned but as I am about to explain, it's completely justified.

The information I am going to share to back up why Shopify are deserving of such criticism is provided by some little-known company that none of us have probably heard of and should clearly not be taking any advice from when it comes to SEO. Yeah, you guessed right. Google!

Unsurprisingly, the information which highlights how very important robots.txt directives are is nestled in the "Advanced SEO" section of their Search Central Documentation.

Shockingly for a multi-billion-dollar company, this is clearly a part of the Internet Shopify have never properly familiarised themselves with. If they had, they might have realised the significance of Google and SEO best practice.

So, let's scrutinise exactly what Google expect us to do with a robots.txt file, and let's do so by breaking it down in their own words:

Block crawling of URLs that shouldn't be indexed. Some pages might be important to users, but shouldn't appear in Search results. For example, infinite scrolling pages that duplicate information on linked pages, or differently sorted versions of the same page. If you can't consolidate them as described in the first bullet, block these unimportant (for search) pages using robots.txt or the URL Parameters tool (for duplicate content reached by URL parameters). Don't use noindex, as Google will still request, but then drop the page when it sees the noindex tag, wasting crawling time. Don't use robots.txt to temporarily free up crawl budget for other pages; use robots.txt to block pages or resources that you think that we shouldn't crawl at all. Google won't shift this freed-up crawl budget to other pages unless Google is already hitting your site's serving limit.
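Google's guidance above can be sketched as robots.txt directives. To be clear, the paths and parameter names below are hypothetical examples, not rules for any specific store:

```
# Sketch only: block crawling of differently-sorted duplicates
# so Google never requests them (unlike noindex, which still
# requires a fetch before the page is dropped).
User-agent: *
Disallow: /*?sort_by=
Disallow: /*?view=
```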

So, let's step through that again and compare what Google expect us to do for good SEO with what Shopify are clearly advising in this thread, advice reinforced by the lack of vital SEO support on their platform.

  • Google say : Don't use noindex on pages you don't want crawled. 
  • Shopify say : Use an on-page noindex via meta robots directives - this is all you're getting.

  • Google say : Use robots.txt to block pages. Avoid wasting crawl budget
  • Shopify say : We don't care what Google say. No editing of robots.txt - Just waste crawl budget.
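To make the contrast concrete, the on-page directive Shopify does support looks like the snippet below; note that Google must still crawl the page to read the tag, which is exactly the crawl-budget waste Google warns about:

```html
<!-- On-page noindex: the crawler still has to fetch this page
     before discovering it should be dropped from the index. -->
<meta name="robots" content="noindex">
```

A robots.txt Disallow, by contrast, prevents the request from being made at all.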

So, here we have it. In a nutshell, utter incompetence on Shopify's part.

The link for reference is here
To be clear, I know this optimisation technique applies to all sites regardless of size. I have proven it to be the case. It doesn't just work well on massive sites; it's simply more important and effective on them.

What we have here is corporate negligence. The lack of a feature for merchants to edit robots.txt files is commercially damaging. It will be costing all such merchants lost optimisation, hindered SEO and search visibility, and ultimately revenue. Probably lots of it.

By not allowing Google to efficiently utilise crawl budget as recommended above with properly targeted and optimised robots.txt files, Shopify is quite literally holding back the search performance and search visibility of potentially tens of thousands of merchant websites.

What's more scary is that by holding all of these sites back in this way, it's probably costing Shopify revenue by hindering the growth of such merchants. It's madness in my opinion.

If you're an investor in Shopify, your investment is simply not growing at the rate it could until this is fixed.
If you're a developer at Shopify, maybe flag this one to someone higher up the chain. It's kind of important to listen to Google.
If you're a merchant on Shopify, be aware that Shopify don't care what Google advise to make your website rank well.
If you're considering Shopify as a platform, maybe consider another.
If you advise on SEO at Shopify, maybe quit and do something else you're good at.

One last point. For the record, I have actually proven that this is an issue with a real world test scenario. I have a Shopify website. The crawl rate is dreadful. If I could optimise robots.txt as I would like and block some wasteful query parameters, the efficiency would shoot up and my site would be crawled way more efficiently, more frequently and would then rank better. The lack of efficiency in crawl budget is definitely costing me in hindered growth and revenue. It's taking way too long for Google to re-crawl important pages on my site. If Shopify want to reach out for more information and supporting data, I'd be happy to share this.
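A sketch of the kind of rules being described, assuming hypothetical wasteful parameters (the actual parameter names worth blocking would depend on the store in question):

```
# Hypothetical: stop crawlers requesting parameterised duplicates
# of pages whose canonical versions are already crawled.
User-agent: *
Disallow: /*?variant=
Disallow: /*?filter=
```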

Rather than dismissing people on this thread, and even what Google advocate, it's maybe time for Shopify to demonstrate that you care about merchant websites, optimisation and growth, and fix the issue by making robots.txt accessible and editable for advanced users who quite literally need this feature to drive the growth of their websites on your platform.

32 0 15

Hey Grumble,

I would love to hear a solution to this issue. You previously wrote to pay Shopify $50 for premium, but I'm sure that was pure sarcasm. They have no right to block the edits, because this goes against what Google says to do. If I could leave this platform, I would. Anyway, you also wrote to DM you, so please email me at  I would really appreciate knowing how.

New Member
1 0 0

Having the same issue now; robots.txt seems to be blocking our pages.


Will update with our findings.

32 0 15

We have a right to edit the robots.txt if we choose to.  That is the point.