Why am I not able to:
1. Edit robots.txt
The current one is not SEO-friendly (please don't answer that I can exclude subpages by setting their robots meta tag to "noindex", because those subpages still appear in the sitemap).
Additionally, I can't exclude some spammy traffic or add more disallows.
2. Edit sitemap.xml
I want the sitemap to contain only collections and products; everything else is useless for my SEO.
3. Change anything in the checkout process.
Your fraud analysis is weak! The apps you recommended didn't work. I can't add any JS code or verify customers on platforms that tackle this problem with AI and machine learning.
To be honest, at this moment I feel that choosing Shopify was my worst decision ever.
I can't even filter traffic from specific IPs to block scammers, fraudsters, and botnets.
From my analysis, I spend a lot of money remarketing to bots!
You can't edit robots.txt, nor do you need to.
You can't edit sitemap.xml either, though you can cause items to be ignored using a metafield. Again, you don't really need to.
If you think SEO depends on either of these two things, you are mistaken about SEO. Google doesn't even need a sitemap to find your pages. Your content is far, far more important than what's in your sitemap file.
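For context, the metafield mechanism referred to here is, as far as I know, Shopify's documented `seo.hidden` metafield: setting it to 1 on a product, collection, page, or article both removes the item from the sitemap and signals search engines not to index it. A sketch of the request body, sent as a POST to the item's `metafields.json` endpoint in the REST Admin API (the exact type field name can vary by API version, so treat this as an outline rather than a guaranteed payload):

```json
{
  "metafield": {
    "namespace": "seo",
    "key": "hidden",
    "value": 1,
    "type": "number_integer"
  }
}
```

The same metafield can also be set without the API, through a metafields editor app or the bulk editor.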
Well, based on my knowledge of SEO, I totally disagree with you.
Setting a meta robots noindex tag doesn't work as efficiently or as quickly as robots.txt.
Finally, it's a big mistake, and totally wrong from an SEO perspective, to mark subpages with a meta robots noindex tag while still including them in the sitemap.
It's not logical to me.
My shop is specific: I can't create a long, unique description for each product, so I need to focus mostly on SEO and site structure.
Shopify limits the ability to edit the `robots.txt` file.
Although you can't edit the content of your store's robots.txt file, you should be aware of the content that it blocks from search engines. For example, the shopping cart page is blocked from search engines because you want customers to find your product pages before the cart page.
You will have to take it up with Shopify. The community cannot tell you how to access and edit it.
I can tell you from years of experience that it does not matter.
Google is smart enough. Any page where they find a meta noindex tag in the HTML, they will eventually omit from the index. Yes, it's not immediate, and it can take a long time for those pages to fade out, but it's also not going to go any faster by modifying robots.txt or sitemap.xml. At this point Google is way beyond relying on robots.txt and sitemap.xml to figure out where things are and what to do with them.
What I did on my site is, for pages that I don't want indexed, I set a metafield and then read that metafield in the theme to add the meta noindex tag. We do this for all page=2, page=3, etc. pages, as well as a large number of 'filtered' collection pages. Of course, it requires some theme modifications to add the logic that decides when to apply the noindex.
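A minimal sketch of that kind of theme logic, placed in the <head> of layout/theme.liquid (the metafield namespace and key `custom.noindex` are assumptions, and which object carries the metafield depends on the template, so adapt accordingly):

```liquid
{%- comment -%}
  Hypothetical sketch: emit a noindex tag when a page-level metafield
  is set, or when viewing paginated pages (page 2 and up).
  `current_page` is the standard Liquid global for the pagination page number.
{%- endcomment -%}
{%- assign noindex = false -%}
{%- if page.metafields.custom.noindex == 1 -%}
  {%- assign noindex = true -%}
{%- endif -%}
{%- if current_page > 1 -%}
  {%- assign noindex = true -%}
{%- endif -%}
{%- if noindex -%}
  <meta name="robots" content="noindex, follow">
{%- endif -%}
```

For filtered collection pages you would check `collection.metafields` (or inspect the filter parameters) instead of `page.metafields`; the branching shown here is just the skeleton.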
That aside, I can also tell you that even if you get your site all nice and clean, with only the pages you want in the index actually indexed, there is still a significant cap on how much that will influence your rankings. It's more important to have a clear link structure where the site is hierarchically divided up, and where specific pages are tightly focused on key topics and related keywords without overlapping each other where possible. It's important to have some good SEO text on collection pages, and it helps with products too. And it's also important to have some lengthy, high-quality, valuable content in your blog/article pages, which link to relevant subcategories.
You have to realize that Google is really only looking for one thing: value. The highest value of all pages on the web. One particular page of yours is going to be competing with ALL other topically relevant pages on the web for ranking position. HOW they go about measuring the value of your page is the entire purpose of their algorithm and the hundreds of metrics they look at. Having good, high-quality, linked-to, shared content is much more valuable than whether or not some pages show up in your sitemap. They are looking for signs of authority, expertise, and trustworthiness, and they are going to look significantly BEYOND your website to find those signs, such as other websites mentioning you, linking to you, and sharing your content. So if all you focus on is on-page SEO itself, it's simply not enough. We tackled SEO the same way for years and got nowhere, with constant traffic declines. It was not until we started churning out quality article pages and structuring the site better that we started to go up in the rankings.
Check out the articles by Brian Dean at backlinko.com; you'll find them very helpful.
As of today, June 21st, 2021, we have launched the ability to edit the robots.txt file to give merchants more control over the information that is crawled by search engines. You can learn more about how to edit your robots.txt file through our community post here.
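For anyone finding this thread later: the new mechanism works by adding a templates/robots.txt.liquid file to your theme. A minimal sketch that keeps Shopify's default rules and appends one extra Disallow for the wildcard user agent (the /collections/vendors path is purely an example, not a recommendation):

```liquid
{%- comment -%}
  templates/robots.txt.liquid
  Iterates over Shopify's default rule groups and adds one custom rule.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
  {{ rule }}
  {%- endfor %}
  {%- if group.user_agent.value == '*' %}
  Disallow: /collections/vendors
  {%- endif %}
  {%- if group.sitemap != blank %}
  {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Keeping the loop over `robots.default_groups` (rather than writing rules from scratch) preserves Shopify's defaults, so future changes to the platform's recommended rules are picked up automatically.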
Due to the age of this topic, I will be locking this thread. If you have any questions about the new feature, please do not hesitate to create a new post under our "Technical QA" board.