I am using SEO software, and it gave me a list of 262 of my pages that are "restricted from indexing." Under robot instructions, it says "partially disallow," and the restriction method shown is robots.txt.
I am new to this level of SEO. I've worked on titles, meta descriptions, etc., but never anything pertaining to this. I'm assuming that having that many pages that aren't being indexed could be a problem. One of the pages I looked at is normally one of my best-selling products on another platform, which makes me a little concerned.
How can I fix this?
I am using SEO PowerSuite, and it's the WebSite Auditor that is giving me this information.
Here are a few links to items that they are saying are not being indexed with the reasons that I listed in my original post:
Altogether, they are showing 262 pages not being indexed. Some of them do seem to be pages from my previous site/platform (my site used to be on IndieMade), because I can see the 404 message come up before it goes to my homepage.
I'm hoping there's an easy way for me to fix these, and get all of my pages indexed.
Looking forward to your insight!
The robots.txt file is what controls how bots from Google, etc., crawl your site. Depending on how it is written, it allows or disallows the search engine bots from various pages on your website. You have to set the conditions in the robots.txt file to allow your pages to be seen by the bots, and then they will be indexed. So there is something in the robots.txt file that is not allowing that. I just signed up with Shopify, so I am not familiar with how you access/write your robots.txt file here. At Volusion it was pretty straightforward to create one and upload it into your website's code. Not sure how to do it at Shopify.
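To make that concrete, here is a hypothetical robots.txt sketch (the paths and domain are made up for illustration, not taken from any actual store):

```text
# Applies to all crawlers
User-agent: *
# Block crawling of these sections
Disallow: /checkout/
Disallow: /cart/
# Everything else may be crawled
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

A "partially disallow" flag in an audit tool generally means some Disallow rule like the ones above matches the page's URL.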
Thanks for your reply.
Unfortunately, from what I understand, Shopify shop owners are not allowed access, at all, to our robots.txt files. I also have a message in to the support team regarding this matter, and I'm hoping that they will fix it for me. I haven't heard back yet.
Good luck with your new shop!
Yes, I just found that out too. Here is an app for nofollow/noindex, which does what the robots.txt does. The app will let you make the choices for indexing the pages you want indexed by search bots.
Cost item, but only $12 per year.
Best of luck!
So I just installed this app and tried to figure it out. From what I can tell, you can set certain things to nofollow or noindex, but I cannot change the fact that my pages are not being indexed because of how the robots.txt file is written. The app doesn't let you change a URL that is already noindexed into one that can be indexed.
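For context, apps like this typically work by injecting a robots meta tag into a page's HTML head, which is a separate mechanism from robots.txt. A hypothetical example of what such a tag looks like:

```html
<!-- Tells crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Note that this tag only takes effect if the crawler is allowed to fetch the page in the first place; if robots.txt blocks a URL from being crawled, the bot never sees the meta tag at all, which would explain why the app can't override the robots.txt restrictions.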
Just wondering if I am missing something. I can't imagine that this is an unsolvable problem, because there have to be many, many successful websites that are indexed by Google and use Shopify.
@Posh. Whatever is being restricted in the robots.txt file is for your benefit.
The software you're using is likely just giving false positives and not considering that some URLs on the site would cause duplicate content issues if indexed. The restrictions in the robots.txt file help you avoid those issues by blocking the duplicates. The main URL isn't blocked, so you're not missing out on any Google juice here.
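As an illustration of what that usually looks like (these paths are generic examples of a platform's defaults, not a quote of Shopify's exact file), the canonical page stays crawlable while filtered or sorted duplicates of the same content are blocked:

```text
User-agent: *
# The canonical product page stays crawlable:
#   /products/blue-widget
# but parameterized duplicates of the same page are blocked:
Disallow: /*?sort_by=
Disallow: /collections/*+*
```

An audit tool that crawls every URL variant it finds will report those blocked duplicates as "restricted from indexing," even though the main product page itself is fine.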