Last April we moved our online shop from an old DIY site host to Shopify. We kept our domain, www.indigo-silver.co.uk, but inherited a new URL structure, and Google still believes all of the old URLs to be live.
Whilst I set up a number of URL redirects, I used the URL Removal tool in Google Webmaster Tools to deal with the rest - rightly or wrongly.
Just recently Google emailed me to say that I had over 2,700 404s, and when I checked the removal tool, sure enough, Google had expired all of the removed URLs - it seems you can now only remove URLs temporarily?
Shopify doesn't allow the robots.txt file to be edited, so is there a way to exclude these URLs using meta tags? Most of them fall into one of two URL styles, so hopefully I can specify the initial URL format for those two styles followed by a wildcard character of some kind?
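Roughly what I have in mind is a conditional noindex tag in the theme layout, something like the sketch below. The /collections-old/ and /pages-old/ prefixes are placeholders for my two actual URL styles, and I'm assuming Liquid's request object exposes the current path:

```liquid
{%- comment -%}
  In the <head> of layout/theme.liquid.
  The two path prefixes below are placeholders, not my real URL styles.
{%- endcomment -%}
{%- if request.path contains '/collections-old/' or request.path contains '/pages-old/' -%}
  <meta name="robots" content="noindex">
{%- endif -%}
```

No idea whether this would actually run on pages that return a 404, though - happy to be corrected.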
Any help greatly appreciated.
Thanks for reaching out here.
Definitely a pain when you have to redirect all of these dead links. Many of our merchants like to use these two apps with their Shopify store:
Check these out and give them a try.
Hope this helps!
John | Shopify
As of today, June 21st, 2021, we have launched the ability to edit the robots.txt file, giving merchants more control over what information is crawled by search engines. You can learn more about how to edit your robots.txt file through our community post here.
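As a rough sketch of what that customization can look like, a robots.txt.liquid template can reproduce the default rules and append extra Disallow lines for the default user agent. The two path prefixes here are placeholders for whatever URL styles you want to block, and the structure assumes the default-groups pattern described in the community post:

```liquid
{%- comment -%} templates/robots.txt.liquid - placeholder Disallow paths {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.user_agent.value == '*' %}
{{ 'Disallow: /collections-old/' }}
{{ 'Disallow: /pages-old/' }}
  {%- endif %}
  {%- if group.sitemap != blank %}
{{ group.sitemap }}
  {%- endif %}
{% endfor %}
```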
Due to the age of the topic, I will be locking this thread. If you have any questions about the new feature, please do not hesitate to create a new post under our "Technical Q&A" board.