In my Shopify Plus account I have 5 stores. One is our main customer-facing store, where we do the majority of sales. The other four subdomains are a mix: one is our dev site, where we sometimes try things out before implementing them on the main site, and the others are stores we've created for B2B partners to transact on. As the target audience differs from our main customer-facing store, some of the content is tailored; however, some pages are exactly the same. To avoid being penalised by Google for duplicate content, we want to disallow any crawling and indexing of these four subdomains.
Can this be done with Shopify? My understanding is that there is only one robots.txt file, and disallowing would only work if each subdomain had its own robots.txt file.
Any advice would be hugely appreciated!
Add noindex markup to your pages (assuming they return a 200 status), then use the URL removal tool in Google Search Console to remove the entire subdomain from search results. This requires verified Google Search Console access for the subdomain. If the pages return a 404 or 410, you can do the same thing.
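As a rough sketch of what that noindex markup could look like: since each Shopify store has its own theme, you could add a robots meta tag to the `<head>` of the theme layout on each subdomain store. The exact file name depends on your theme; `layout/theme.liquid` is the conventional location, and the snippet below assumes you want every page on that store excluded.

```liquid
{% comment %}
  Sketch only: placed inside <head> in layout/theme.liquid
  of each subdomain store's theme (not the main store's).
  Emits a noindex directive on every page of that store.
{% endcomment %}
<meta name="robots" content="noindex, nofollow">
```

Because the tag lives in the theme of the subdomain store only, the main customer-facing store is unaffected. You can confirm the tag is being served by viewing the page source on the subdomain before requesting removal in Search Console.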