I have a custom multi-currency, multi-territory ecommerce website, and I'm considering moving everything to Shopify.
My idea to test things progressively would be to create a Shopify store for one territory, for instance Belgium, run a split test (for instance with Google Content Experiments) of the current website vs the new Shopify website, and see if there's an increase in conversion rate.
I'm afraid this might create some duplicate content. Is there a risk? What would be the best thing to do?
If the test is successful, I'll need to create multiple stores. Again, what's the best practice regarding SEO? I understood that creating subdomains (e.g. fr.mycompany.com, uk.mycompany.com) was not the recommended option. Can I create sub-folders like mycompany.com/fr/ and mycompany.com/uk/ that would redirect to different Shopify stores?
From our experience this won't matter; how much duplicate content affects SEO has been blown out of proportion a bit. Duplicate content became an issue back when blog farms used to scrape a whole bunch of websites and reuse their content to rank well, so the quick fix for Google was to heavily penalise duplicate content, giving credit only to the original site. Google is a lot smarter these days, and as such it is much better at distinguishing what's good duplicate content and what isn't.
Also, with cross-country domains you can use subdomains, and there is no issue in doing so. For optimal SEO you are best to implement hreflang; a good Google document on this is here: https://support.google.com/webmasters/answer/189077?hl=en
Here's a segment from Google directly:
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers.
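To make the hreflang advice concrete, here is a rough sketch of what the annotations could look like if you went with the sub-folder structure from your question. The URLs and language-region codes are placeholder assumptions based on your examples (fr.mycompany.com vs mycompany.com/fr/), not verified details of your setup:

```html
<!-- Placed in the <head> of each territory's version of the same page. -->
<!-- Every version must list all alternates, including itself. -->
<link rel="alternate" hreflang="fr-be" href="https://mycompany.com/fr/" />
<link rel="alternate" hreflang="en-gb" href="https://mycompany.com/uk/" />
<!-- Optional fallback for users matching no listed language/region -->
<link rel="alternate" hreflang="x-default" href="https://mycompany.com/" />
```

The annotations must be reciprocal: if the /fr/ page lists /uk/ as an alternate, the /uk/ page has to list /fr/ back, otherwise Google may ignore them.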
As an SEO specialist with more than four years of experience, I can assure you that duplicate content can be a huge problem! When you create A/B tests, make sure that only one version of the page (the original one) is accessible to Googlebot, so you will need to add the following directive to the <head> of your experiment page:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">.
Since, as far as I understand, you are now interested in the usability of your website, this checklist may also help: http://promo.do/Uxrs