We have a Shopify app, and for one of our clients we are having an issue with Google Search Console sitemap submission.
They recently changed the store domain from .com.au to .com, and we (the Shopify app) created a Google Search Console property for the new .com domain and submit the store sitemap daily. But the GSC > Sitemaps section shows a "Couldn't fetch" error.
A while back the client had a different website on a different ecommerce platform on the .com. When they switched to Shopify, they decided to use the .com.au instead and keep the .com as a secondary domain.
The problem was that the two sites had completely different URL structures, so Google had cached old paths that were still trying to redirect to the previous website. The client eventually used the removals tool on the .com sitemap to fix that. Is this affecting the .com sitemap fetching, or is the issue something else?
Can anyone help identify the issue? Thanks in advance.
I have access to 4 completely different Shopify properties in Search Console, and the Googlebot crawl requests have flatlined on all of them.
Meanwhile, the other 2 sites I have access to (WordPress) are being crawled as normal. I've found other people on Twitter facing the same issue. I tried contacting Shopify and Google, and no one wants to know about it.
Yes, and the thing is, those XML files don't even exist: if you put the URL in a browser, no sitemap code comes back and the XML page doesn't load at all. Yet Shopify says it's a Google issue. Shopify is the one that generates the sitemap files, so how can a missing file be Google's issue?
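One way to document this when going back to Shopify support is to fetch the sitemap yourself and check whether the response is actually valid sitemap XML. Here's a minimal sketch (the domain in the example is a placeholder; substitute the real store URL):

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def is_sitemap_xml(body: bytes) -> bool:
    """True if the body parses as XML with a <urlset> or <sitemapindex> root."""
    try:
        root = ET.fromstring(body)
    except ET.ParseError:
        return False
    return root.tag in (f"{{{SITEMAP_NS}}}urlset", f"{{{SITEMAP_NS}}}sitemapindex")

def check_sitemap(url: str, timeout: int = 10):
    """Fetch the sitemap URL; return (HTTP status, whether body is sitemap XML)."""
    req = urllib.request.Request(url, headers={"User-Agent": "sitemap-check/1.0"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status, is_sitemap_xml(resp.read())

# Placeholder domain; use your store's actual sitemap URL:
# status, ok = check_sitemap("https://example.com/sitemap.xml")
```

If the fetch returns a non-200 status, an error page, or anything `is_sitemap_xml` rejects, that's concrete evidence the file is missing or broken on Shopify's side rather than a Google crawling problem.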