Hello! I've just recently launched my Shopify shop https://akindcloth.co.uk
Following Shopify's instructions, I've been trying to submit my sitemap to Google Search Console. On the Sitemaps overview page, the status shows Success. However, the Discovered URLs column reads 0.
Does anybody know why this is and what I should do? The Shopify help pages just say to submit the sitemap at the root of the domain, so the sitemap I submitted is: https://akindcloth.co.uk/sitemap.xml
Any ideas or help much appreciated. I've been reading so much about SEO and indexing and crawling!
Google usually finds the sitemap automatically even if it isn't submitted in GSC. On non-Shopify stores you can prove this with log file analysis. So I wouldn't freak out over this.
Also, when did you add it to GSC? I'd usually give it a couple of days for Google to parse the sitemap and for the data to show in GSC.
Hi KieranR, thank you for your reply.
The sitemap was submitted and 'last read' on 28 Feb, so I guess it hasn't even been a week yet! I worry because Bing discovered all the URLs immediately. Here's hoping it's just Google taking their time and my site will be found eventually. I'm just getting very confused and overwhelmed by the whole SEO thing!
Hmm, weird. Maybe I'm missing something, or there's a more obvious answer here that someone else can drop in, but I don't know off the top of my head, sorry.
If it were me I'd just start throwing things against the wall:
Kinda one of those unstructured "how to problem solve" things. My own approach is generally: look, test, check if it worked; if not, try something else, and repeat. I'm sure you can get it going eventually.
Ask Shopify to check their logs and confirm that sitemap.xml is being accessed correctly by Googlebot. Or you can set up something more involved, like Cloudflare O2O on a paid plan, and check the logs yourself with Logflare.
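If you do end up with raw logs in hand, this is roughly what to look for: requests for /sitemap.xml where the user agent claims Googlebot, ideally confirmed with a reverse-then-forward DNS check so you know it's the real crawler. Just a sketch, assuming a plain combined-format access log; a Logflare/Cloudflare export would be JSON and need different parsing, and the file path here is hypothetical.

```python
# Rough sketch: scan a combined-format access log for Googlebot hits on the sitemap.
import re
import socket

LOG_FILE = "access.log"          # hypothetical path to your exported log
SITEMAP_PATH = "/sitemap.xml"

def is_real_googlebot(ip: str) -> bool:
    """Verify a claimed Googlebot IP via reverse DNS, then forward-confirm it."""
    try:
        host = socket.gethostbyaddr(ip)[0]
        if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
            return False
        # The hostname must resolve back to the same IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except (socket.herror, socket.gaierror):
        return False

# Combined log format: IP - - [date] "GET /path HTTP/1.1" status ...
line_re = re.compile(r'^(\S+) \S+ \S+ \[(.*?)\] "(\S+) (\S+) [^"]*" (\d{3})')

with open(LOG_FILE) as f:
    for line in f:
        m = line_re.match(line)
        if not m:
            continue
        ip, timestamp, method, path, status = m.groups()
        if path.startswith(SITEMAP_PATH) and "Googlebot" in line:
            verified = "verified" if is_real_googlebot(ip) else "UNVERIFIED"
            print(f"{timestamp}  {ip}  {status}  {verified}")
```

If you see verified Googlebot fetches of the sitemap returning 200, the discovery side is fine and it's just GSC reporting lag.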
How many pages are there? You could try manual crawl requests, the Indexing API, or submitting via the ping link, e.g.: http://www.google.com/ping?sitemap=http://yourwebsite.com/sitemap_url.xml
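The ping is trivial to script if you want to re-run it; something like the sketch below, with your own sitemap URL swapped in. A 200 response only means the ping was received, not that anything was indexed.

```python
# Minimal sketch: ask Google to (re)fetch the sitemap via the ping endpoint.
import urllib.parse
import urllib.request

sitemap_url = "https://akindcloth.co.uk/sitemap.xml"
ping_url = "https://www.google.com/ping?" + urllib.parse.urlencode({"sitemap": sitemap_url})

with urllib.request.urlopen(ping_url) as resp:
    # 200 here just means Google accepted the ping, nothing more.
    print(resp.status, resp.reason)
```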
Are they orphan pages? What about creating a straightforward human sitemap page from a dump of a Screaming Frog crawl, listing all the internal links? Get that page crawl-requested, give it a few days, and see if that does anything.
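You can knock that page together from a Screaming Frog export in a few lines. This assumes you've exported the Internal: HTML tab to a CSV with the default "Address" column; the output is just a plain list of links you'd paste into a Shopify page.

```python
# Rough sketch: turn a Screaming Frog "Internal: HTML" CSV export into a plain
# HTML link list you could paste into a Shopify page as a human sitemap.
# Assumes the default export with an "Address" column; adjust if yours differs.
import csv
import html

links = []
with open("internal_html.csv", newline="", encoding="utf-8-sig") as f:
    for row in csv.DictReader(f):
        url = row.get("Address", "").strip()
        if url.startswith("http"):
            links.append(f'  <li><a href="{html.escape(url)}">{html.escape(url)}</a></li>')

with open("human_sitemap.html", "w", encoding="utf-8") as out:
    out.write("<ul>\n" + "\n".join(links) + "\n</ul>\n")

print(f"Wrote {len(links)} links to human_sitemap.html")
```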
There's no JS content blocking standard crawling, right? You haven't got heaps of nofollow links? You're not running any funky proxy, WAF or CDN stuff in front of Shopify? No robots.txt or sitemap redirect customisation that stops it from working normally?
^ If you're just saying no, no, no to the above, then how do you KNOW? Have you actually checked?
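A quick way to actually check most of that from your own machine: fetch robots.txt and the sitemap the way a crawler would and look at status codes, redirects, and obvious noindex/nofollow markup. Only a sketch, and a spoofed user agent isn't a perfect test, since a WAF or CDN may treat real Googlebot differently, but it catches the obvious breakages.

```python
# Quick sanity checks: does robots.txt allow crawling, does the sitemap return
# 200 without redirects, and does the homepage carry noindex/nofollow markup?
import urllib.request

BASE = "https://akindcloth.co.uk"
UA = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}

def fetch(url):
    req = urllib.request.Request(url, headers=UA)
    with urllib.request.urlopen(req) as resp:
        return resp.status, resp.geturl(), resp.read().decode("utf-8", errors="replace")

for path in ("/robots.txt", "/sitemap.xml"):
    status, final_url, body = fetch(BASE + path)
    redirected = " (redirected!)" if final_url != BASE + path else ""
    print(f"{path}: HTTP {status}{redirected}")
    if path == "/robots.txt" and any(line.strip() == "Disallow: /" for line in body.splitlines()):
        print("  -> site-wide Disallow found, check the robots.txt rules carefully")

# Spot-check the homepage for meta robots / nofollow markup.
status, _, body = fetch(BASE + "/")
if "noindex" in body.lower() or 'rel="nofollow"' in body.lower():
    print("homepage: found noindex/nofollow markup worth investigating")
else:
    print(f"homepage: HTTP {status}, no obvious noindex/nofollow")
```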
Some more ideas over here:
Throw up a few Google Ads with some spend? I have no proof this works beyond anecdote, but it's worth a try.
Get a few basic backlinks sorted, e.g. social media profiles and business directories. Get over that initial bump with a handful of backlinks that Google can see.
Don't give up!