I have submitted my website to Google Search Console, and several hundred pages are being excluded as "Discovered - currently not indexed". If I do a URL inspection of one of these excluded pages and click "Request Indexing", the page gets indexed about 5-10 days later. Am I missing something that would get these pages indexed more quickly, or can I submit indexing requests in bulk?
Thanks in advance for any help!
Personally, I would leave Google to do its thing. You used to be able to change the crawl rate in the old-style Search Console, but they removed that functionality in the new one, and I seem to remember them saying you should only adjust the crawl rate if you were having issues.
So long as you have submitted your sitemap, then I would just sit back and wait.
Oh, and don't forget Bing: obviously much smaller than Google, but there's no reason not to submit there too.
You can submit your Shopify sitemap to request indexing, but even if you just wait, Google will eventually index your site.
It's tough to know without being able to see the individual links that are getting reported. Essentially, what the report is saying is that Google is able to add them to the crawl consideration set, but for some reason it isn't indexing them. This could be due to a variety of reasons, such as canonical tag issues, improper site structure, etc.
One thing I noticed is that your site's internal linking structure is based around non-canonical URLs. For instance, the first product on your home page is a non-canonical URL:
This might be one of the reasons: the site is pointing a large number of internal links at these pages while telling Google that a different page should be indexed instead, and that conflicting signal can hold back indexing. I wrote a post on how to fix this in Shopify: https://moz.com/blog/shopify-seo
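To make the non-canonical link issue concrete, here's a minimal sketch of how you could spot it yourself. It assumes the usual Shopify URL shape, where products linked from inside a collection get a `/collections/<collection>/products/<product>` path while the canonical tag on those pages points at the plain `/products/<product>` path; the regex and function names are mine, not anything Shopify provides.

```python
import re

# Collection-scoped product URLs look like
#   /collections/sale/products/blue-shirt
# while the canonical URL for the same product is
#   /products/blue-shirt
# (an assumption about standard Shopify URL structure).
COLLECTION_PRODUCT_RE = re.compile(r"^/collections/[^/]+(/products/[^/?#]+)")

def canonicalize_product_path(path: str) -> str:
    """Return the canonical /products/... path for a collection-scoped
    product URL; return the path unchanged otherwise."""
    match = COLLECTION_PRODUCT_RE.match(path)
    return match.group(1) if match else path

def is_non_canonical_product_link(path: str) -> bool:
    """True when an internal link points at the collection-scoped
    (non-canonical) version of a product page."""
    return canonicalize_product_path(path) != path
```

Run every internal `href` on your templates through `is_non_canonical_product_link`; any hit is a link that sends Google to a page whose canonical tag points somewhere else.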
Let me know if you have any other questions!