I have a problem where most pages on my website are showing as Discovered - currently not indexed in Search Console.
I have submitted my sitemap (/sitemap.xml), and its URLs show as discovered. The sitemap has also been read, although the last read was 3 weeks ago.
I have been seeing this "Discovered - currently not indexed" status for around 5 months now. In the meantime, some URLs have been indexed, but for many of them the last crawl was 2 months ago.
I have checked robots.txt and noindex tags on the pages, which don't seem to be a problem.
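In case it helps anyone doing the same checks: both can be scripted with the Python standard library. This is a minimal sketch; the robots.txt rules, URLs, and HTML below are placeholder examples, not my actual site.

```python
from urllib import robotparser
from html.parser import HTMLParser

# 1) Check a robots.txt ruleset against specific URLs. The rules are pasted
#    inline here; normally you would fetch https://example.com/robots.txt.
rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /checkout/
""".splitlines())
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/checkout/cart"))    # False

# 2) Scan a page's HTML for a robots meta tag containing "noindex".
class NoindexFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

finder = NoindexFinder()
finder.feed('<html><head><meta name="robots" content="noindex,follow">'
            '</head></html>')
print(finder.noindex)  # True
```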
Any help would be appreciated!
This seems to be a common problem for Shopify store owners. According to Google:
"Discovered - currently not indexed: The page was found by Google, but not crawled yet. Typically, Google wanted to crawl the URL but this was expected to overload the site; therefore Google rescheduled the crawl. This is why the last crawl date is empty on the report."
My best guess is that Shopify is now hosted on GCP (Google Cloud Platform), so Google is reluctant to crawl the pages and put extra pressure on the servers at busier times of the year.
- Solution: I would recommend creating a Blogger account, backlinking to the website and to any particular inner pages you definitely want indexed, then adding the Blogger site to Google Search Console and requesting a crawl there.
Hope the solution works!
Meanwhile, we're offering a free SEO service. The free project includes a basic SEO audit (keyword audit & analysis, backlink audit & analysis, and site audit) plus one SEO solution of your choice (Page Speed Analysis, Canonical & Duplicates, Article Writing, Title Tag Optimization, Meta Description Optimization, Content Optimization, Image Optimization, Heading Tag Optimization). Feel free to sign up via this link.
I hope you are doing well!
"Discovered - currently not indexed" is a major issue faced by website owners these days. According to Google, it means your pages were discovered, but Google could not crawl them or did not consider them worth crawling at that moment. This decision is not permanent; Google may crawl or index these pages later, but it is not guaranteed.
The problem can arise for the reasons given below:
1- Content overload: When your pages carry heavy content or large images, they increase the server's load and hurt page speed, so Google may postpone crawling them to avoid straining your server. As a fix, use WebP images to reduce image size and avoid putting excessive content on a page; a lighter page keeps the server free while the crawler visits, so it can crawl your pages more easily.
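One common way to serve WebP while keeping a fallback for older browsers is the HTML picture element (the file names below are placeholders):

```html
<!-- Browsers that support WebP use the <source>; others fall back to <img>. -->
<picture>
  <source srcset="/images/product-hero.webp" type="image/webp">
  <img src="/images/product-hero.jpg" alt="Product hero" width="800" height="600">
</picture>
```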
2- Internal linking: Poor internal linking on your pages can hurt indexing, so link the non-indexed pages from indexed ones, using descriptive anchor text (e.g. LSI keywords).
3- Canonical URL: Keep a close check on your website's technical SEO and its canonical URLs. If the content on several pages is essentially the same, Google may skip crawling the duplicates; set a canonical URL pointing to the preferred version, or add the duplicate pages to robots.txt so the crawler avoids them.
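For reference, a canonical is declared with a single link tag in the head of each duplicate or variant page (the URL here is a placeholder):

```html
<!-- Tells crawlers which URL is the preferred version of this content. -->
<link rel="canonical" href="https://example.com/products/widget">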
4- Crawl budget: An exhausted crawl budget can be one reason for "URLs discovered but not indexed" on large websites, so prioritize the page URLs that matter most, and block unnecessary or duplicate pages in the robots.txt file.
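A robots.txt along these lines keeps crawlers away from low-value duplicates so budget goes to real pages (the paths are placeholder examples; adapt them to your site):

```text
User-agent: *
# Block parameterised/duplicate URLs that waste crawl budget
Disallow: /search
Disallow: /*?sort=
Disallow: /*?sessionid=

Sitemap: https://example.com/sitemap.xml
```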
5- Server issues: Server problems are a common reason for "URLs discovered but not indexed". When heavy traffic hits your website and the server can't take the load, a crawler visiting at the same time runs into the same errors. Check your server logs for crawler errors, and contact your hosting provider if they persist.
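You can spot this in your access logs by counting how often Googlebot hit the site and how many of those requests returned 5xx errors. A minimal sketch (the log lines below are made-up samples in combined log format):

```python
import re
from collections import Counter

# Made-up sample access-log lines; in practice, read your real log file.
sample_log = """\
66.249.66.1 - - [10/Feb/2024:10:00:01 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/Feb/2024:10:00:03 +0000] "GET /collections/all HTTP/1.1" 503 312 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [10/Feb/2024:10:00:05 +0000] "GET / HTTP/1.1" 200 2048 "-" "Mozilla/5.0"
"""

# Status code sits between the quoted request and the byte count.
status_re = re.compile(r'" (\d{3}) ')
stats = Counter()
for line in sample_log.splitlines():
    if "Googlebot" in line:
        m = status_re.search(line)
        if m:
            stats["googlebot_total"] += 1
            if m.group(1).startswith("5"):
                stats["googlebot_5xx"] += 1

print(stats["googlebot_total"], stats["googlebot_5xx"])  # 2 1
```

A high 5xx ratio on Googlebot requests is a strong hint the server, not the content, is the bottleneck.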
6- Ping submission: You can submit the affected URLs to ping services, which helps those pages get indexed on those sites and can increase the chances of Google crawling them.
7- Sitemap: Resubmit your sitemap (or keep it up to date) whenever you publish a new post. That also helps fix these issues.
I hope these solutions are helpful and fruitful for you!
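Keeping the sitemap current is easier if it is generated rather than hand-edited. A minimal Python sketch using only the standard library (the URLs and dates are placeholders):

```python
import xml.etree.ElementTree as ET
from datetime import date

# Placeholder page list; in practice, pull this from your CMS or database.
pages = [
    ("https://example.com/", date(2024, 2, 19)),
    ("https://example.com/products/widget", date(2024, 2, 13)),
]

# Build <urlset> per the sitemaps.org protocol, with <lastmod> so crawlers
# can see which URLs changed since the last read.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, modified in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = modified.isoformat()

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```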