Google Indexing Issues May 2021

PMac
Tourist

Hey guys,

 

I've been having a bunch of problems with my indexing on Google, so here's the lowdown.

I have submitted a sitemap in Google Search Console. I have 3,000 pages and currently 2.46k are not indexed by Google, with 1.6k showing as "Discovered - currently not indexed". When I go and click on a product page, inspect it, and then run the page test, it says 1 valid item with 4 warnings:

Missing field 'aggregateRating' (optional)
Missing field 'review' (optional)
No global identifier provided (optional), which I actually do have on all my products now
Missing field 'priceValidUntil' (optional)

Are these what's keeping my pages from being indexed? I've attached screenshots below. It says they're optional, but I was reading about rich results for Google Shopping ads, and that Google may not show my Shopping ads if they're not rich in information? Anyway, what's the best way to fix this quickly?

I only discovered this when I went to search for my Google Shopping ads and they're not showing up. Also, here's a link to my website: https://blackdiamondlighting.com/

[4 screenshots attached]

KieranR
Shopify Partner

No, the schema warnings shouldn't prevent indexing, but they are important for enabling rich result features, and generally for getting your PDPs to show up and get clicked in organic search.

However, there are a number of per-URL properties that can influence or control whether a page gets indexed: things like noindex tags, robots.txt, canonical tags, hreflang tags, soft 404s, duplicate content, lack of internal links, etc. You may not see all of this detail in GSC. A crawl tool like Screaming Frog or Sitebulb would let you explore this in a bit more depth, so look into all that.
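If you want a quick self-serve sanity check on the robots.txt and meta-robots side of this without a crawl tool, a short Python sketch works. The Disallow rules below are illustrative Shopify-style defaults, not necessarily your store's actual robots.txt; swap in the real file from your own `/robots.txt` URL:

```python
from urllib.robotparser import RobotFileParser
from html.parser import HTMLParser

# Illustrative robots.txt rules (Shopify-style defaults, hypothetical here).
# Fetch your live file from https://yourstore.com/robots.txt to check for real.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
"""

robots = RobotFileParser()
robots.parse(ROBOTS_TXT.splitlines())

def allowed(url: str) -> bool:
    """True if Googlebot (matched by the * group above) may crawl the URL."""
    return robots.can_fetch("Googlebot", url)

class NoindexFinder(HTMLParser):
    """Scans HTML for a <meta name="robots" content="...noindex..."> tag."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    """True if the page source contains a meta robots noindex directive."""
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex
```

If `allowed()` returns False for a product URL, or `has_noindex()` returns True for its page source, that URL is being actively excluded; if both come back clean, the cause is more likely crawl budget, duplication, or internal linking.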

Then put it in context: what is expected behavior on Shopify? Is what you're seeing normal? Is it actually a bad thing? Do you need/want those pages showing up anyway? Are they just duplicate pages and so not needed? Shopify intentionally excludes a number of typically low-value (for SEO) pages from indexing out of the box, such as multi-tag filtered collections and nested collection/product URLs.

Then once you've gathered more info you can construct a fix. It's hard to jump straight to a "quick fix" without all the right context. If you can drop in a few example URLs that are not indexed (that you think should be) I can take a look?

Full time Shopify SEO guy, based in NZ. Sometimes freelance outside the 9-5.
PMac
Tourist

Hey Kieran thanks for responding,

I'm a bit of a novice in the Google Search Console world and still learning lots, lol. So you're saying those missing fields wouldn't stop my products from getting indexed? I was also wondering if my products were blocked by robots.txt, but I don't know how to figure that out. Here are a couple of product pages that are not indexed:

https://blackdiamondlighting.com/products/539phms-536p-orb

https://blackdiamondlighting.com/products/cmt-143fh-clr-pc

https://blackdiamondlighting.com/products/bry-s-bk-698

https://blackdiamondlighting.com/products/noa121-s-2400

https://blackdiamondlighting.com/products/mdn-91p-bk

And yes, to answer your question, I do need all these pages indexed, as I'm currently trying to run advertisements for them as well. I've run a couple through Screaming Frog and nothing popped up on my end (I might be reading it wrong). Would love your feedback and help. Thanks a ton.

(Side note: how do I fix those schema warnings?)

-Pete
PMac
Tourist

Hey, thanks for the response. When I click the Request Indexing button, it says I'm only allowed to do it about 10 times a day. I have 2,400 pages not currently indexed, lol.

KieranR
Shopify Partner

Yeah, the reply from @noahliam doesn't make sense to me; it appears to be link spam. Requesting manual indexing is not a new feature in Google Search Console, and as you've discovered, it's not a practical fix given the daily limits and the number of pages you have.

The missing structured data won't prevent indexing. Also, being indexed in Google organic search, in and of itself, won't affect any ads. What will affect ads is if the ad bots (which periodically crawl ad landing page URLs themselves) can't access those URLs at all, or if they detect other technical or ad policy violations that result in ad disapproval.

Anyway, I ran a crawl of your entire site in Screaming Frog, then checked those URLs. It all seems fairly normal: those URLs are in the sitemap and are linked to internally.

Some initial thoughts:

  • Maybe just wait a bit. How long has the site been live? How long have these specific pages been live?
  • Get some more backlinks, even just a few directory and profile links pointing to your homepage, to give you a baseline of authority. If crawl budget is an issue, it may help a wee bit.
  • If you haven't already, submit your primary sitemap.xml and its four child sitemap URLs to GSC; that may help a bit too.
  • I'm wondering if there are wider crawl budget issues. These can be tricky to pin down on Shopify due to the lack of server logs, but if you want to get really technical it may be possible with Cloudflare O2O and Logflare. This is still a guess, though, and would only help you find the problem, not fix it.
  • A good overview here: 

To fix the schema warnings

You need to fix the schema. To give a proper answer, I'd need to look at what's currently set up and then weigh some options. That said, the two most common approaches are:

  1. Fix in place: read the warnings in GSC and write (or get someone to write) custom theme code to address each one. Doing this manually can be a bit techy and time-consuming, but it's not difficult if you know how to code a Shopify theme in Liquid/HTML.
  2. Or skip some of that and install a schema app (one that outputs JSON-LD) with the code pre-done. Then just clean the old schema out of the theme (also a bit techy, but much quicker) and you're good.
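For reference, the four warnings map to specific schema.org Product properties. A minimal JSON-LD sketch showing where each one goes (all values here are placeholders, and per Google's structured data policies you should only add aggregateRating/review markup if you have real reviews to back it up):

```json
{
  "@context": "https://schema.org/",
  "@type": "Product",
  "name": "Example Pendant Light",
  "gtin13": "0123456789012",
  "offers": {
    "@type": "Offer",
    "price": "249.00",
    "priceCurrency": "USD",
    "priceValidUntil": "2022-05-17",
    "availability": "https://schema.org/InStock",
    "url": "https://blackdiamondlighting.com/products/example"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "12"
  },
  "review": [{
    "@type": "Review",
    "reviewRating": { "@type": "Rating", "ratingValue": "5" },
    "author": { "@type": "Person", "name": "Jane D." }
  }]
}
```

The gtin13 field covers the "no global identifier" warning; priceValidUntil, aggregateRating, and review cover the other three.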
Full time Shopify SEO guy, based in NZ. Sometimes freelance outside the 9-5.
PMac
Tourist

Hi Kieran 

Thanks again for the reply

I was wondering if I should have more than one sitemap submitted to Google, to help Google confirm my pages? If so, how do I go about doing that?

Also, I read somewhere that someone else had a problem that turned out to be caused by an SEO app they had installed. I had an SEO app called Avada that might have been changing my meta tags and alt tags; it also created an HTML sitemap and installed something into my Shopify store theme. I deleted it and resubmitted my sitemap to Google to check.

Also, in terms of hiring someone to fix it all, do you recommend a service or site I can use to outsource that? I'm all for learning certain parts of back-end coding, but if, like you said, it's quite technical, I don't want to screw up my site.

PS: Thanks again for the schema help I'll look into it
