I've been having a bunch of problems with my indexing on Google, so here's the lowdown.
I have submitted a sitemap in Google Search Console. I have 3,000 pages and currently 2.46k are not indexed by Google, with 1.6k sitting in "Discovered - currently not indexed". When I go to a product page, inspect the URL, and then test the live page, it says 1 valid item with 4 warnings:
Missing field 'aggregateRating' (optional)
Missing field 'review' (optional)
No global identifier provided (optional), which I actually do have on all my products now
Missing field 'priceValidUntil' (optional)
Are these what's keeping my pages from being indexed? I've attached screenshots below. It says they're optional, but I was reading about rich results for Google Shopping ads, and that Google may not show my Shopping ads if the listings aren't rich in information? Anyway, what's the best way to fix this quickly?
I only discovered this when I went to search for my Google Shopping ads and they're not showing up. Also, here's a link to my website page
No, the schema warnings shouldn't prevent indexing, but they are important for enabling rich result features, and generally important for getting your PDPs to show up and get clicked in organic search.
However, there are a number of per-URL properties which can affect or control whether a page gets indexed: noindex tags, robots.txt, canonical tags, hreflang tags, soft 404s, duplicate content, lack of internal links, etc. You may not see all of this detail in GSC. A crawl tool like Screaming Frog or Sitebulb would let you explore this in a bit more depth, so look into all of that.
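On the robots.txt point specifically: Python's standard library can parse a robots.txt file and tell you whether a given path is blocked for a given crawler. A quick sketch, where the rules and paths are made-up stand-ins (swap in the actual contents of yourstore.com/robots.txt and your real product paths):

```python
from urllib.robotparser import RobotFileParser

# Made-up robots.txt rules, loosely modelled on typical Shopify defaults.
robots_txt = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Product pages should come back as fetchable; checkout should not.
print(rp.can_fetch("Googlebot", "/products/example-product"))  # True
print(rp.can_fetch("Googlebot", "/checkout"))                  # False
```

If a product path comes back False here, robots.txt is blocking the crawl and GSC will typically report it as such.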
Then put it in context: what is expected behaviour on Shopify? Is what you're seeing normal? Is it actually a bad thing? Do you need/want those pages showing up anyway? Are they just duplicate pages and so not needed? Out of the box, Shopify intentionally excludes a number of typically low-value (for SEO) page types from indexing, such as multi-tag filtered collections and nested collection/product URLs.
Then, once you've gathered more info, you can construct a fix. It's hard to jump straight to a "quick fix" without all the right context. If you can drop in a few example URLs that are not indexed (but that you think should be), I can take a look?
Hey Kieran, thanks for responding,
I'm a bit of a novice in the Google Search Console world and still learning lots lol. So you're saying that those missing fields wouldn't stop my products from getting indexed? I was also wondering if my products are blocked by robots.txt, but I don't know how to check that. Here are a couple of product pages that are not indexed:
And yes, to answer your question, I do need all these pages indexed, as I'm currently trying to run advertisements for them as well. I've run a couple through Screaming Frog and nothing jumped out on my end (I might be reading it wrong). Would love your feedback and help. Thanks a ton
(Side note: how do I fix those schema warnings?)
Yeah, the reply from @noahliam doesn't make sense to me; it appears to be link spam. Requesting manual indexing is not a new feature in Google Search Console and, as you've discovered, it's not a practical fix given the daily limits and the number of pages you have.
The missing structured data won't prevent indexing. Also, being indexed in Google organic search won't, in and of itself, affect any ads. What will affect ads is if the ad bots (which periodically crawl ad landing page URLs themselves) can't access those URLs at all, or if they detect other technical or ad policy violations that result in ad disapproval.
Anyway, I ran a crawl on your entire site in Screaming Frog, then checked those URLs. Everything seems fairly normal: those URLs are in the sitemap and linked to internally. Here are some observations.
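As a quick self-serve check for the noindex issue mentioned earlier: view the page source of a product URL and look for a robots meta tag. A small sketch using Python's standard library, where the HTML snippet is a hypothetical stand-in for a fetched page:

```python
from html.parser import HTMLParser

# Hypothetical page source; in practice, paste in the source of a real product URL.
sample_html = """
<html><head>
<meta name="robots" content="noindex, nofollow">
<title>Example Product</title>
</head><body></body></html>
"""

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag encountered."""
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_content = attrs.get("content", "")

parser = RobotsMetaParser()
parser.feed(sample_html)
has_noindex = bool(parser.robots_content) and "noindex" in parser.robots_content
print("noindex directive found:", has_noindex)  # noindex directive found: True
```

If that comes back True on a real page, something (a theme edit or an app) is telling Google not to index it, regardless of what the sitemap says.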
Some initial thoughts:
To fix the schema warnings
You need to fix the schema markup itself. To give a proper answer, I'd need to look at what's currently set up and then weigh up some options. That said, the two most common ways:
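Whichever route you take, the end result is the same: the product page's JSON-LD needs the missing schema.org Product properties filled in. Here's a rough sketch of what complete markup looks like, built in Python purely for illustration; every value (name, GTIN, price, dates, ratings) is a placeholder, and rating/review data must come from genuine customer reviews:

```python
import json

# Placeholder values only; swap in your real product data.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",
    "gtin13": "0012345678905",            # addresses "No global identifier"
    "offers": {
        "@type": "Offer",
        "price": "29.99",
        "priceCurrency": "USD",
        "priceValidUntil": "2026-12-31",  # addresses "Missing field priceValidUntil"
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {                  # addresses "Missing field aggregateRating"
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": 12,
    },
    "review": [                           # addresses "Missing field review"
        {
            "@type": "Review",
            "reviewRating": {"@type": "Rating", "ratingValue": "5"},
            "author": {"@type": "Person", "name": "Jane D."},
        }
    ],
}

# This JSON belongs inside a <script type="application/ld+json"> tag in the page head.
print(json.dumps(product_jsonld, indent=2))
```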
Thanks again for the reply.
I was wondering if I should have more than one sitemap submitted to Google, to help Google confirm my pages? If so, how do I go about doing that?
Also, I read somewhere that someone else had a problem and that it was caused by an SEO app they'd installed on their site. I had an SEO app called Avada that might have been changing my meta tags and alt tags; it also created an HTML sitemap for me and installed something into my Shopify store theme. I've deleted it and resubmitted my sitemap to Google to check.
Also, in terms of hiring someone to fix it all, do you recommend a service or site I can use to outsource that? I'm all for learning certain parts of the back-end coding, but if, like you said, it's super technical, I don't want to screw up my site.
PS: Thanks again for the schema help, I'll look into it.