I launched https://mywish.ge/ in the middle of July and the collections were indexed instantly.
I submitted sitemaps to Google Search Console about 10 days ago. The product pages are in the sitemap, but Google still isn't indexing them.
Here is the robots.txt: https://mywish.ge/robots.txt. I don't see any issues there.
The product descriptions are unique and handwritten!
What could be the problem?
Ava here from Shopify :)
Sorry to hear this process is taking a while! After you've submitted your sitemap to Google, it can take up to four weeks for your site to be fully indexed. At the moment, there is no way to speed this process up or force indexing manually; it can just take some time! If you've already been waiting 10 days, I'm sure you won't have to wait much longer!
Are you indexing a custom domain, or your myshopify.com domain? If you've already verified your domain, have you set up email forwarding yet?
I have a custom domain.
"If you've already verified your domain, have you set up email forwarding yet?"
Why do I need it? I use Zoho for emails and they work fine :)
"it can take up to 4 weeks for it to be indexed fully."
Looks like Google changed its approach to indexing. It didn't take that long before.
Thanks for your help. I have no choice but to wait :)
Ah, no worries! If you're using Zoho Mail for your emails and it's working for you, that's perfect!
I've been doing some extra digging on the topic (just to be sure) and was just about to update my original post here. There is an option, the Google Indexing API, that can get pages crawled faster. Google states that this method works best for sites with live job postings or livestream videos, as those pages need to be kept consistently up to date with new information. I've found an article where another person posted a guide on how they achieved this themselves; you can check it out here if you're interested! This could be something you implement yourself if you are comfortable working with APIs (see the sketch below), but if you're not, you could definitely reach out to a web developer or a Shopify Expert to help you out here.
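If you do want to explore that route, here's a rough sketch of what a call to the Indexing API could look like in Python. Please note this is just an illustration, not an official Shopify setup: the service-account key file and the product URL below are placeholders, and the service account's email address would need to be added as an owner of your site in Search Console before the call will succeed.

```python
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

# Placeholder path to a Google Cloud service-account key with the
# Indexing API enabled; the product URL below is just an example.
SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
session = AuthorizedSession(credentials)

# Notify Google that a URL has been added or updated.
response = session.post(
    ENDPOINT,
    json={
        "url": "https://mywish.ge/products/example-product",
        "type": "URL_UPDATED",
    },
)
print(response.status_code, response.json())
```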
If this is not something you are interested in though, you do also have the option of just waiting a little while longer.
Let me know what you think!
Hey again, @Vazma!
This would be normal behaviour for your robots.txt file. Search engines like Google constantly crawl the internet in search of new data, so your store's robots.txt file blocks page content that might otherwise reduce the effectiveness of your SEO strategy by diluting PageRank. For example, the shopping cart page is blocked because Google indexing the cart page doesn't help potential visitors discover the home page or products. The /cart page is where visitors begin checkout; logistically, you want visitors to land on product pages before the cart page, not the other way around. The robots.txt file also disallows the checkout, orders, and admin pages.
You would not be able to edit the robots.txt file. I can see from the link you've shared that product pages are not disallowed. The file only lists what is *disallowed*, so if a path isn't in it, crawlers are allowed to access it.
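If you'd like to verify this yourself, a quick sketch using Python's built-in robots.txt parser can confirm which paths Googlebot may fetch (the product path below is just an example):

```python
from urllib.robotparser import RobotFileParser

# Parse the live robots.txt and ask whether Googlebot may fetch a URL.
parser = RobotFileParser("https://mywish.ge/robots.txt")
parser.read()

# Example product path (hypothetical); expected to print True.
print(parser.can_fetch("Googlebot", "https://mywish.ge/products/example-product"))

# The cart page is disallowed, so this should print False.
print(parser.can_fetch("Googlebot", "https://mywish.ge/cart"))
```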
Could you show me a screenshot from your Google Search Console where you are seeing product pages being excluded? Or were there any lines of the robots.txt file you were particularly unsure about?
I understand that (the PageRank stuff), but what I don't understand is that opening the robots.txt file in a normal (non-incognito) Chrome window shows a different file, even after a hard refresh with CTRL+SHIFT+R. Opening it in an Incognito window looks normal.
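To take browser caching out of the picture completely, the file can also be fetched from a small script; a quick sketch (the no-cache headers are just a precaution):

```python
import urllib.request

# Request the live robots.txt directly, asking intermediaries not to
# serve a cached copy.
request = urllib.request.Request(
    "https://mywish.ge/robots.txt",
    headers={"Cache-Control": "no-cache", "Pragma": "no-cache"},
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))
```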
Also, this morning when I posted my reply, there were 26+ pages excluded, and now it's down to only one.
The rest remains the same: my product pages still aren't getting indexed.
I am resubmitting the sitemap every day, but there are no positive signs.