Help! Product Pages not getting indexed

Tourist

Hi,

 

I launched https://mywish.ge/ in the middle of July and the collections were indexed instantly.

 

I submitted my sitemap to Google Search Console about 10 days ago, but the product pages listed in it still aren't being indexed.

 

Here is the robots.txt file: https://mywish.ge/robots.txt. I don't see any issues in it.

 

The descriptions of the products are unique, handwritten!

 

What could be the problem?

 

Best,

 

Vaz

Shopify Staff

Hey, @Vazma!

 

Ava here from Shopify :)

 

Sorry to hear this process is taking a while! After you've submitted your sitemap to Google, it can take up to 4 weeks for it to be indexed fully. At the moment, there is no way of speeding up this process or forcing the indexing through manually. It can just take some time! If you have already been waiting 10 days, I'm sure you won't have to wait much longer!

 

Are you indexing a custom domain, or your myshopify.com domain? If you've already verified your domain, have you set up email forwarding yet? 

 

Speak soon!

 

Ava

 

Social Care

 

 

Ava | Social Care @ Shopify
 - Was my reply helpful? Click Like to let me know! 
 - Was your question answered? Mark it as an Accepted Solution
 - To learn more visit the Shopify Help Center or the Shopify Blog

Tourist

@Ava 

 

I have a custom domain.

 

"If you've already verified your domain, have you set up email forwarding yet?"

 

Why do I need it? I use Zoho for emails and they work fine :)

 

"it can take up to 4 weeks for it to be indexed fully."

 

Looks like Google changed its approach to indexing. It didn't take that long before.

 

Thanks for your help. I have no choice but to wait :)

Shopify Staff

Ah, no worries! If you're using Zoho Mail for your emails and it is working for you, that's perfect!

 

I've been doing some extra digging on the topic (just to be sure) and was just about to update my original post. There is an option, the Indexing API, that can get your site crawled faster. Google states that this method works best for sites with live job postings or livestream videos, as those pages need to be kept consistently up to date with new information. I've found an article where someone posted a guide on how they achieved this themselves; you can check it out here if you're interested! You could implement this yourself if you're comfortable with APIs, but if not, you could reach out to a web developer or a Shopify Expert to help you out.
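In case it helps anyone landing on this thread: the Indexing API boils down to POSTing one small JSON notification per URL to Google's publish endpoint. Here is a minimal sketch of building that request body. Note that actually sending it requires a service-account token with the `https://www.googleapis.com/auth/indexing` scope, which is not shown here, and the product URL below is a made-up example:

```python
import json

# Real Google Indexing API endpoint; authentication (a service-account
# OAuth token with the indexing scope) is assumed and not shown.
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url: str, update_type: str = "URL_UPDATED") -> dict:
    """Build the JSON body for one URL notification.

    update_type is either "URL_UPDATED" or "URL_DELETED".
    """
    if update_type not in ("URL_UPDATED", "URL_DELETED"):
        raise ValueError("unknown notification type: " + update_type)
    return {"url": url, "type": update_type}

# Example: notify Google that a (hypothetical) product page changed.
body = build_notification("https://mywish.ge/products/example-product")
print(json.dumps(body))
```

You would POST that body to the endpoint once per changed URL, with an `Authorization: Bearer <token>` header.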

 

If this is not something you are interested in though, you do also have the option of just waiting a little while longer.

 

Let me know what you think! 

 

Ava

 

Social Care


Tourist

@Ava 

 

Will try that. Thanks for the solution!

 

Have a great day.

Tourist

My Index Coverage dashboard just updated and I see that my product pages are being excluded.

 

Discovered - currently not indexed
Status: Excluded
Why is that and how do I fix it?
Tourist

There is a very strange issue with robots.txt.

 

When I open https://mywish.ge/robots.txt in Chrome, it shows:

 

# we use Shopify as our ecommerce platform

User-agent: *
Disallow: /

Which is bad.

 

And when I open it with Incognito Window it looks okay.

 

Can somebody explain this? 

Shopify Staff

Hey again, @Vazma!

 

This would be normal behaviour for your robots.txt file. Search engines like Google constantly crawl the internet in search of new data, so your store's robots.txt file blocks page content that might otherwise reduce the effectiveness of your SEO strategy by diluting PageRank. For example, the shopping cart page is blocked because Google indexing the cart page doesn't help potential visitors discover the home page or products. The /cart page is where visitors begin checkout; logically, you want visitors to land on product pages before the cart page, not the other way around. The robots.txt file also disallows the checkout, orders, and admin pages.

 

You are not able to edit the robots.txt file. I can see from the link you've shared that product pages are not disallowed. The file only lists what is *disallowed*, so if you cannot see something there, it is allowed.
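The allowed/disallowed logic above is easy to check yourself with Python's standard-library robots parser. This sketch uses a trimmed-down, illustrative set of rules like the ones a Shopify robots.txt typically contains (not the store's exact file), and shows that product paths stay crawlable while /cart is blocked:

```python
from urllib.robotparser import RobotFileParser

# Illustrative sample of Shopify-style rules, not the store's exact file.
robots_txt = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Product pages are not disallowed, so they remain crawlable...
print(rp.can_fetch("*", "https://mywish.ge/products/some-product"))  # True
# ...while the cart page is blocked.
print(rp.can_fetch("*", "https://mywish.ge/cart"))                   # False
```

Pasting your real robots.txt into `robots_txt` lets you confirm that a given URL is allowed, rather than eyeballing the rules.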

 

Could you show me a screenshot from your Google Search Console where you are seeing product pages being excluded? Or were there any lines of the robots.txt file you were particularly unsure about? 

 

Speak soon!

 

Ava

 

Social Care 


Tourist

@Ava 

 

I understand that (the PageRank stuff), but what I don't understand is why opening the robots.txt file in a normal (non-incognito) Chrome window shows:

 

1.PNG

Even after a hard refresh (Ctrl+Shift+R).

 

And after opening in Incognito Window it looks normal:

 

2.PNG

 

Also, this morning when I posted my reply there were 26+ pages excluded, and now it's down to only one:

 

Screenshot.png

 

The rest remains the same - my product pages aren't getting indexed.

 

Screenshot (1).png

I am resubmitting the sitemap each day, but there are no positive signs.

 

:/
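One way to sanity-check what you are resubmitting is to parse the sitemap and list the product URLs it actually contains, so you know Google is being handed the right pages. A minimal sketch with the standard library, using a tiny inline sample in the standard sitemap format (the product URLs are made-up examples; a real Shopify store serves its sitemap at /sitemap.xml, which links to per-type sub-sitemaps):

```python
import xml.etree.ElementTree as ET

# Tiny inline sample in the sitemaps.org format; replace with the
# fetched contents of your products sub-sitemap.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://mywish.ge/products/example-a</loc></url>
  <url><loc>https://mywish.ge/products/example-b</loc></url>
</urlset>"""

# The sitemap namespace must be given explicitly when querying with ElementTree.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)
```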

Tourist

@Ava 

 

Search console has been updated. Here is the latest screenshot.

 

Screenshot (3).png
