Why isn't Google indexing all pages of my website?

Fern88
Excursionist
11 0 1

I launched a site back on Thanksgiving, and so far only 4 pages appear when I google "site:amazonasfoodsonline.com". I made sure to add my sitemap.xml for both amazonasfoodsonline.com and www.amazonasfoodsonline.com about 2 weeks ago. I noticed a lot of items marked Disallow in the robots.txt file, and in Search Console my sitemap shows 354 out of 360 links excluded.

I contacted Shopify support and they told me I should contact Google support about it but didn't provide any specifics on what the issue was. 

Before attempting to connect with Google support, I wanted to find out if the community has any ideas.  

site: http://amazonasfoodsonline.com

robot: https://amazonasfoodsonline.com/robots.txt


Thanks

(Attachment: afocrawl.png)

Replies 5 (5)

Fern88
Excursionist
11 0 1

Crickets... Any advice?

Fern88
Excursionist
11 0 1
Anybody have any advice on this issue? Thanks!
Fern88
Excursionist
11 0 1

Hello Shopify, 


Anybody have any advice on this? I still can't get it to work. Everything is excluded except for 6 pages. I submitted my sitemap to Bing, and it indexed everything in 5 minutes.

gthsee33
Visitor
1 0 0

I actually have Google crawling a few old pages that aren't supposed to exist anymore, and it is still indexing them. How can I tell Google not to crawl them if I can't edit the robots.txt file? They are already not included in the sitemap...

drakedev
Shopify Partner
685 148 230

@gthsee33 

Since June 2021, Shopify has allowed merchants to edit the robots.txt file.

You can start from here: Editing robots.txt.liquid
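As a rough sketch of the kind of edit that template allows (based on the Liquid objects Shopify documents for robots.txt.liquid; the `/collections/` value here is only an example of a default rule you might want to drop):

```liquid
{%- comment -%}
  robots.txt.liquid — output Shopify's default rules, but skip the
  default "Disallow: /collections/" rule as an example.
  Which rule to skip is an assumption; adjust to your own case.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/collections/' %}
      {{ rule }}
    {%- endunless -%}
  {%- endfor %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```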

Keep in mind that robots.txt only controls crawling, not indexing: Google can still index a blocked URL if other pages link to it. To be sure pages are not indexed by Google, add a noindex meta tag to them (and make sure robots.txt doesn't block those pages, otherwise Google never sees the tag).
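For example, one common way to do that in a Shopify theme is a conditional in the <head> of theme.liquid (a sketch only; the `page.old-page` template handle is a placeholder for whatever pages you want excluded):

```liquid
{%- comment -%}
  theme.liquid, inside <head> — emit a noindex tag for templates you
  want kept out of Google. "page.old-page" is a hypothetical handle.
{%- endcomment -%}
{% if template == 'page.old-page' or template contains 'search' %}
  <meta name="robots" content="noindex">
{% endif %}
```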

If my answer was helpful click Like to say thanks
If the problem is solved remember to click Accept Solution
Shopify/Shopify Plus custom development: You can hire me for simple and/or complex tasks.