Why isn't Google indexing all pages of my website?

I launched a site back on Thanksgiving, and so far only 4 pages appear when I google "site:amazonasfoodsonline.com". I made sure to add my sitemap.xml for both amazonasfoodsonline.com and www.amazonasfoodsonline.com about 2 weeks ago. I noticed a lot of items marked Disallow in the robots.txt file, and Search Console shows my sitemap with 354 of 360 links excluded.

I contacted Shopify support and they told me I should contact Google support about it but didn’t provide any specifics on what the issue was.

Before attempting to connect with Google support, I wanted to find out if the community has any ideas.

site: http://amazonasfoodsonline.com

robot: https://amazonasfoodsonline.com/robots.txt

Thanks

Crickets… Any advice?

Anybody have any advice on this issue? Thanks!

Hello Shopify,

Anybody have any advice on this? I still can’t get it to work. It excludes everything except for 6 pages. I submitted my sitemap to Bing, and it indexed everything in 5 minutes.

I actually have Google crawling a few old pages that aren't supposed to exist anymore, and it's still indexing them. How can I tell Google not to crawl them if I can't edit the robots.txt file? They are already not submitted in the sitemap…

@gthsee33

Since June 2021, Shopify has allowed merchants to change the robots.txt file.

You can start from here: Editing robots.txt.liquid
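As a sketch of what that edit can look like: Shopify's robots.txt.liquid template exposes the default rule groups, and you can append your own rules to them. The `Disallow: /old-pages/` path below is a hypothetical example; the `robots.default_groups`, `group.user_agent`, `group.rules`, and `group.sitemap` objects follow Shopify's documented template structure, but check the current docs before relying on them.

```liquid
{%- comment -%}
  templates/robots.txt.liquid — keep Shopify's defaults, then add a custom
  Disallow rule for the catch-all (*) user agent. "/old-pages/" is a
  placeholder path; replace it with the pages you want to block.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /old-pages/' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Keeping the loop over the default groups is important: it preserves Shopify's built-in rules (checkout, cart, etc.) and only layers your additions on top.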

Be aware that robots.txt controls crawling, not indexing: Google can still index a blocked URL if other sites link to it. To be sure pages are not indexed by Google, add a noindex meta tag to them (and don't block those pages in robots.txt, or Google will never see the tag).
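For reference, a minimal way to add that noindex tag in a Shopify theme is a conditional in the layout's `<head>`. This is only a sketch: `old-page-handle` is a hypothetical page handle, and you'd adjust the condition to match whichever templates or pages you want kept out of the index.

```liquid
{%- comment -%}
  In layout/theme.liquid, inside <head>. Emits a noindex meta tag for a
  specific page (hypothetical handle "old-page-handle") and for search
  result pages, which you usually don't want indexed.
{%- endcomment -%}
{% if page.handle == 'old-page-handle' or template contains 'search' %}
  <meta name="robots" content="noindex">
{% endif %}
```

Once the tag is live, you can confirm Google has seen it with the URL Inspection tool in Search Console.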