I launched a site back on Thanksgiving and so far only 4 pages appear when I google "site:amazonasfoodsonline.com". I made sure to add my sitemap.xml for both amazonasfoodsonline.com and www.amazonasfoodsonline.com about 2 weeks ago. I noticed a lot of items marked Disallow in the robots.txt file, and Search Console shows my sitemap with 354 out of 360 links excluded.
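For what it's worth, many of the Disallow lines in Shopify's default robots.txt cover pages you wouldn't want indexed anyway (cart, checkout, admin). Here's a quick sketch, using Python's standard urllib.robotparser and a made-up example.com domain with a cut-down robots.txt, to check whether a given path is blocked for Googlebot — you could paste in your own store's robots.txt and sitemap paths instead:

```python
# Sketch: check which paths a robots.txt blocks for Googlebot.
# The robots.txt content and URLs below are hypothetical examples;
# substitute your own store's robots.txt and sitemap paths.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for path in ["/", "/products/example-item", "/cart", "/checkout"]:
    allowed = parser.can_fetch("Googlebot", "https://example.com" + path)
    print(path, "allowed" if allowed else "disallowed")
```

If the paths that Search Console reports as excluded are not in the Disallow list, the problem is more likely just slow indexing of a new domain rather than robots.txt blocking.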
I contacted Shopify support and they told me I should contact Google support about it but didn’t provide any specifics on what the issue was.
Before attempting to connect with Google support, I wanted to find out if the community has any ideas.
Anybody have any advice on this? I still can't get it to work. Search Console excludes everything except for 6 pages. I submitted my sitemap to Bing, and it indexed everything in 5 minutes.
I actually have Google crawling a few old pages that aren't supposed to exist anymore, and it's still indexing them. How can I tell Google not to crawl them if I can't edit the robots.txt file? They're already not included in the sitemap…
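Since Shopify historically hasn't let you edit robots.txt directly (newer stores can add a robots.txt.liquid template), the workaround I've seen suggested is a noindex meta tag in your theme's theme.liquid. A sketch, where 'old-page-handle' is a hypothetical placeholder for the handle of the page you want removed from the index:

```liquid
{%- comment -%}
  Place inside <head> in theme.liquid. Replace 'old-page-handle'
  with the actual handle of the page to deindex (hypothetical example).
{%- endcomment -%}
{% if handle contains 'old-page-handle' %}
  <meta name="robots" content="noindex, nofollow">
{% endif %}
```

Note that noindex only works if Google can still crawl the page, which seems to be the case here since those old pages are being crawled. You can also request removal of specific URLs through the Removals tool in Search Console.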