Discuss and resolve questions on Liquid, JavaScript, themes, sales channels, and site speed enhancements.
I launched a site back on Thanksgiving and so far only 4 pages appear when I google "site:amazonasfoodsonline.com". I made sure to add my sitemap.xml for both amazonasfoodsonline.com and www.amazonasfoodsonline.com about 2 weeks ago. I noticed a lot of items marked Disallow in the robots.txt file, and Google Search Console reports 354 of my 360 submitted sitemap links as excluded.
I contacted Shopify support and they told me I should contact Google support about it but didn't provide any specifics on what the issue was.
Before attempting to connect with Google support, I wanted to find out if the community has any ideas.
site: http://amazonasfoodsonline.com
robots.txt: https://amazonasfoodsonline.com/robots.txt
Thanks
Crickets... Any advice?
Hello Shopify,
Anybody have any advice on this? I still can't get it to work. It excludes everything except for 6 pages. I submitted my sitemap to Bing, and it indexed everything in 5 minutes.
I actually have Google crawling a few old pages that aren't supposed to exist anymore, and it is still indexing them. How can I tell Google not to crawl them if I can't edit the robots.txt file? They are already not included in the sitemap...
Since June 2021, Shopify has allowed merchants to edit the robots.txt file.
You can start from here: Editing robots.txt.liquid
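As a rough sketch of what that template looks like, here is the documented pattern for robots.txt.liquid, with one extra Disallow rule added for the default user agent group. The path `/old-page/` is just a placeholder for whatever URL you want to block:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {%- comment %} Hypothetical example: block an obsolete page for all crawlers {% endcomment %}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /old-page/' }}
  {%- endif %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

The loop renders Shopify's default rules unchanged, so you only add to them rather than replacing the whole file.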
Keep in mind that robots.txt only controls crawling, not indexing: Google can still index a blocked URL if other sites link to it. To make sure a page is not indexed by Google, add a noindex meta tag to the page itself (and don't block that page in robots.txt, or Googlebot will never see the tag).
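As an example, a noindex tag can be added conditionally in the theme's layout file (theme.liquid), inside the `<head>` section. The template name `page.old` here is hypothetical; substitute whichever template or handle you want to exclude:

```liquid
{%- comment %} Hypothetical example: noindex a specific template {% endcomment %}
{%- if template contains 'page.old' %}
  <meta name="robots" content="noindex">
{%- endif %}
```

Once Google recrawls the page and sees the tag, it should drop out of the index over the following weeks.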