Discuss and resolve questions on Liquid, JavaScript, themes, sales channels, and site speed enhancements.
I launched a site back on Thanksgiving, and so far only 4 pages appear when I google "site:amazonasfoodsonline.com". I made sure to add my sitemap.xml for both amazonasfoodsonline.com and www.amazonasfoodsonline.com about 2 weeks ago. I noticed a lot of items marked Disallow in the robots.txt file, and Search Console shows my sitemap excluded 354 out of 360 links.
I contacted Shopify support and they told me I should contact Google support about it but didn't provide any specifics on what the issue was.
Before attempting to connect with Google support, I wanted to find out if the community has any ideas.
site: http://amazonasfoodsonline.com
robots.txt: https://amazonasfoodsonline.com/robots.txt
Thanks
Crickets... Any advice?
Hello Shopify,
Anybody have any advice on this? I still can't get it to work. It excludes everything except for 6 pages. I submitted my sitemap to Bing, and it indexed everything in 5 minutes.
I actually have Google crawling a few old pages that aren't supposed to exist anymore, and it's still indexing them. How can I tell Google not to crawl them if I can't edit the robots.txt file? They aren't submitted in the sitemap...
Since June 2021, Shopify has allowed merchants to edit the robots.txt file.
You can start from here: Editing robots.txt.liquid
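To show what that customization looks like, here's a minimal sketch of a `robots.txt.liquid` template that reproduces Shopify's default rules but drops one Disallow directive. The specific rule value (`/collections/*+*`) is just an example — substitute whichever rule is blocking your pages:

```liquid
{% comment %}
  templates/robots.txt.liquid — example sketch, not the stock template.
  Iterates Shopify's default robots groups and re-emits every rule
  except one chosen Disallow directive (the value here is illustrative).
{% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {% for rule in group.rules -%}
    {% unless rule.directive == 'Disallow' and rule.value == '/collections/*+*' %}
      {{- rule }}
    {% endunless %}
  {%- endfor %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Be conservative here: Shopify's defaults exist for a reason (cart, checkout, and duplicate-content URLs), so remove only rules you're certain should be crawlable.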
Keep in mind that robots.txt only controls crawling, not indexing: a page blocked in robots.txt can still appear in Google's index if other pages link to it. To be sure pages are not indexed by Google, add a noindex meta tag to them (and don't block them in robots.txt, or Google will never see the tag).
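For the noindex approach, one common pattern is a small conditional in your theme's `layout/theme.liquid`, inside the `<head>`. This is a sketch assuming you can identify the unwanted pages by path; `/pages/old-page` is a hypothetical handle, not one from the site above:

```liquid
{% comment %}
  In layout/theme.liquid, inside <head>.
  Emits a noindex meta tag for specific paths or templates you
  don't want in Google's index. The paths below are placeholders.
{% endcomment %}
{% if request.path == '/pages/old-page' or template contains 'search' %}
  <meta name="robots" content="noindex">
{% endif %}
```

Once the tag is live, the page must remain crawlable (not blocked in robots.txt) so Googlebot can fetch it, see the noindex, and drop it from the index.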