Hey. I've just been doing some major SEO work and requested a site crawl. Google now tells me there are coverage issues that may affect SEO. It says "Indexed, though blocked by robots.txt" and then lists my shopping cart, checkout, and legal pages. It also says robots.txt is not a good way to keep pages out of Google's index. Is this a problem I should fix or ignore? If fix, how?
I've searched and seen similar posts about this issue, but no real answers.
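For reference, the relevant part of my robots.txt looks roughly like this (paths simplified for the example, not my exact file):

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /legal/
```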