Too many internal links?

AlbertBenj
Excursionist

Hello,

 

I have 35,000 internal links in Google Search Console, but only 900 pages (per my sitemaps). The internal links come from global hyperlinks that appear on every page, such as the Home page, the Collection pages, About Us, etc. Is this bad for my website's SEO?

 

Also, I have the recommended products section within the product page, which, as you know, has a different URL than the real product page; for example, it has a suffix like ?pr_prod_strat=collection_fallback&pr_rec_pid=5632520028328&pr_ref_pid=5663403507880&pr_seq=uniform, which I think triggers the "duplicate page" issue. Would it be better to block the links from the recommended section in robots.txt in order to avoid duplicate content? This is also the reason why I have many pages with the "URL too long" and "too many parameters" warnings.

 

If this is the recommended solution, how would I write that in robots.txt? Or do you know a better solution? I heard that you cannot modify the URL structure of the recommended products section...

 

Thank you very much. 

MsSkeptical
Excursionist

@AlbertBenj wrote:

Hello,

 

I have 35,000 internal links in Google Search Console, but only 900 pages (per my sitemaps). The internal links come from global hyperlinks that appear on every page, such as the Home page, the Collection pages, About Us, etc. Is this bad for my website's SEO?


Hi there! To answer your question: yes, having too many internal links can make your site look spammy, thus lowering your SEO rankings. You should really take a look at why you have so many of them.

 


@AlbertBenj wrote:

 

Also, I have the recommended products section within the product page, which, as you know, has a different URL than the real product page; for example, it has a suffix like ?pr_prod_strat=collection_fallback&pr_rec_pid=5632520028328&pr_ref_pid=5663403507880&pr_seq=uniform, which I think triggers the "duplicate page" issue. Would it be better to block the links from the recommended section in robots.txt in order to avoid duplicate content? This is also the reason why I have many pages with the "URL too long" and "too many parameters" warnings.

 

If this is the recommended solution, how would I write that in robots.txt? Or do you know a better solution? I heard that you cannot modify the URL structure of the recommended products section...

 


For this one, if you think the duplicate pages are actually the cause of having too many internal links, you may try that. I have no experience with this, but maybe this article could help you out.

Let me know what happens next!

AlbertBenj
Excursionist

@MsSkeptical wrote:

You should really take a look at why you have so many of them.

I have a navigation bar in the header and links in the footer like "Terms and Conditions", which are visible everywhere in the store, so I have around 600 internal links pointing to Terms and Conditions. In the navigation bar I have a list of collections you can click to go directly to that collection -> 500-2,000 internal links pointing to each collection. It is literally like any online store with a footer and a header navigation bar.

 


Hi @AlbertBenj

There are two types of internal links: footer/header links and in-content links.

From your questions, I gather that the majority of your internal links are footer links. If this is the case, you don't need to worry: Google can distinguish footer/header links from content links, and content links matter much more, while footer links are often devalued.

Regarding the second question:

Also, I have the recommended products section within the product page, which, as you know, has a different URL than the real product page; for example, it has a suffix like ?pr_prod_strat=collection_fallback&pr_rec_pid=5632520028328&pr_ref_pid=5663403507880&pr_seq=uniform, which I think triggers the "duplicate page" issue. Would it be better to block the links from the recommended section in robots.txt in order to avoid duplicate content? This is also the reason why I have many pages with the "URL too long" and "too many parameters" warnings.

-> You need to check whether these URLs have canonical links pointing to the correct product page URL. If the canonical links are all correct, then you don't need to worry about duplicate content. If not, it's a good idea to block these links in your robots.txt file. You can use SEO META 1 CLICK, a Google Chrome extension, to help you review the canonical URL of any page.
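If you'd rather check canonicals without a browser extension, you can read the <link rel="canonical"> tag straight out of a page's source. Here is a minimal sketch using only Python's standard library; the sample markup and the example.com URL are made up for illustration:

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            d = dict(attrs)
            if d.get("rel") == "canonical":
                self.canonical = d.get("href")

# Even a parameterized recommended-products URL should declare the
# clean product URL as canonical (sample markup for illustration):
html = """
<html><head>
<link rel="canonical" href="https://example.com/products/sample-product">
</head><body></body></html>
"""

parser = CanonicalParser()
parser.feed(html)
print(parser.canonical)  # -> https://example.com/products/sample-product
```

If the printed canonical matches the clean product URL, Google should fold the parameterized duplicates into it on its own.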

Let me know if my answer is of any help, with a thumbs up or a Like. Cheers!

PageFly - Advanced Shopify Page Builder - Empowering 100,000+ active merchants.

Parkerr
New Member

Can you help me with my website?

Trevor
Community Moderator

Hello!

Just a quick update about the robots.txt file that was referenced in this post. As of today, June 21st, 2021, we have launched the ability to edit the robots.txt file to give merchants more control over the information that is crawled by search engines. You can learn more about how to edit your robots.txt file through our community post here.
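For anyone finding this thread later: robots.txt is edited on Shopify through a robots.txt.liquid template. A rule to keep crawlers away from the parameterized recommended-products URLs discussed above might look like the sketch below. The loop structure follows Shopify's documented default template, but the Disallow pattern is only an example for this thread's pr_prod_strat parameter; test it against your own URLs before relying on it:

```liquid
{%- comment -%} templates/robots.txt.liquid {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Add a custom rule to the catch-all group {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?pr_prod_strat=*' }}
  {%- endif -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Note that if the canonical tags are already correct (as discussed earlier in the thread), blocking these URLs may be unnecessary, since Google consolidates them via the canonical.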

If you have any questions about the new feature, please do not hesitate to create a new post under our "Technical Q&A" board.


Trevor | Community Moderator @ Shopify 