Calling all the SEO gurus - trailing slash

Rapture
Visitor
3 0 0

Hey guys, hoping to find an SEO/tech person who understands our issue. We've noticed that we have two properties claimed in our Search Console.

Both are verified (via the “meta” tag and/or domain DNS). I've read that Shopify doesn't handle this well natively. We've noticed that:

  • “raptureworks.fun” (the domain property) has only 1k pages indexed and over 5.4k pages not indexed (4.2k “Crawled - currently not indexed”), and
  • “https://raptureworks.fun/” (the URL-prefix property) has around 3.7k pages indexed and about 4.3k not indexed (1.7k “Crawled - currently not indexed” and 2.5k “Alternate page with proper canonical tag”).

Also, neither of them has a verified sitemap; both have been stuck in “Pending” status for about two months now. When we tried the live URL test within Search Console, Google can fetch them without any issues. Our question is: should we do anything about the indexation for each property? Is this a normal occurrence? Logically, it seems like Google is treating each property as an individual website, which I believe is not good from an SEO perspective.

Just to add: the Shopify website has been live for approximately three months.

Really appreciate your guidance! Find the images attached.

 

(Attachments: 3.jpg, 2.jpg, 4.jpg, 1.jpg)

Replies 3 (3)

ErnestoOrtiz
Tourist
3 1 3

There is no issue with claiming two different properties.

A domain property combines results for all of the following URL combinations: http://raptureworks.fun, https://raptureworks.fun, http://www.raptureworks.fun, https://www.raptureworks.fun, plus any other subdomains.

The URL-prefix property, on the other hand, shows results exclusively for the URL prefix you submitted, in this case https://raptureworks.fun/.


What is important to check is the actual list of indexed and not-indexed pages that each of those properties reports:

 

  • check that all the pages you want indexed show up as indexed
  • check that all the pages you don't want indexed show up as not indexed

Why URLs fall into each of those buckets varies case by case.

For URLs that are indexed but you don't want indexed, just place a noindex tag.
For URLs that are not indexed but you want indexed, you need to figure out why they aren't. Do they have a noindex tag? Are they blocked by robots.txt? Do they have internal links pointing at them, or are they orphan pages, or pages with so few internal links that it takes many clicks to reach them?
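
A quick way to sanity-check the first two of those signals is a small script like this. It's only a sketch: the sample HTML and robots.txt below are made-up stand-ins, not the store's real pages, and the regex check is a rough shortcut (a real audit should use an HTML parser).

```python
import re
from urllib.robotparser import RobotFileParser

def has_noindex(html: str) -> bool:
    """Rough check for a robots meta tag containing 'noindex'.
    Only matches name= before content=; a proper audit should parse the HTML."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

def blocked_by_robots(robots_txt: str, url: str) -> bool:
    """Check a URL against robots.txt rules as Googlebot would see them."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch("Googlebot", url)

# Stand-in sample inputs (not real pages from the thread):
page_html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
robots_txt = "User-agent: *\nDisallow: /cart\n"

print(has_noindex(page_html))                                      # True
print(blocked_by_robots(robots_txt, "https://example.com/cart"))   # True
print(blocked_by_robots(robots_txt, "https://example.com/about"))  # False
```

Run this against saved copies of the not-indexed pages to rule out the first two causes before digging into internal linking.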

 

Check the “Crawled - currently not indexed” pages of both properties carefully. I wouldn't worry that the numbers differ at this point. If pages show up there that you want indexed, then you have to fix it.

Check the “Alternate page with proper canonical tag” list carefully. Make sure pages that should have a self-referential canonical tag don't show up there.

You also want to make sure that, for both lists of URLs (alternate pages, and crawled but currently not indexed), Google isn't crawling irrelevant pages. If Google wastes resources crawling URLs that offer no value, it spends fewer resources crawling the important ones. If Google crawls thousands of low-value URLs and only a hundred valuable ones, that ratio of low-value to high-value URLs will hurt you. The solution is to noindex URLs without value if they are indexed, and then, once they are no longer indexed, block crawling through robots.txt.
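
As a sketch of that order of operations (the paths below are made-up examples of typical low-value Shopify URLs, not a recommendation for this specific store):

```text
# Step 1 – while the low-value pages are still indexed, serve this in
# their <head> so Google can see it and drop them from the index:
#
#     <meta name="robots" content="noindex">
#
# Step 2 – only AFTER Search Console reports them as not indexed,
# stop the crawl waste in robots.txt (example patterns):

User-agent: *
Disallow: /search
Disallow: /collections/*sort_by=
```

The order matters: if you block crawling first, Google can never see the noindex tag, and the URLs can linger in the index.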

Regarding the sitemap: “Pending” doesn't mean Google can't read it, just that it hasn't gotten to it yet. Try re-submitting it if you still have issues. If the URLs in the sitemap are already indexed, I wouldn't worry for now. The whole point of a sitemap is to help Google discover important URLs, and if Google already shows them as indexed, then they've been discovered. You do want to get it working eventually, but the issues above are more important to address, and fixing them might also fix this.
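
For reference, Shopify generates the sitemap automatically at /sitemap.xml (a sitemap index pointing to child sitemaps for products, collections, pages, and blogs), and what Google expects to fetch there is just the standard sitemaps.org format. A minimal entry looks like this (the product URL and date are made-up examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://raptureworks.fun/products/example-product</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
</urlset>
```

If the file at /sitemap.xml validates against that format and the live URL test can fetch it, there is nothing more to fix on your side while the status is pending.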

 

I hope this helps

 

Rapture
Visitor
3 0 0

Hi Ernesto, thank you for the explanation; what you said makes 100% sense. The situation has now changed, and both properties now show 3.6k pages equally indexed. I went through what you said, as the first thing we noticed was that both got indexed. My question to you is: if the domain property and the one with the trailing slash both show a couple of the same pages indexed, does that mean they are competing with each other, which is of course bad for SEO?

Google has detected no duplicate-content issues. We also ran a full audit of the site via SEMrush and no duplicate content was found, but do you think it's something we should resolve? Also, is there a way we can connect directly? We found your response very technical and to the point.

Once again big thanks!

ErnestoOrtiz
Tourist
3 1 3
It's not a problem if the same URL shows up in both GSC properties. In fact, the domain property should show all the URLs shown by the prefix property, plus others. You should expect to see the same URLs in both properties, and possibly more URLs in the domain property if they exist.

One URL showing up in both properties doesn't mean Google thinks you've got two different websites. Each property is just a reporting interface; if a URL qualifies to be reported in both, there is nothing wrong with that.

A duplicate-content issue arises when your website lets you load two URLs with exactly (or almost exactly) the same content, both indexable and with self-referential canonical tags, where neither redirects to the other.

For example, these two URL options can both be loaded and don't redirect to each other,
but the second option has its canonical link pointing to the first (signaling that the first option is the URL version you want indexed). So if they both appear in any Search Console property, I would try to find out how Google is discovering the second option: there must be a link to it somewhere for that version of the URL to be discovered, and I would update that link to point to the correct URL. Beyond that, as long as your internal linking points to the first one (wherever you link to it), you should be OK.
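
As an illustration of that canonical signal (the URLs here are hypothetical examples, not taken from the store):

```html
<!-- Loaded at the duplicate URL, e.g. https://example.com/collections/all?sort_by=price -->
<link rel="canonical" href="https://example.com/collections/all">
<!-- Tells Google: index /collections/all and treat this sorted variant as an alternate. -->
```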

These pages, on the other hand, are only different because of the products they contain; the title, content, and everything else are the same.
I would set them up differently.

The trailing-slash issue doesn't apply to the home page. These two are effectively the same:
if you load them, you'll see the second URL redirects to the first one. All good.
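
You can see why the two homepage forms are interchangeable with nothing but the standard library:

```python
from urllib.parse import urlsplit

# The two homepage variants discussed in this thread:
no_slash = urlsplit("https://raptureworks.fun")
with_slash = urlsplit("https://raptureworks.fun/")

print(no_slash.path)    # '' – the bare domain has an empty path
print(with_slash.path)  # '/'

# On the wire there is no difference: an HTTP client turns an empty
# path into "/", so both forms produce the request line "GET / HTTP/1.1"
# and the server sees one and the same resource.
print(no_slash.path or "/")  # '/'
```

This is why the trailing slash only matters on non-root paths (/page vs /page/), never on the domain itself.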
 
Regarding SEMrush audits:
These tech audits are basic. Don't assume a 90 or 100% score on them means your site is technically sound. For example, they won't flag the issue you've got with paginated pages. That's not a huge issue, but these basic audits do miss big ones. Don't rely on them for anything other than basic maintenance tasks (finding broken internal links, 404 pages, and the like).