I'm facing a duplicate-content problem on my site: Google is crawling many paginated pages within a single collection. I've shared a screenshot below to illustrate the problem.
I hope this makes the issue clear!
If I disallow these URLs in robots.txt, will that solve my problem?
For example, I would disallow these pages in the robots.txt file like this: Disallow: ?page=
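To check whether my pattern would actually match those URLs, I wrote a small Python sketch of how I understand Google-style wildcard matching in Disallow rules (this is my own rough matcher for illustration, not Google's actual parser; `is_blocked` is just my helper name). If my understanding is right, a bare `?page=` may not behave as intended because Disallow values are matched against the URL path from its beginning, so a wildcard form like `/*?page=` may be needed:

```python
import re

def rule_to_regex(rule: str) -> re.Pattern:
    # Escape regex metacharacters, then translate the robots.txt
    # wildcards: '*' matches any character sequence, and a trailing
    # '$' anchors the rule to the end of the URL.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.compile(pattern)

def is_blocked(rule: str, path: str) -> bool:
    # A Disallow rule blocks a URL if the pattern matches the
    # path starting from its first character.
    return rule_to_regex(rule).match(path) is not None

# A rule without a wildcard only matches as a literal path prefix:
print(is_blocked("/?page=", "/collection?page=2"))   # False
# With '*', the rule matches '?page=' anywhere in the path:
print(is_blocked("/*?page=", "/collection?page=2"))  # True
```

Is this how Googlebot would interpret the rule, or am I wrong about the wildcard?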
If this is the right way to solve my problem, where exactly do I need to add the Disallow rule in the robots.txt file? I see multiple user agents in robots.txt, so I'm confused about which group the rule belongs under.
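From what I've read (and I'm not sure this is correct), rules only apply to the user-agent group they sit under, so a rule meant for all crawlers would go in the `User-agent: *` group. A hypothetical example of what I mean, not my actual file:

```
User-agent: *
Disallow: /*?page=

User-agent: Googlebot
# Rules placed here would apply only to Googlebot,
# and Googlebot would then ignore the * group above.
```

Is that the right placement, or does the rule need to be repeated under every user agent?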
I've shared a screenshot below showing how I disallowed it in the robots.txt file.
I added the rule at the end of the robots.txt file.
Please help me find the right way to solve this problem.
