Robots.txt | How To Disallow Any Page In Robots.txt File

I am facing a duplicate-content problem on my site: many paginated pages in one collection are being crawled by Google. I share a screenshot below to illustrate the problem.

Hope you understand the problem!

If I disallow this URL in robots.txt, will my problem be solved?
For example, I would disallow these pages like this in the robots.txt file: Disallow: ?page=

If this is the right way to solve my problem, where in the robots.txt file do I need to add the Disallow rule? I see multiple user agents in robots.txt, so I'm confused.
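To illustrate what I mean, here is a sketch of how such a rule is usually placed (this assumes the wildcard form /*?page=, since robots.txt rules are matched against URL paths beginning with /, so a bare ?page= would not match anything):

```
# Sketch: block crawling of paginated URLs for all crawlers.
# Rules apply per User-agent group, so the Disallow line goes
# inside the group for the crawler(s) it should affect.
User-agent: *
Disallow: /*?page=
```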

I share a screenshot below of how I disallow it in the robots.txt file.

I put this at the end of the robots.txt file.

Please help me and show me the right way to solve this problem.

Hi @Bhanupratap
To fix this issue, you can canonicalise the pagination links.
In theme.liquid, find

<link rel="canonical" href="{{ canonical_url }}" />

and replace it with

{% if template contains 'collection' %}
<link rel="canonical" href="{{ shop.url }}{{ collection.url }}" />
{% else %}
<link rel="canonical" href="{{ canonical_url }}" />
{% endif %}
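With canonicalised pagination links, every paginated URL in a collection tells Google which page is the original, so the paginated copies are no longer treated as duplicates. For example (a hypothetical store and collection name; the exact href depends on your theme and domain), page 2 of a collection would render something like:

```
<!-- On https://example.com/collections/shoes?page=2 -->
<link rel="canonical" href="https://example.com/collections/shoes" />
```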

I hope my solution is helpful to you.
Thanks!

Hi pawankumar,

Will this allow everything that is currently disallowed?