Hello,
I have a site with around 300k static pages, but each of these has pagination on it.
I would like to stop RogerBot from crawling the paginated pages, and maybe Google as well.
The paginated pages show results that change daily, so there is no need to index them.
What's the best way to prevent them from being crawled?
The pages are generated dynamically, so I don't know all the URLs in advance.
I have seen people suggest adding rel="nofollow" to the pagination links. Would that do it, or is there a better way?
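For example, I was wondering whether a robots.txt wildcard rule could cover this without listing every URL. Assuming the pagination uses a query parameter such as ?page= (mine may be different), would something like this work:

```
# Hypothetical example: block paginated URLs matching a ?page= parameter.
# rogerbot is Moz's crawler; Googlebot supports the * wildcard.
User-agent: rogerbot
Disallow: /*?page=

User-agent: Googlebot
Disallow: /*?page=
```

Or would a meta robots noindex tag on the paginated pages be more reliable than blocking the crawl?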
Many thanks
Steve