Hi fellow SEO mozzers,
I am getting 'duplicate content' errors when our site is crawled, mainly down to our WordPress blog and how we have handled tags. Currently, they are being crawled and as such are regarded as duplicate pages.
I have read several different articles on how to handle tags. Some suggest noindexing the tag URLs.
Others suggest optimizing them and allowing them to be indexed, since Google has confirmed it won't penalize a WordPress site for having archive pages that publish and point to the same content; it will select the best link to represent the cluster of links.
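For anyone following along, the two approaches boil down to different tags in the `<head>` of each tag page (typically set via an SEO plugin rather than hand-edited). A minimal sketch — the URLs are placeholders:

```html
<!-- Option 1: keep tag pages out of the index entirely -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: let them be indexed, but point Google at a preferred
     version via a canonical link (hypothetical URL for illustration) -->
<link rel="canonical" href="https://example.com/tag/widgets/">
```

With `noindex, follow` the tag pages drop out of search results but their internal links can still be crawled; the canonical approach instead consolidates signals onto one URL.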
Over the past few months, nearly 4% of our WordPress traffic has been referred by tag pages listed in search engines.
Initially I was going to noindex the tag pages, but given the above info I wonder whether I should leave them as they are.
Or is the real issue that having duplicate content will lead to inefficient crawling and wasted crawl budget?
Any views/opinions on how best to handle this?