My view on tags is that they help humans find information on a specific topic quickly. Since they benefit human visitors more than search engines, I have always blocked tag pages from being crawled via robots.txt, and I block category and feed pages as well. This may be overkill, but it has worked well for avoiding duplicate content issues on the various blogs I help manage.
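The blocking described above can be sketched as a robots.txt fragment. This is a minimal example, not my exact file; the `/tag/`, `/category/`, and `/feed/` prefixes are assumed WordPress-style paths and should be adjusted to match your site's actual URL structure:

```
# Hypothetical robots.txt sketch for a WordPress-style blog.
# Path prefixes below are assumptions; match them to your own URLs.
User-agent: *
Disallow: /tag/
Disallow: /category/
Disallow: /feed/
```

Keep in mind that Disallow rules stop compliant crawlers from fetching those pages; pages blocked this way can still appear in results if other sites link to them.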
Posts made by dmoore
-
RE: Duplicate Page Content
-
RE: What's the website that analyzes all local business submissions?
Not sure if I am understanding the question correctly, but are you referring to getlisted.org?