Are all duplicate content issues bad? (blog article tags)
-
If so, how bad?
We use tags on our blog and this causes duplicate content issues.
We don't use WordPress, but with such a widely used CMS having the same issue, it seems quite plausible that Google would be smart enough to deal with duplicate content caused by blog article tags and not penalise sites at all.
It has been discussed here before, and I'm ready to remove tags from our blog articles, or to monitor them closely and see how they affect our rankings.
Before I do, can you give me some advice around this?
Thanks,
Daniel. -
Thanks David. At this stage I have set the site to noindex, follow on tag pages. Thanks for the blog usability feedback. Related posts are next on the to-do list for the blog.
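For reference, a minimal sketch of how this is typically done: a robots meta tag in the head of each tag page (the URL path in the comment is just an illustration).

    <!-- On each tag page, e.g. /blog/tag/seo/ -->
    <meta name="robots" content="noindex, follow">

This keeps the tag page itself out of the index while still letting crawlers follow its links through to the posts.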
-
I don't think standard tags get used much by visitors. Related posts, especially if accompanied by thumbnail images, perform much better in my experience.
-
Ok thanks guys, at this stage I think this is the way I'll go, or, as David says, just use them to organise content and not display them.
Has anyone else found anything out there (articles, videos, anything on Moz) that says Google is smart enough to deal with this, making it a non-issue?
Also, any thoughts on how important blog tags are for usability these days?
-
Agree 100% with David and Fredrico. Noindex, follow your tag pages.
-
Had the same issue myself: duplicate content constantly reported on tag pages, since a tag can sometimes surface the same content on a page (particularly once paginated).
We decided to noindex the tag pages, not only because of the duplicate content issue, but also because they don't provide anything extra to search engines. They are intended for users, so why have search engines index them? We added a noindex but NOT a nofollow, as we WANT the pages they link to (the posts) to be indexed.
Sure, we lost about 7K indexed pages, but the ones that remain are now the ones that actually deserve to be there.
I'm with David on this one.
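If editing page templates is awkward, Google also supports sending the same directives as an HTTP response header (X-Robots-Tag). A sketch of the relevant part of a tag page's response:

    HTTP/1.1 200 OK
    X-Robots-Tag: noindex, follow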
-
If you're using tags internally to help organise content, you could just stop them from appearing on the front end of the site.
The alternative is to keep them on the front end, but to noindex the tag pages.
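As a sketch of the first option (hypothetical markup, since the question isn't about any particular CMS), the idea is simply that the tag list never renders in the public templates:

    <!-- Post template: tag links removed from the public output. -->
    <!-- <ul class="post-tags"><li><a href="/blog/tag/seo/">seo</a></li></ul> -->

The tags still exist in the CMS for organising content; they just never produce crawlable links on the front end.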
Related Questions
-
Should I delete all tags and just use my categories to organize content?
My website NorthernCaliforniaHikingTrails.com/blog has 400 or so tags, and it also has an extensive set of categories. I'm thinking about deleting all the tags, but keeping the categories and consolidating them a bit. Is there a significant SEO advantage to having tags in my case? I've seen a few very high-ranking websites actually rank for a tag, but I doubt my site will reach that level. Any help appreciated!
Intermediate & Advanced SEO | John8899
-
Noindexing Duplicate (non-unique) Content
When "noindex" is added to a page, does this ensure Google does not count page as part of their analysis of unique vs duplicate content ratio on a website? Example: I have a real estate business and I have noindex on MLS pages. However, is there a chance that even though Google does not index these pages, Google will still see those pages and think "ah, these are duplicate MLS pages, we are going to let those pages drag down value of entire site and lower ranking of even the unique pages". I like to just use "noindex, follow" on those MLS pages, but would it be safer to add pages to robots.txt as well and that should - in theory - increase likelihood Google will not see such MLS pages as duplicate content on my website? On another note: I had these MLS pages indexed and 3-4 weeks ago added "noindex, follow". However, still all indexed and no signs Google is noindexing yet.....
Intermediate & Advanced SEO | khi5
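One caveat worth illustrating for this question (the /mls/ path is hypothetical): a meta noindex can only work on pages crawlers are allowed to fetch, so blocking the same pages in robots.txt tends to defeat the noindex rather than reinforce it.

    <!-- On each MLS page: -->
    <meta name="robots" content="noindex, follow">

    # In robots.txt, a rule like the one below would stop crawlers from
    # fetching the pages at all, so they would never see the noindex:
    User-agent: *
    Disallow: /mls/

-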
Same content pages in different versions of Google - is it duplicate?
Here's my issue: I have the same page twice, with the same content but on a different URL for each country. For example: www.example.com/gb/page/ and www.example.com/us/page. So, one for the USA and one for Great Britain. Or it could be a subdomain, gb. or us., etc. Now, is it duplicate content if the US version of Google indexes one page and the UK version indexes the other (same content, different URLs)? The UK search engine will only show the UK page, and the US one the US page: different URLs but the same content. Is this bad for the Panda update, or does it get away with it? People suggest it is OK, and good for localised search on an international website, but I'm not so sure. Really appreciate advice.
Intermediate & Advanced SEO | pauledwards
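For regional variants like these, Google's documented mechanism is hreflang annotations, which mark the pages as alternates of each other rather than accidental duplicates. A minimal sketch using the URLs from the question:

    <link rel="alternate" hreflang="en-gb" href="http://www.example.com/gb/page/" />
    <link rel="alternate" hreflang="en-us" href="http://www.example.com/us/page" />

Both pages would carry both annotations, so each variant points at itself and at its sibling.

-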
Canonical vs noindex for blog tags
Our blog has started to use tags, and I know this can be bad for Panda, but our product team wants to use them for user experience. Should we canonicalize these tag pages to the original blog URL, or noindex them?
Intermediate & Advanced SEO | nicole.healthline
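For reference, the two options in the question look like this on a tag page (URLs hypothetical):

    <!-- Option 1: canonical pointing at the URL you want indexed instead -->
    <link rel="canonical" href="http://www.example.com/blog/" />

    <!-- Option 2: keep the tag page out of the index entirely -->
    <meta name="robots" content="noindex, follow">

On the design choice: a canonical is a hint that the tag page duplicates the target, which is rarely strictly true of an archive listing many posts, so noindex, follow is the more common treatment for tag pages (and the one recommended elsewhere in this thread).

-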
Canonical Not Fixing Duplicate Content
I added a canonical tag to the home page last month, but duplicate content is still being reported for the home page. Here is the tag I added: What am I missing? Duplicate-Content.jpg
Intermediate & Advanced SEO | InnoInsulation
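The tag itself was posted as an image attachment, so it can't be checked here, but for comparison, a correctly formed home page canonical is a single absolute URL in the head, matching the protocol and www variant that actually serve the page:

    <link rel="canonical" href="http://www.example.com/" />

Common reasons a canonical fails to take effect include pointing it at a different URL variant than the preferred one, or placing it outside the <head>.

-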
I have a duplicate content problem
The website guy who made the website for my business, Premier Martial Arts Austin, disappeared and didn't set things up so that every URL begins with www., so I now have a duplicate content problem and don't want to be penalized for it. I tried to set the preferred version in Webmaster Tools, but I can't get it to confirm that I'm the website owner. Any idea what to do?
Intermediate & Advanced SEO | OhYeahSteve
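On the verification problem specifically: one of Google's documented methods is a meta tag placed in the home page's head; the content value below is a placeholder, as the real token is generated per account by Webmaster Tools.

    <meta name="google-site-verification" content="YOUR-TOKEN-HERE" />

Once verified, the preferred (www or non-www) domain can be set, and a site-wide 301 redirect from the non-preferred host to the preferred one resolves the duplication itself.

-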
Is publishing a large quantity of content at once a bad idea?
If you plan on doubling the size of your site with original, unique content, is it better to publish it all at once or over a period of time? Is there any penalty for publishing it all at once?
Intermediate & Advanced SEO | nicole.healthline
-
Subdomains - duplicate content - robots.txt
Our corporate site provides MLS data to users, with the end goal of generating leads. Each registered lead is assigned to an agent, essentially in a round robin fashion. However we also give each agent a domain of their choosing that points to our corporate website. The domain can be whatever they want, but upon loading it is immediately directed to a subdomain. For example, www.agentsmith.com would be redirected to agentsmith.corporatedomain.com. Finally, any leads generated from agentsmith.easystreetrealty-indy.com are always assigned to Agent Smith instead of the agent pool (by parsing the current host name). In order to avoid being penalized for duplicate content, any page that is viewed on one of the agent subdomains always has a canonical link pointing to the corporate host name (www.corporatedomain.com). The only content difference between our corporate site and an agent subdomain is the phone number and contact email address where applicable. Two questions: Can/should we use robots.txt or robot meta tags to tell crawlers to ignore these subdomains, but obviously not the corporate domain? If question 1 is yes, would it be better for SEO to do that, or leave it how it is?
Intermediate & Advanced SEO | EasyStreet
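On question 1: robots.txt is resolved per hostname, so each agent subdomain could serve its own file while the corporate domain keeps a normal one. A sketch using the hostname from the question:

    # Served at agentsmith.corporatedomain.com/robots.txt
    User-agent: *
    Disallow: /

The trade-off to weigh: a crawler blocked from a subdomain can no longer fetch its pages and see the canonical links pointing at the corporate domain, which is why many sites leave such subdomains crawlable and rely on the canonicals alone.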