Tags, Categories, & Duplicate Content
-
Looking for some advice on a duplicate content issue that we're having that definitely isn't unique to us.
See, we allow all our tag and category pages, as well as our blog pagination, to be indexed and followed, but Moz flags all of it as duplicate content, which makes sense, since it's the same content that appears on our blog posts.
We've decided in the past to keep these pages the way they are as it hasn't seemed to hurt us specifically and we hoped it would help our overall ranking. We haven't seen positive or negative signals either way, just the warnings from Moz.
We are wondering if we should noindex these pages and if that could cause a positive change, but we're worried it might cause a big negative change as well.
Have you confronted this issue? What did you decide and what were the results?
Thanks in advance!
-
Erica, Thank you for sticking with this and continuing to share your thoughts. It's very helpful and much appreciated!
-
Thanks Erica.
We're deindexing the tag pages for now and will see what happens. If all goes well, we might deindex the category pages as well.
Thanks!
-
EGOL is definitely correct that those pages can hold a ton of value if a brand/company has the time/resources/bandwidth to optimize them. Most don't, so it's better to noindex than have duplicate and/or thin content category pages. But if you can and will optimize, do it!
-
That makes sense. But I really want to make sure I (and others) understand, given EGOL's earlier comments (June 2011):
"If I kept my category pages out of the search indexes I would be walking away from hundreds of search engine visitors per minute.
Do analytics to see how much traffic is coming into these pages from search, who is linking to them, how much revenue they earn and also consider their future traffic potential.
It's not good to follow generalized advice blindly." and (February 2012) ...
"I have two wordpress blogs and category pages are where most of my search engine traffic enters. Some bring in thousands per month. Most of my post pages bring in very little traffic.
If you are not having any problem with duplicate content at present maybe it would be a good idea to allow indexing of the main page, the post pages and the category pages. Then if you do have a duplicate content problem you can remove from the index the pages that bring in the least amount of traffic."
So is the key, then, ensuring the category pages contain unique content in addition to whatever else is on them? I would have thought that the mere fact that you're creating a unique combination of content by grouping excerpts from identically tagged posts might have been enough. That content would also get updated each time a new post is published.
I'd appreciate your thoughts on this Erica.
-
You can either choose to deindex pages one by one or deindex the whole subfolder.
Since category pages usually have the same content as, or a preview of, the content on your other pages, this doesn't affect your long-tail traffic; that traffic will go to the other pages. Usually the problem with category pages is that the content is thin or duplicate. That said, you can create content just for category pages and keep them indexed to drive traffic. I worked in e-commerce pre-Moz, and we wanted to rank and land people on category pages, such as women's shirts, so we made unique, solid content for those pages.
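As a rough sketch of the "deindex the whole subfolder" idea: if your templates can emit a robots meta tag per URL, a path-prefix rule covers every page in a section at once. The prefixes below are purely illustrative, not from the thread:

```python
from urllib.parse import urlparse

# Hypothetical rule set: which path prefixes to keep out of the index.
NOINDEX_PREFIXES = ("/tag/", "/category/")

def robots_meta(url: str) -> str:
    """Return the robots meta value for a URL: noindex archive-style
    sections wholesale, but keep "follow" so link equity still flows."""
    path = urlparse(url).path
    if path.startswith(NOINDEX_PREFIXES):
        return "noindex,follow"
    return "index,follow"

print(robots_meta("https://example.com/tag/widgets/"))   # noindex,follow
print(robots_meta("https://example.com/blog/my-post/"))  # index,follow
```

Keeping "follow" in the directive is the usual choice here: the archive page stays out of the index, but crawlers still discover the posts it links to.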
-
This is certainly what we've heard and it's good to hear of a real case where you went from indexing to noindexing. My bet is that we would have the same result, my hope is that we would have an increase over time, and my fear is that we'll have a decrease.
-
We do update our blog twice a week and keep a pretty good spread across our categories and tags.
The hope is that we'll have the boost you mentioned, in long-tail and whatnot, but the fear is that it could hurt us (like Erica mentioned above).
Like Erica, we haven't seen any negative signals, but we wonder if we're being affected without even knowing it, and whether setting them to noindex could get us a boost. It's a dream; we just don't want the opposite to happen.
-
Erica, shouldn't the decision to noindex category pages be done on a case-by-case basis? If the blog has few posts, or if posts aren't updated frequently, then the chance of category pages being viewed as thin increases and it would make sense to noindex them.
If, on the other hand:
- category pages have different content from that of the main blog page;
- the main blog and category pages use excerpts;
- tag, archive and author pages are noindexed;
- and the blog is updated frequently;
doesn't it then make a case for indexing category pages? They can be a rich source of long-tail keywords and therefore a good draw for new visitors to the site, as explained in this earlier Q&A post.
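The case-by-case checklist above can be sketched as a toy decision function. Every threshold here is illustrative, just encoding the reasoning in this comment, not any Google guidance:

```python
def should_index_category_pages(
    post_count: int,
    posts_per_week: float,
    unique_category_content: bool,
    uses_excerpts: bool,
    other_archives_noindexed: bool,
) -> bool:
    """Toy rule of thumb for the checklist above (thresholds invented)."""
    # A small or rarely updated blog makes archive pages look thin.
    if post_count < 20 or posts_per_week < 1:
        return False
    # Index category pages only when every de-duplication condition holds.
    return unique_category_content and uses_excerpts and other_archives_noindexed

print(should_index_category_pages(100, 2, True, True, True))  # True
print(should_index_category_pages(10, 2, True, True, True))   # False
```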
-
I highly recommend that you noindex category, tag, archive, and author pages in WordPress. (I assume you're using WP, though there are many similar blogging platforms out there.) The reason is that these pages come across as thin and/or duplicate content, and you risk getting hit by Panda. Now, that doesn't always happen. My own personal blog had these pages indexed for a very long time, and I didn't have any problems. But I also didn't see any problems when I did deindex them. Then again, I don't get a ton of traffic, and I'm sure traffic, site popularity, and how competitive the niche is all factor into Google's radar.
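One way to sanity-check that archive pages actually carry the directive after the change is to scan their HTML for a robots noindex meta tag. A minimal stdlib sketch, not WordPress-specific, run against whatever page source you fetch:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of every <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.robots.append(d.get("content", ""))

def is_noindexed(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in c.lower() for c in parser.robots)

page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(is_noindexed(page))  # True
```

Looping this over your tag, category, archive, and author URLs gives a quick audit that the templates are doing what you think they are.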
-
Hi Bradjn, for duplicate content I would go with canonicals. Give the original page and its duplicates the same canonical URL, so search engines will know which is the original and (most of the time) won't treat the rest as duplicates.
Here's some more about duplicate content and canonicals: http://moz.com/learn/seo/duplicate-content
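To illustrate the idea: a canonical is just one agreed-upon URL per piece of content, and duplicates all point at it. A template helper might normalize URLs like this; the specific rules (drop query strings, trim a trailing index.php) are only examples, not a universal recipe:

```python
from urllib.parse import urlparse, urlunparse

def canonical_link(url: str) -> str:
    """Build a rel=canonical tag, dropping the query string and a
    trailing 'index.php' so duplicate variants share one URL."""
    parts = urlparse(url)
    path = parts.path
    if path.endswith("index.php"):
        path = path[: -len("index.php")]
    canonical = urlunparse((parts.scheme, parts.netloc, path, "", "", ""))
    return f'<link rel="canonical" href="{canonical}">'

print(canonical_link("https://example.com/section/index.php?page=0"))
# <link rel="canonical" href="https://example.com/section/">
```

The tag goes in the `<head>` of every variant, including the canonical page itself (a self-referencing canonical is fine).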
I can't say for certain whether noindex will bring a positive change. Did you check Webmaster Tools for duplicates, or only Moz?
Grtz, Leonie