How damaging is duplicate content in a forum?
-
Hey all; I hunted around for this in previous questions in the Q&A and didn't see anything. I'm just coming back to SEO after a few years out of the field and am preparing recommendations for our web dev team. We use custom-coded software for our forums, and it creates a giant swathe of duplicate content, as each post has its own URL. For example:
domain.com/forum/post_topic/post1
domain.com/forum/post_topic/post2
...and so on. However, since every page of the forum defaults to showing 20 posts, every forum thread that's 20 posts long ends up with 21 different URLs showing essentially identical content: the thread page itself plus one URL per post. Now, our forum is all user-generated content and is not generally a source of much inbound traffic--with occasional exceptions--but I was curious whether having a mess of duplicate content in our forums could damage our ability to rank well in a different directory of the site.
I've heard that Panda is really cracking down on duplicate content, and last time I was current on SEO trends, rel="canonical" was the hot new thing that everyone was talking about, so I've got a lot of catching up to do. Any guidance from the community would be much appreciated.
-
Yes, having duplicate content in your forum could hurt your ability to rank.
And yes, the canonical tag sounds like the best way to deal with your situation.
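To make that concrete, here is a minimal sketch of the markup that could go in the head of each per-post page, assuming the URL pattern described above (the thread URL and title here are hypothetical, and the exact templates will depend on your forum software):

```html
<!-- Served on domain.com/forum/post_topic/post1, post2, etc. (hypothetical URLs) -->
<head>
  <title>Post topic - Example Forum</title>
  <!-- Tells search engines that the main thread page is the preferred (canonical)
       version of this content, so the per-post URLs don't compete with it -->
  <link rel="canonical" href="https://domain.com/forum/post_topic/" />
</head>
```

The per-post URLs stay usable for visitors, but their ranking signals should consolidate onto the single thread URL, which makes it much less likely that the duplicates drag down the rest of the site.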
Related Questions
-
To what extent is content considered unique or duplicate?
I work primarily on classifieds websites, and an issue I consistently come across is two or more URLs which have the exact same ad count, due to site structure and the way everything is categorized. An example would be these two pages: https://www.boatshop24.co.uk/motorboats/princess https://www.boatshop24.co.uk/boats-for-sale/princess/power These two have the exact same number of ads - would search engines mark these as duplicate content? Both have different meta descriptions, title tags, etc., but essentially the MC (main content) is exactly the same. If they are, what would be the best course of action to remedy the problem? I'm skeptical about using canonical tags, as I generally use them for exact duplicate pages.
Technical SEO | | Sayers0 -
Duplicate content - working with CMS constraints
Hi, We use an industry-specific CMS and I'm struggling to figure out how we can fix our duplicate content issues. Thankfully, the vendor has agreed to work on 301 vs 302 redirects. However, they aren't currently able to give us the ability to add rel=canonical tags to page headers (we've put it in their "suggestion box," where requests tend to take a long time, if ever, to materialize). My understanding is that the tag will not be recognized if it's in the body code (i.e., the part of the page we can edit from the CMS), correct? Is there anything else I can do?
Technical SEO | | combska0 -
Duplicate Content Question
I have a client that operates a local service-based business. They are thinking of expanding that business to another geographic area (several hours' drive away, in an affluent summer vacation area). The name of the existing business contains the name of the city, so it would not be well suited to marketing the 'City X' business in 'City Y'. My initial thought was to (for the most part) 'duplicate' the existing site onto a new site (brand new root domain). Much of the content would be exactly the same. We could re-word some things so there aren't entire lengthy paragraphs of identical info, but it seems pointless to completely reinvent the wheel. We'll get as creative as possible, but certain things just wouldn't change. This seems like the most pragmatic thing to do given their goals, but I'm worried about duplicate content. It doesn't feel as though this is spammy, though, so I'm not sure if there's cause for concern.
Technical SEO | | stevefidelity0 -
Duplicate content in product listing
We have "duplicate content" warning in our moz report which mostly revolve around our product listing (eCommerce site) where various filters return 0 results (and hence show the same content on the page). Do you think those need to be addressed, and if so how would you prevent product listing filters that appearing as duplicate content pages? should we use rel=canonical or actually change the content on the page?
Technical SEO | | erangalp0 -
Is there ever legitimate near duplicate content?
Hey guys, I’ve been reading the blogs and really appreciate all the great feedback. It’s nice to see how supportive this community is to each other. I’ve got a question about near duplicate content. I’ve read a bunch of great posts regarding what duplicate content is and how to fix it. However, I’m looking at a scenario that is a little different from what I’ve read about, and I’m not sure whether we’d get penalized by Google. We are working with a group of small insurance agencies that have combined some of their back office work and work together to sell the same products, but for the most part act as what they are: independent agencies. So we now have 25 different little companies, in 25 different cities spread across the southeast, all selling the same thing. Each agency has its own URL, its own Google local places registration, its own backlinks to its local chamber, its own contact us and staff pages, etc. However, we have created landing pages for each product line with the hope of attracting local searches. While we vary each landing page a little per agency (the auto insurance page in CA talks about driving down the 101, while the auto insurance page in Georgia says welcome to the Peach State), probably 75% of the landing page content is the same from agency to agency. There is only so much you can say about specific lines of insurance. They have slightly different titles and slightly different headers, but the bulk of the page is the same. So here is the question: will Google hit us with a penalty for having similar content across the 25 sites? If so, how do you handle this? We are trying to write creative, unique content, but at the end of the day auto insurance in one city is pretty much the same as in another city. Thanks in advance for your help.
Technical SEO | | mavrick0 -
Large Scale Ecommerce. How To Deal With Duplicate Content
Hi, One of our clients has a store with over 30,000 indexed pages but fewer than 10,000 individual products and maybe a few hundred static pages. I've crawled the site in Xenu (it took 12 hours!) and found it to be a complex mess caused by years of hacky add-ons, which has resulted in duplicate pages and weird dynamic parameters being indexed. The inbound link structure is spread over duplicate pages, PDFs, and images, so I need to be careful to treat everything correctly. I can likely identify & segment blocks of 'thousands' of URLs and parameters which need to be blocked; I'm just not entirely sure of the best method. Dynamic parameters: I can see the option in GWT to block these - is it that simple? (Do I need to ensure they are deindexed and 301'd?) Duplicate pages: would the best approach be to mass-301 these pages, then apply a noindex tag and wait for them to be crawled? Thanks for your help.
Technical SEO | | LukeyJamo0 -
Crawl Errors and Duplicate Content
SEOmoz's crawl tool is telling me that I have duplicate content at "www.mydomain.com/pricing" and at "www.mydomain.com/pricing.aspx". Do you think this is just a glitch in the crawl tool (because obviously these two URLs are the same page rather than two separate ones), or do you think this is actually an error I need to worry about? If so, how do I fix it?
Technical SEO | | MyNet0 -
Duplicate content and tags
Hi, I have a blog on Posterous that I'm trying to rank. SEOmoz tells me that I have duplicate content pretty much everywhere (4 articles written, 6 errors at the last crawl). The problem is that I tag my posts, and apparently SEOmoz flags them as duplicate content only because I don't have many posts yet, so the pages end up being very similar. What can I do in this situation?
Technical SEO | | ngw0