Duplicate content vs. less content
-
Hi, I run a site that is currently doing very well in Google for the terms we want. We rank 1, 2, or 3 for our four targeted terms, but we haven't been able to reach number one in the two categories I care about most.
In looking at our site, I didn't realize we have a TON of duplicate content as seen by SEOmoz and, I assume, Google. It appears to be coming from our forum; we use Drupal. Right now we have over 4,500 pages of duplicate content.
Here is my question: how much is this hurting us, given that we are already ranking high? Is it better to kill the forum (which is more community service than business) and have a very tight site SEO-wise, or to leave the forum even with the duplicate content?
Thanks for your help. Erik
-
This is from the SEOmoz crawl report; almost all of it is forum posts and user profiles.

Hobie Singlefin 9'4 Takeda Fish 5'7 | Surfing Nosara
http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57

Hobie Singlefin 9'4 Takeda Fish 5'7 | Surfing Nosara
http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57-0

WA Jer | Surfing Nosara
http://www.surfingnosara.com/users/wa-jer

wavekitegirl | Surfing Nosara
http://www.surfingnosara.com/users/wavekitegirl

White Abbot | Surfing Nosara
http://www.surfingnosara.com/users/white-abbot
-
The -0 is Drupal's way of handling duplicate page titles, duplicate file names, etc. You may indeed have an issue where two "nodes" are being generated for the same content. If that is the case, you are basically creating a competitor for yourself.
Do you want to share the site and the two URLs that are duplicated?
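For illustration, using the pair of forum URLs from the crawl report above: a rel="canonical" link in the head of the -0 page would tell search engines to credit the original node instead. This is a minimal sketch, not a complete Drupal recipe; where the tag gets added depends on your theme or SEO module.

```html
<!-- Sketch only: placed in the <head> of the duplicate page
     (/forum/hobie-singlefin-94-takeda-fish-57-0), this points
     search engines back at the original forum node. -->
<link rel="canonical" href="http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57" />
```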
-
Thank you for the responses. We have a popular ride board that is awesome, and some buy-and-sell threads; other than that, most of our forum activity has moved to our Facebook page. About 1/3 of the duplicate content has a -0 after the title. I am not sure how to exclude those pages via the robots.txt file.
I guess the heart of my question is this: I have always thought that all the content from the forum would help us in SEO. Is it possible that it is actually hurting us? How does Google look at a site that has a ton of pages (we are an old site that keeps all of our content, like old surf reports, so folks can search it) but also a ton of errors and duplicate content? How much will fixing the errors and duplicate content help compared with just working on more links? Where do I focus my energy?
-
Don't take down a forum that has an active community; instead, focus on canonical control of the forum. Depending on your Drupal set-up this could be tricky to implement. If that's the case, you could block Googlebot's access to the duplicated pages and then remove the URLs from the index through Google Webmaster Tools. The last option would be to review your Drupal template and insert a PHP conditional statement that issues a noindex,follow directive in the robots meta tag for certain pages. Look at the URLs and see if there's a pattern to which pages are the 'duplicates' and try to match that pattern. Hope something here helped.
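As a rough sketch of that last option (an illustration under assumptions, not a drop-in fix: the template file, e.g. page.tpl.php or html.tpl.php, and the best way to read the current path depend on your Drupal version and theme), the conditional could key off the -0 suffix and the /users/ profile paths that show up in the crawl report:

```php
<?php
// Sketch of a conditional robots meta tag for a Drupal page template.
// Assumption: $_SERVER['REQUEST_URI'] reflects the aliased path being served.
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

// Match the duplicate patterns from the crawl report: forum nodes that Drupal
// suffixed with "-0" and the user profile pages. Extend the pattern if -1, -2,
// ... variants also exist on the site.
$is_duplicate = preg_match('#^/forum/.+-0$#', $path) || strpos($path, '/users/') === 0;

if ($is_duplicate) {
  // Keep the page out of the index while still letting its links pass equity.
  print '<meta name="robots" content="noindex,follow" />';
}
?>
```

For the tag to count, this would need to be printed inside the head section of the rendered page.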
-
I am assuming that you are referring to an internal duplicate content issue. In that case, I would suggest fixing the duplicates by adding canonical tags, adding a noindex robots meta tag, or specifying some rules in the robots.txt file. There is no need to remove the forum if it is adding value to the user experience. However, if you feel that your forum is being overrun by spammers and trolls, you could take the whole thing down. I hope this does your website good in the long run.
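If you go the robots.txt route, Googlebot honours * and $ wildcards, so rules along these lines could cover the -0 forum duplicates and the user profiles seen in the crawl report. This is only a sketch under the assumption that the -0 suffix reliably marks the duplicate nodes; test the patterns in Google Webmaster Tools before relying on them, and remember that robots.txt blocks crawling rather than removing pages that are already indexed.

```
# robots.txt sketch -- assumes the -0 suffix only appears on duplicate forum nodes
User-agent: Googlebot
Disallow: /forum/*-0$
Disallow: /users/
```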
Related Questions
-
Duplicate content
I have one client with two domains, and identical products appear on both domains. How should I handle this?
Technical SEO | Hazel_Key
-
Duplicate Content
We have a ton of duplicate content/title errors on our reports, many of them showing errors of: http://www.mysite.com/(page title) and http://mysite.com/(page title) Our site has been set up so that mysite.com 301 redirects to www.mysite.com (we did this a couple years ago). Is it possible that I set up my campaign the wrong way in SEOMoz? I'm thinking it must be a user error when I set up the campaign since we already have the 301 Redirect. Any advice is appreciated!
Technical SEO | Ditigal_Taylor
-
Is anyone using Canonicalization for duplicate content
Hi, I am trying to find out if anyone is using canonicalization for duplicate content on a Joomla site. I am using Joomla 1.5 and trying to find either a module, or a manual method, to sort this out, as I have over 300 pages of duplicate content because I am not using this technique. Any help and advice would be great.
Technical SEO | ClaireH-184886
-
Affiliate urls and duplicate content
Hi, what is the best way to run an affiliate program without the affiliate URLs on your site showing up as duplicate content?
Technical SEO | Memoz
-
Duplicate Content based on www.www
In trying to knock down the most common errors on our site, we've noticed we do have an issue with duplicate content; however, most of the duplicate content errors are due to our site being indexed with www.www and not just www. I am perplexed as to how this is happening. Searching through IIS, I see nothing that would be causing this, and we have no hostname records set up as www.www. Does anyone know of any other things that may cause this and how we can go about remedying it?
Technical SEO | CredA
-
What's with the backslash in the URL being flagged as duplicate content?
Is this a bug or something that needs to be addressed? If so, just use a redirect?
Technical SEO | Boogily
-
Duplicate content?
I have a question regarding a warning that I got on one of my websites: it says duplicate content. I'm using canonical URLs and am also blocking Google from the pages that you are warning me about. The pages are not indexed by Google, so why do I get the warnings? Thanks for the great SEO tools! 3M5AY.png
Technical SEO | bnbjbbkb
-
Duplicate content
This is just a quickie: on one of my campaigns in SEOmoz I have 151 duplicate page content issues! Ouch! On analysis, the site in question has duplicated every URL with "en", e.g. http://www.domainname.com/en/Fashion/Mulberry/SpringSummer-2010/ and http://www.domainname.com/Fashion/Mulberry/SpringSummer-2010/ Personally, my thought is that rel=canonical will sort this issue, but before I ask our dev team to add it (and get various excuses why they can't), I wanted to double-check that I am correct in my thinking. Thanks in advance for your time.
Technical SEO | Yozzer