Duplicate content vs. less content
-
Hi, I run a site that is currently doing very well in Google for the terms we want. We are 1, 2, or 3 for our four targeted terms, but haven't been able to jump to number one in the two categories that I would really like to win.
In looking at our site, I hadn't realized we have a TON of duplicate content as seen by SEOmoz and, I assume, Google. It appears to be coming from our forum; we use Drupal. Right now we have over 4,500 pages of duplicate content.
Here is my question: how much is this hurting us, given that we are already ranking high? Is it better to kill the forum (which is more community service than business) and have a very tight site SEO-wise, or to leave the forum up even with the duplicate content?
Thanks for your help. Erik
-
|
This is from the SEOmoz crawl report; almost all of it is forum pages and user profiles.

Hobie Singlefin 9'4 Takeda Fish 5'7 | Surfing Nosara
http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57

Hobie Singlefin 9'4 Takeda Fish 5'7 | Surfing Nosara
http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57-0

WA Jer | Surfing Nosara
http://www.surfingnosara.com/users/wa-jer

wavekitegirl | Surfing Nosara
http://www.surfingnosara.com/users/wavekitegirl

White Abbot | Surfing Nosara
http://www.surfingnosara.com/users/white-abbot
-
The -0 suffix is Drupal's way of handling duplicate page titles, duplicate file names, etc. You may indeed have an issue where two "nodes" are being generated for the same content. If that is the case, you are basically creating a competitor for yourself.
Do you want to share the site and two of the URLs that are duplicated?
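If the duplicate nodes can't easily be merged, one common fix is a rel=canonical tag on the -0 copy pointing at the original, so Google consolidates ranking signals onto one URL. A minimal sketch, using the example URLs from the crawl report above:

```html
<!-- Placed in the <head> of the duplicate "-0" page. -->
<!-- Tells Google which URL is the preferred version. -->
<link rel="canonical" href="http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57" />
```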
-
Thank you for the responses. We have a popular ride board that is awesome, and some buy-and-sell listings; other than that, most of our forum activity has moved to our Facebook page. About a third of the duplicate content has a -0 after the title. I am not sure how to exclude those pages using the robots.txt file.
I guess the heart of my question is this: I have always thought that all the content from the forum would help our SEO. Is it possible that it is really hurting us? How does Google look at a site that has a ton of pages (we are an old site that keeps all of our content, like old surf reports, so folks can search it) but also a ton of errors and duplicate content? How much will fixing the errors and duplicate content help versus just working on more links? Where do I focus my energy?
-
Don't take down a forum that has an active community; instead, focus on canonical control of the forum. Depending on your Drupal setup, this could be tricky to implement. If that's the case, you could block Googlebot's access to the duplicated pages and then remove those URLs from the index through Google Webmaster Tools (GWT). A last option would be to review your Drupal template and insert a PHP conditional statement that issues a noindex,follow directive in the robots meta tag for certain pages. Look at the URLs, see if there's a pattern to which pages are the 'duplicates', and try to match that pattern. Hope something here helps.
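For that last option, a conditional like the following could live in your theme's page template. This is only a sketch, not a drop-in solution: the trailing "-0" pattern is an assumption based on the crawl report above, and you would adapt both the check and its placement to your own theme (the `request_uri()` helper is the Drupal 7 function for the current request path).

```php
<?php
// Hypothetical Drupal template snippet: emit noindex,follow only
// when the current path matches the duplicate pattern (trailing "-0").
$path = request_uri(); // e.g. /forum/hobie-singlefin-94-takeda-fish-57-0
if (preg_match('#-0$#', $path)) {
  // Keep the page out of the index but let link equity flow through.
  print '<meta name="robots" content="noindex,follow" />';
}
?>
```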
-
I am assuming you are referring to an internal duplicate content issue. In that case, I would suggest fixing the duplicates by adding a canonical tag, adding a noindex robots meta tag, or specifying rules in your robots.txt file. There is no need to remove the forum if it is adding value to the user experience. However, if you feel that your forum is being overrun by spammers and trolls, you could take the whole thing down. I hope that does your website good in the long run.
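If you go the robots.txt route, note that Googlebot understands the `*` and `$` wildcards, so the -0 pattern from the crawl report could be blocked with rules like the ones below. Treat the path patterns as assumptions to verify against your actual duplicate URLs before deploying, and note that the `/users/` rule assumes you don't want profile pages indexed at all:

```text
# Block Googlebot from Drupal's "-0" duplicate nodes and from user profiles
User-agent: Googlebot
Disallow: /forum/*-0$
Disallow: /users/
```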