Duplicate content vs. less content
-
Hi, I run a site that is currently doing very well in Google for the terms we want. We are 1, 2, or 3 for our 4 targeted terms, but haven't been able to jump to number one in the two categories I would really like to.
In looking at our site, I didn't realize we have a TON of duplicate content as seen by SEOmoz and, I guess, Google. It appears to be coming from our forum; we use Drupal. Right now we have over 4,500 pages of duplicate content.
Here is my question: how much is this hurting us, given that we are ranking high? Is it better to kill the forum (which is more community service than business) and have a very tight site SEO-wise, or to leave the forum even with the duplicate content?
Thanks for your help. Erik
-
This is from the SEOmoz crawl report; almost all of it is forum threads and user profiles.

Hobie Singlefin 9'4 Takeda Fish 5'7 | Surfing Nosara
http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57

Hobie Singlefin 9'4 Takeda Fish 5'7 | Surfing Nosara
http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57-0

WA Jer | Surfing Nosara
http://www.surfingnosara.com/users/wa-jer

wavekitegirl | Surfing Nosara
http://www.surfingnosara.com/users/wavekitegirl

White Abbot | Surfing Nosara
http://www.surfingnosara.com/users/white-abbot
-
The -0 is Drupal's way of handling duplicate page titles, duplicate file names, etc. You may indeed have an issue where two "nodes" are being generated. If that is the case, you are basically creating a competitor for yourself.
Do you want to share the site and the two URLs that are duplicated?
-
Thank you for the responses. We have a popular ride board that is awesome, and some buy-and-sell; other than that, most of our forum activity has moved to our Facebook page. About 1/3 of the duplicate content has a -0 after the title. I am not sure how to block it in the robots.txt file.
I guess the heart of my question is this: I have always thought that all the content from the forum would help us in SEO. Is it possible that it is really hurting us? How does Google look at a site that has a ton of pages (we are an old site that keeps all of its content so folks can search it, old surf reports and such) but also a ton of errors and duplicate content? How much will fixing the errors and duplicate content help versus just working on more links? Where do I focus my energy?
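Since roughly a third of the flagged pages carry the -0 suffix, one quick way to size the problem is to run the exported crawl URLs through a short script that pairs each -N alias with its original. This is an illustrative sketch, not from the original thread; the `find_drupal_duplicates` name is mine, and the sample list is taken from the crawl excerpt earlier in the thread:

```python
import re

def find_drupal_duplicates(urls):
    """Pair each URL carrying Drupal's numeric de-dup suffix (-0, -1, ...)
    with the original URL it duplicates, if that original was also crawled."""
    url_set = set(urls)
    duplicates = {}
    for url in urls:
        # Drupal appends -0, -1, ... when a generated path alias already exists,
        # so strip a trailing -<digits> and see if the bare path was also crawled.
        base = re.sub(r"-\d+$", "", url)
        if base != url and base in url_set:
            duplicates.setdefault(base, []).append(url)
    return duplicates

crawled = [
    "http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57",
    "http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57-0",
    "http://www.surfingnosara.com/users/wa-jer",
]
print(find_drupal_duplicates(crawled))
```

Any group this prints is a candidate for a canonical tag or a robots exclusion, rather than a reason to delete the forum.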
-
Don't take down a forum if it has an active community; instead, focus on canonical control of the forum. Depending on your Drupal set-up, this could be tricky to implement. If that's the case, you could always block Googlebot's access to the duplicated pages and then remove the URLs from the index through GWT. The last option would be to review your Drupal template and insert a PHP conditional statement that issues a noindex,follow directive in the robots meta tag for certain pages. Look at the URLs and see if there's a pattern to which pages are the 'duplicates', and try to match that pattern. Hope something here helped.
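If the duplicates do follow a consistent -0 pattern, the Googlebot-blocking option might look like this in robots.txt. This is a sketch under that assumption: the `*` and `$` wildcards are extensions Googlebot honours but not every crawler does, so test the rules in GWT's robots.txt tester before deploying.

```
User-agent: Googlebot
# Block Drupal's numeric duplicate aliases in the forum
Disallow: /forum/*-0$
# Optionally keep thin user profile pages out of the crawl as well,
# if they add no search value
Disallow: /users/
```

Blocking only stops crawling; pair it with a URL removal request in GWT to get the already-indexed duplicates out.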
-
_I am assuming that you are referring to an internal duplicate content issue. In that case, I would suggest fixing the duplicates by adding canonical tags, adding a noindex robots meta tag, or specifying rules in the robots.txt file. There is no need to remove the forum if it is adding value to the user experience. However, if you feel that your forum is being overrun by spammers and trolls, you could take the whole thing down. I hope that helps your website in the long run._
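As an illustration of the canonical option, using the pair from the crawl excerpt earlier in the thread, the -0 duplicate would declare the original in its `<head>`. This is a sketch; in Drupal this is typically handled by a module such as Metatag rather than by hand-editing templates.

```html
<!-- In the <head> of the duplicate page .../hobie-singlefin-94-takeda-fish-57-0 -->
<link rel="canonical" href="http://www.surfingnosara.com/forum/hobie-singlefin-94-takeda-fish-57" />
```

With this in place, Google consolidates ranking signals onto the original URL instead of splitting them between the two copies.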
Related Questions
-
#1 rankings on both HTTP and HTTPS vs duplicate content
We're planning a full migration to HTTPS for our website, which is accessible today via **www.**website.com, **http://**www.website.com, as well as **https://**www.website.com. After the migration the website will only be accessible via HTTPS requests, and every other request (e.g. www or http) will be 301-redirected to the same page over HTTPS. We've taken a lot of precautions, like fixing all the internal links to point to HTTPS instead of HTTP, etc. My question is: what happened to your rankings for HTTP after making a full migration to HTTPS?
Technical SEO | | OliviaStokholm0 -
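For reference, on Apache the catch-all 301 described in the question is usually a few mod_rewrite lines. This is a sketch using the question's placeholder domain; nginx and IIS need their own equivalents, and the exact conditions depend on your hosting setup.

```
RewriteEngine On
# Send every non-HTTPS request to the same path on the HTTPS host
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]
```

The R=301 flag makes the redirect permanent, which is what tells Google to transfer the HTTP URLs' signals to their HTTPS counterparts.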
Duplicate content through product variants
Hi, Before you shout at me for not searching - I did, and there are indeed lots of threads and articles on this problem. I therefore realise that this problem is not exactly new or unique. The situation: I am dealing with a website that has 1 to N (N being between 1 and 6 so far) variants of a product. There are no dropdowns for variants; this is not technically possible short of a complete redesign, which is not on the table right now. The product variants are also not linked to each other, but share about 99% of their content (obvious problem here). In the "search all" they show up individually. Each product variant is a different page, unconnected in the backend as well as the frontend. The system is quite limited in what can be added and entered - I may have some opportunity to influence smaller things such as enabling canonicals. In my opinion, the optimal choice would be to retain one page for each product, the base variant, and then add dropdowns to select extras/other variants. As that is not possible, I feel the best solution is to canonicalise all versions to one version (either the base variant or the best-selling variant?) and to offer customers a list at each product giving them a direct path to the other variants. I'd be thankful for opinions, advice, or completely new approaches I have not even thought of! Kind Regards, Nico
Technical SEO | | netzkern_AG0 -
How to deal with duplicated content on product pages?
Hi, I have a webshop with products in different sizes and colours. Each item has a different URL with almost the same content (title tag, product descriptions, etc.). In order to prevent duplicated content, I'm wondering what the best way to solve this problem is, keeping in mind: it is impossible to create one page/URL for each product with filters on colour and size, and impossible to rewrite the product descriptions to make them unique. I'm considering canonicalizing the rest of the colour/size variations, but the disadvantage is that when a product is out of stock it disappears from the website. Looking forward to your opinions and solutions. Jeroen
Technical SEO | | Digital-DMG0 -
Tags, Categories, & Duplicate Content
Looking for some advice on a duplicate content issue that we're having that definitely isn't unique to us. We are allowing all our tag and category pages, as well as our blog pagination, to be indexed and followed, but Moz is detecting all of that as duplicate content, which is obvious since it is the same content that is on our blog posts. We've decided in the past to keep these pages the way they are, as it hasn't seemed to hurt us specifically and we hoped it would help our overall ranking. We haven't seen positive or negative signals either way, just the warnings from Moz. We are wondering if we should noindex these pages and whether that could cause a positive change, but we're worried it might cause a big negative change as well. Have you confronted this issue? What did you decide, and what were the results? Thanks in advance!
Technical SEO | | bradhodson0 -
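If you do try the noindex route, the usual pattern for archive pages is noindex,follow in the robots meta tag, which drops the page from the index while still letting its links pass equity. This is a sketch; where you emit it depends on your blog templates.

```html
<!-- Emitted only on tag, category, and paginated archive pages -->
<meta name="robots" content="noindex, follow">
```

The alternative, canonicalizing each archive page to the posts it lists, doesn't apply here since an archive aggregates many posts rather than duplicating one.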
Duplicate Content on Navigation Structures
Hello SEOMoz Team, My organization is making a push to have a seamless navigation across all of its domains. Each of the domains publishes distinctly different content about various subjects. We want each of the domains to have its own separate identity as viewed by Google. It has been suggested internally that we keep the exact same navigation structure (40-50 links in the header) across the header of each of our 15 domains to ensure "unity" among all of the sites. Will this create a problem with duplicate content in the form of the menu structure, and will this cause Google to not consider the domains as being separate from each other? Thanks, Richard Robbins
Technical SEO | | LDS-SEO0 -
Duplicate content error from url generated
We are getting a duplicate content error, with "online form/" being returned numerous times. Upon inspecting the code, we found we are calling an input form via jQuery, which is initially triggered by something like this: Opens Form Why would this be amending the URL and causing it to be crawled?
Technical SEO | | pauledwards0 -
Duplicate Content
Hello All, my first web crawl has come back with a duplicate content warning for www.simodal.com and www.simodal.com/index.htm. I'm slightly mystified! Thanks, Paul
Technical SEO | | simodal0 -
Duplicate content?
I have a question regarding a warning that I got on one of my websites: it says duplicate content. I'm using canonical URLs and am also blocking Google from the pages that you are warning me about. The pages are not indexed by Google, so why do I get the warnings? Thanks for the great SEO tools!
Technical SEO | | bnbjbbkb0