Is Duplicate Content Only an Issue on a Huge Scale?
-
To what extent is duplicate content an issue? We have a support forum with some duplicate content because users ask the same questions. The Moz reports we receive highlight the duplicate content and duplicate page titles in our support forum as a "big" issue.
I'm unsure to what extent this harms our SEO, and making the support section non-crawlable would impair our level of support. It would be nice to know for sure whether we should be concerned about this, and if so, how we could do things differently.
Thanks, I appreciate your help.
-Allan
-
Thanks a million guys, very helpful information.
All the best,
Allan
-
Hi Allan,
This is a typical issue with most forums out there, and it's nothing to be worried about; you can leave it to the search engines to handle. Duplicate content by itself will not harm you or fetch a penalty from a search engine like Google. It's the intent behind the duplication that causes the actual harm and is potentially dangerous.
That said, this does not mean you should engage in content duplication and assume you are safe. Stay away from any deliberate or intentional content duplication with an intent to manipulate rankings. Duplicate content is a very common thing on forums, and search engines are sensible enough in this regard.
Here is what Google has to say in this regard:
"Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results."
For more: https://support.google.com/webmasters/answer/66359?hl=en
Hope that helps.
Best regards,
Devanur Rafi.
-
I'm not certain that this kind of duplicate content harms your SEO, but I can tell you what my company did to mitigate these duplicate questions.
We merged threads. When questions are exactly the same, we just merge them into one thread. There's no point in having multiple threads when the answer is going to be the same, and having multiple answers in one thread is much more helpful to whoever needs this information in the future.
We pre-empted the question with a support or FAQ page. This ended up being a huge part of our help page strategy. When we noticed that people were asking the same question several times, we would create a new support page with an in-depth answer to that problem, and suggest that users read this related page before posting their question.
We got rid of the forum. Right now, we don't have a public help forum (though this is something we'll likely open up again in the future, when we have more eyes to dedicate to monitoring it!), so what we ended up with is a bunch of in-depth help articles, as well as close monitoring of which popular search strings are not turning up good results, and which questions pop up very frequently in our support queue.
I hope someone else can come along and answer the duplicate content part of your question, because I would love to know as well!
Related Questions
-
CTA first, content next, or content first, CTA next?
We are a casino affiliate company, and our website has a lot of the same casino offers. Is it beneficial to put the content above the casino offers, then use a CSS flex reverse wrap, so the HTML has the page content first but the page visually displays the casinos first and the content after? Or just do the usual, i.e. the HTML mirrors the visual order, with the offers first and the content after?
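The CSS technique being asked about is roughly the following (class names here are hypothetical, not from the site in question): the content comes first in the source order, and flexbox reverses the painted order so visitors see the offers first.

```html
<div class="landing">
  <section class="content">Page copy about the casino offers…</section>
  <section class="offers">Casino offer cards…</section>
</div>
<style>
  /* column-reverse paints .offers above .content
     without changing the HTML source order */
  .landing {
    display: flex;
    flex-direction: column-reverse;
  }
</style>
```

Crawlers still see the content-first source order, while visitors see the offers first; whether that reordering actually helps rankings is exactly what the question is asking.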
On-Page Optimization | JoelssonMedia
-
Does using Yoast variables for meta content overwrite any pages that already have custom meta content?
The question is about the Yoast plugin for WP sites. Let's say I have a site with 200 pages and custom meta descriptions / title tags already in place for the top 30 pages. If I use the Yoast variable tool to complete meta content for the remaining pages (and make my Moz issue tracker look happier), will that only affect the pages without custom meta descriptions or will it overwrite even the pages with the custom meta content that I want? In this situation, I do want to keep the meta content that is already in place on select pages. Thanks! Zack
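For context, Yoast meta templates are built from snippet variables along these lines (check the plugin's own variable list for the current set; this is just the common title-template pattern):

```
%%title%% %%sep%% %%sitename%%
```

Whether applying such a template overwrites existing per-page custom values is the crux of the question, so it's worth testing on a staging copy before running it across all 200 pages.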
On-Page Optimization | rootandbranch
-
Duplicate Page Titles and Duplicate Content
I've been a Pro Member for nearly a year and I am bound and determined to finally clean up all the crawl errors on our site PracticeRange.com. We have 180 errors for Duplicate Page Titles and Duplicate Content. I fixed many of the pages that were product pages with duplicate content. Those product descriptions were edited and now have unique content. However, there remain plenty of the errors that are puzzling. Many of the errors reference the same pages, for example, the Home Page, Login Page and the Search page (our catalog pages).
On-Page Optimization | AlanWills
In the case of the Catalog Page errors, these pages all have the same title, "Search", and the results differ according to category:
http://www.practicerange.com/Search.aspx?m=6
http://www.practicerange.com/Search.aspx?m=15
If this is a rel=canonical issue, how do I fix it on a search result page? I want each of the different category pages to be indexed; no one of them is more important than the others. So how would I incorporate rel=canonical?
In the case of the Home Page errors, I'm really confused and don't know where to start. They are the result of a 404 error that leads to the home page. Is the content of the 404 page the culprit, since it contains a link to the home page? Here are examples of the Home Page type of crawl error:
http://www.practicerange.com/404.aspx?aspxerrorpath=/Golf-Training-Aids/Golf-Nets/~/Assets/ProductImages/products/Golf-Training-Aids/Rubber-Wooden-Tee-Holder.aspx
http://www.practicerange.com/404.aspx?aspxerrorpath=/Golf-Training-Aids/Golf-Nets/~/Assets/ProductImages/products/Golf-Training-Aid/Impact-Bag.aspx
Thanks, Alan Wills, PracticeRange.com
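One common approach when every category page should stay indexed is a self-referencing canonical, where each URL declares itself as the preferred version rather than pointing at a single "master" page. A minimal sketch using the URLs from the question:

```html
<!-- In the <head> of http://www.practicerange.com/Search.aspx?m=6 -->
<link rel="canonical" href="http://www.practicerange.com/Search.aspx?m=6" />

<!-- In the <head> of http://www.practicerange.com/Search.aspx?m=15 -->
<link rel="canonical" href="http://www.practicerange.com/Search.aspx?m=15" />
```

Note this addresses duplicate-URL consolidation, not the duplicate-title warning itself; the title warning usually goes away only when the category name is written into each page's `<title>` server-side.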
Locating Duplicate Pages
Hi, Our website consists of approximately 15,000 pages, yet according to our Google Webmaster Tools account Google has around 26,000 pages for us in its index. I have run the site through half a dozen sitemap generators and they all discover only the 15,000 pages that we know about. I have also thoroughly gone through the site to find any sections where we might be inadvertently generating duplicate pages, without success.
It has been over six months since we made any structural changes (at which point we set up 301s to the new locations), so I'd like to think that the majority of these old pages have been removed from the Google index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week.
I'm fairly certain it's nothing to worry about; however, for my own peace of mind I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index and that we're not generating any duplicate content. Unfortunately there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, but this only returns the first 1,000 results, which all check out fine. Does anybody know of any methods or tools we could use to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages which haven't fallen out of the index yet and that they're not going to cause us a problem? Thanks guys!
On-Page Optimization | ChrisHolgate
-
Can internal duplicate content cause issues?
Hello all mozzers! Has anyone used Nitrosell? We use them only because their inventory connects to our EPOS system, but because they have thousands of 301s on our domain we are getting duplicate content: different sizes of products (we sell womenswear) are creating separate URLs, so we are duplicating both content and URLs. I'm curious whether anyone has experienced similar problems that have affected their SERPs? Best wishes, Chris
On-Page Optimization | DaWillow
-
Mozbar # issue
Hi, When viewed through the Mozbar page analysis tool, my Joomla site always displays a bullet point and a space character before the site title. Is this likely to affect how search engines read it? www.clearvisas.com is the site. Please have a look and let me know what you think. If you are a Joomla expert, your feedback on my website and this issue would be really welcome. Thanks, Fuad
On-Page Optimization | Fuad_YK
-
Can duplicate content issues be solved with a noindex robot metatag?
Hi all I have a number of duplicate content issues arising from a recent crawl diagnostics report. Would using a robots meta tag (like below) on the pages I don't necessarily mind not being indexed be an effective way to solve the problem? Thanks for any / all replies
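The tag the question alludes to ("like below" seems to have been lost from the post) generally takes this form; `noindex, follow` keeps the page out of the index while still letting crawlers follow its links:

```html
<!-- In the <head> of each page to exclude from the index -->
<meta name="robots" content="noindex, follow" />
```

One caveat: the page must remain crawlable (i.e. not blocked in robots.txt) for the crawler to ever see the noindex directive.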
On-Page Optimization | joeprice
-
Duplicate content issues with product pages 1, 2, 3 and so on
Hi, we have this products page, for example of a landing page:
On-Page Optimization | Essentia
http://www.redwrappings.com.au/australian-made/gift-ideas and then we have the links to pages 2, 3, 4 and so on:
http://www.redwrappings.com.au/products.php?c=australian-made&p=2
http://www.redwrappings.com.au/products.php?c=australian-made&p=3
In SEOmoz, these are recognized as duplicate page content. What would be the best way to solve this problem? One easy way I can think of is to nominate the first landing page as the 'master' page (http://www.redwrappings.com.au/australian-made/gift-ideas) and add canonical meta links on pages 2, 3 and so on. Any other suggestions? Thanks 🙂
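The canonical approach the question describes would look something like this on the paginated URLs (using the URLs from the post):

```html
<!-- In the <head> of products.php?c=australian-made&p=2 (and p=3, etc.) -->
<link rel="canonical" href="http://www.redwrappings.com.au/australian-made/gift-ideas" />
```

The trade-off to weigh: canonicalizing pages 2+ to the landing page consolidates the series into one URL, which clears the duplicate warnings but may mean products that only appear on the deeper pages get less index visibility in their own right.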