Duplicate Content Issues with Forum
-
Hi Everyone,
I just signed up last night and received the crawl stats for my site (ShapeFit.com). Since April of 2011, my site has been severely impacted by Google's Panda and Penguin algorithm updates, and we have lost about 80% of our traffic during that time. I have been trying to follow the guidelines provided by Google to fix the issues and help recover, but nothing seems to be working. The majority of my time has been invested in trying to add content to "thin" pages on the site and filing DMCA notices for copyright infringement issues. Since this work has not produced any noticeable recovery, I decided to focus my attention on removing bad backlinks, and this is how I found SEOmoz.
My question is about duplicate content. The crawl diagnostics showed 6,000 errors for duplicate page content and the same number for duplicate page titles. After reviewing the details, it looks like almost every affected page is from the forum (shapefit.com/forum). What's the best way to resolve these issues? Should I completely block the "forum" folder from being indexed by Google, or is there something I can do within the forum software to fix this (I use phpBB)?
I really appreciate any feedback that would help fix these issues so the site can hopefully start recovering from Panda/Penguin.
Thank you,
Kris
-
Hi Alan,
Thanks for your feedback. In regard to thin content, do you know if there is a certain minimum amount of textual content that should be on a page? I have read a few articles that recommend having at least 300 words on each page. Do you know if there is any validity to this?
I appreciate your input about the forum issues. I will look into adding/updating the robots file to block everything but the main posts.
What are your thoughts on removing bad backlinks, and how critical is this for Panda/Penguin recovery? There are so many areas to work on but only so many hours in the day.
Thanks,
Kris
-
Kris,
The ideal scenario is to eliminate or consolidate content that exists on extremely thin pages on a site. When it comes to forums, consolidation is nearly impossible technically and is likely to cause usability problems. Given those challenges, the best course of action would be to either eliminate the content entirely or block the forums from indexation via the robots.txt file.
If any of the forum content has potential value from a search engine perspective, you would likely want to keep the core forum posts indexable but block the member pages.
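As a rough sketch only - the script names below are typical of a phpBB install, but treat the exact paths as assumptions to verify against your own setup - a robots.txt along those lines might look something like this:

User-agent: *
# Block member, search and posting pages, which add no search value
Disallow: /forum/memberlist.php
Disallow: /forum/ucp.php
Disallow: /forum/search.php
Disallow: /forum/posting.php
Disallow: /forum/viewonline.php
# To drop the forum entirely instead, a single rule would cover it:
# Disallow: /forum/

With a setup like that, the topic pages (viewtopic.php) remain crawlable, so any genuinely useful discussions can still be indexed.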
Unfortunately there's no one answer and no true way to gauge the impact of any action you choose without actually taking that action and then waiting to see what happens.
Related Questions
-
Duplicate content in Shopify reported by Moz
According to the Moz crawl report, there are hundreds of duplicate pages in our Shopify store ewatchsale.com. The main duplicate pages are:
https://ewatchsale.com/collections/seiko-watches?page=2
https://ewatchsale.com/collections/all/brand_seiko
(the canonical page should be https://ewatchsale.com/collections/seiko-watches)
https://ewatchsale.com/collections/seiko-watches/gender_mens
(the canonical page should be https://ewatchsale.com/collections/seiko-watches/mens-watches)
Also, I want to exclude indexing of page URLs with "filter parameters" like https://ewatchsale.com/collections/seiko-watches/color_black+mens-watches+price_us-100-200. Shopify advised that we can't access our robots.txt file. How can we exclude search engine crawling of the page URLs with filter names?
How can we access the robots.txt file?
How can we add canonical code to the preferred collection pages? Which templates and what code should we add? Thanks for your advice in advance!
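A hedged sketch of the canonical part, assuming a standard theme.liquid and Shopify's stock Liquid objects such as canonical_url and collection.url - theme setups vary, so verify the object names before relying on this:

{% comment %} Hypothetical sketch: canonicalize paginated and tag-filtered collection views to the base collection URL {% endcomment %}
{% if template contains 'collection' and collection.url %}
<link rel="canonical" href="{{ shop.url }}{{ collection.url }}" />
{% else %}
<link rel="canonical" href="{{ canonical_url }}" />
{% endif %}

Placed in the <head> of theme.liquid, this points paginated and tag-filtered collection views at the base collection URL and falls back to Shopify's own canonical everywhere else. Since robots.txt can't be edited on Shopify, a canonical of this kind is the usual workaround for the filter URLs.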
On-Page Optimization | ycnetpro101
-
Duplicate content with tagging and categories
Hello, Moz is showing that a site has duplicate content, which appears to be because of tags and categories. It is a relatively new site with only a few blog posts so far. This means that the same articles are displayed under a number of different tags and categories... Is this something I should worry about, or should I just wait until I have more content? The 'tag' and 'category' pages are not really pages I would expect or aim for anyone to find in Google results anyway. Would be glad to hear any advice / opinions on this. Thanks!
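If it does become a concern, one common approach is to mark the tag and category archives as noindex while leaving their links followable. The snippet below is an assumption-laden sketch against a standard WordPress theme's header.php; most SEO plugins, including Yoast, expose an equivalent noindex setting for taxonomy archives instead:

<?php // Hypothetical sketch: keep tag/category archive pages out of the index ?>
<?php if ( is_tag() || is_category() ) : ?>
<meta name="robots" content="noindex, follow" />
<?php endif; ?>

The articles themselves stay indexable; only the duplicate archive listings drop out.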
On-Page Optimization | wearehappymedia
-
Duplicate Content when Using "Visibility Classes" in Responsive Design Layouts - an SEO Problem?
I have text in the right column of my responsive layout which shows up below the principal content on small devices. To do this I use visibility classes for DIVs. So I have a DIV with unique text that is visible only on large screen sizes. I copied the same text into another DIV which shows up only on small devices, while the first DIV is hidden at that point. Technically I have the same text twice on my page. Might this be duplicate content detected as spam? I'm concerned because hidden text on a page (as with expandable/collapsible text blocks) will be read by bots, and in my case they will detect it twice. Does anybody have experience with this issue? Best, Holger
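A hedged alternative, sketched below with made-up class names, is to keep the text in the markup exactly once and let CSS reposition it, so there is nothing for a bot to read twice:

<div class="layout">
  <main class="content">Principal content ...</main>
  <aside class="side-text">The unique right-column text appears here once.</aside>
</div>
<style>
/* Stacked on small screens: the side text renders below the content */
.layout { display: flex; flex-direction: column; }
/* Side by side on larger screens */
@media (min-width: 768px) {
  .layout { flex-direction: row; }
  .side-text { width: 300px; }
}
</style>

The same element simply moves from the right column to below the content as the viewport shrinks, which sidesteps the duplicated-markup question entirely.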
On-Page Optimization | inlinear
-
Mass Duplicate Content
Hi guys, Now that the full crawl is complete I've found the following:
http://www.trespass.co.uk/mens-onslow-02022
http://www.trespass.co.uk/mens-moora-01816
http://www.trespass.co.uk/site/writeReview?ProductID=1816
http://www.trespass.co.uk/site/writeReview?ProductID=2022
The duplicate content on the first two is easily fixed by writing better product descriptions for each product (a lot of hours needed), but still an easy fix. The last two are review pages for each product which are all the same except for the main h1 text. My thinking is to add noindex and nofollow to all of these review pages? The site will be changing to Magento very soon and there's still a lot of work to do. If anyone has any other suggestions or can spot any other issues, it's appreciated. Kind regards, Robert
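A minimal sketch of that idea, assuming the /site/writeReview template's <head> can be edited: a robots meta tag on those pages, where "noindex, follow" is usually preferable to "noindex, nofollow" so the pages drop out of the index but the links on them still pass value:

<meta name="robots" content="noindex, follow" />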
On-Page Optimization | yournetbiz
-
Ecommerce website canonical and duplicate content issue
I have an ecommerce site, and I am just wondering if anyone could help me answer this: the same product page can be accessed at all of the URLs below. Will Google consider them duplicates, and if it does, how do I best use the canonical tag?
http://domain.com/product-page
http://domain.com/product-page/
http://domain.com/product-Page
http://domain.com/product-Page/
Also, in Zen Cart, linking products creates duplicate page content - how do I tackle that? Many thanks.
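URL paths are case-sensitive to search engines, so those variants do count as separate URLs serving the same content. A minimal sketch, reusing the preferred URL from the question: one canonical tag in the <head> of the product template so every variant points at the same version:

<!-- Hypothetical sketch: declare one preferred URL for all case/trailing-slash variants -->
<link rel="canonical" href="http://domain.com/product-page" />

301 redirects from the uppercase and trailing-slash variants to the preferred URL would reinforce the same signal, and the Zen Cart linked-product duplicates can usually be handled with the same canonical approach.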
On-Page Optimization | conversiontactics
-
Duplicate Page Content for Product Pages
Hello, We have one website, http://www.bannerbuzz.com, and we have many product pages that have duplicate page content issues in SEOmoz. They are listed below:
http://www.bannerbuzz.com/backlit-banners-1.html
http://www.bannerbuzz.com/backlit-banners-10.html
http://www.bannerbuzz.com/backlit-banners-11.html
http://www.bannerbuzz.com/backlit-banners-12.html
http://www.bannerbuzz.com/backlit-banners-13.html
We don't have any content on these pages yet, but we are still getting duplicate page content errors for all of them in SEOmoz. Please help me with how I can fix this issue. Thanks,
On-Page Optimization | CommercePundit
-
How Should I Fix Duplicate Content in WordPress Pages?
In GWMT I see Google found 41 duplicate content issues on my WordPress blog. I am using the Yoast SEO plugin to avoid those types of duplicates, but the problem persists. You can check the screenshot here: http://prntscr.com/dxfjq Please help.
On-Page Optimization | mamuti
-
I have one page on my site... but still get duplicate name and content errors.
I have only the index.html page. My domain has a permanent 301 to the root. Why am I getting duplicate problems? I only have one page, index.html.
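If the duplicate pair Moz is reporting is the root URL and /index.html, a hedged sketch (assuming an Apache server with mod_rewrite; adjust for your host) is a 301 in .htaccess that collapses the two into one:

RewriteEngine On
# Only match when the visitor explicitly requested /index.html,
# so the internal DirectoryIndex lookup for "/" doesn't loop
RewriteCond %{THE_REQUEST} \s/index\.html [NC]
RewriteRule ^index\.html$ / [R=301,L]

After that, both the home page title and content resolve at a single URL.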
On-Page Optimization | one4u2see