Forum software penalties
-
I'm hoping to solicit some feedback on what people feel would be SEO best practices for message board/forum software. Specifically, while healthy message boards can generate tons of unique content, they can also generate a fair share of thin content pages.
These pages include...
- Calendar pages, which can have a page for each day of each month for 10 years! (That's roughly 3,650 pages of nothing but links.)
- User profile pages, which, depending on your setup, can tend to be thin. The board I work with has 20k registered members, hence 20k user profile pages.
- User lists, which can run to several hundred pages.
I believe Google is pretty good at recognizing message board content, but there's still a good chance a site could be penalized for these harmless pages. Do people feel that the above pages should be noindexed?
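For what it's worth, a common way to keep such thin templates out of the index while still letting crawlers follow the links on them is a robots meta tag; a minimal sketch (which templates to apply it to would depend on your board software):

```html
<!-- Placed in the <head> of thin templates (calendar days, user profiles,
     member lists): keep the page out of the index, but still crawl its links. -->
<meta name="robots" content="noindex, follow">
```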
Another issue is that of unrelated content. Many forums have their off-topic areas (the Pub or Hangout or whatever). On our forum, up to 40% of the content is off-topic (by content I mean number of posts rather than raw word count).
What are the advantages and disadvantages of such content? On one hand, it expands the keywords you can rank for. On the other, it might generate Google organic traffic you might not want because of a high bounce rate.
Does too much indexable content that is unique dilute your good content?
-
If you have a bit of it on a well-established site with many highly trusted links, it may have a positive effect. If you do lots of it on a brand-new site, it may do more harm than good.
Related Questions
-
Do we have any risk or penalty for double canonicals?
Hi all, We have double canonicals: page A points to page B, which points to page C. Will this be okay for Google, or do we definitely need to make it A to C and B to C? Thanks
Algorithm Updates | vtmoz
-
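A chained canonical like this may still be followed, but flattening the chain is generally considered the safer setup; a hedged sketch, with example.com as a placeholder:

```html
<!-- On page A — and likewise on page B — point straight at the final URL
     instead of chaining through an intermediate page: -->
<link rel="canonical" href="https://example.com/page-c">
```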
Do we need to Disallow profiles from discussions or forums?
Hi, We have a forum where users create different threads like any other community, e.g. Moz. Thousands of pages are getting created. New threads and comments are okay, as they have relevant content. We are planning to "Disallow" all profile pages, as they do not help with content relevancy and may dilute the link juice across thousands of such profile pages. Is this the right way to proceed? Thanks
Algorithm Updates | vtmoz
-
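One caveat worth noting on the Disallow approach: robots.txt blocks crawling, not indexing, so blocked profile URLs can still appear in results if other pages link to them; a noindex meta tag on the profile template avoids that. A robots.txt sketch anyway, assuming a hypothetical /profile/ path prefix:

```text
# robots.txt sketch — /profile/ is an assumed prefix; adjust to the forum's URLs.
User-agent: *
Disallow: /profile/
```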
Puzzling Penalty Question - Need Expert Help
I'm turning to the Moz Community because we're completely stumped. I actually work at a digital agency, our specialism being SEO. We've dealt with Google penalties before and have always found it fairly easy to identify the source of the problem when someone comes to us with a sudden keyword/traffic drop. I'll briefly outline what we've experienced:

We took on a client looking for SEO a few months ago. They had an OK site, with a small but high-quality and natural link profile, but very little organic visibility. The client is an IT consultancy based in London, so there's a lot of competition for their keywords. All technical issues on the site were addressed, pages were carefully keyword-targeted (obviously not in a spammy way), and on-site content, such as services pages, which were quite thin, was enriched with more user-focused content. Interesting, shareable content was starting to be created and some basic outreach work had started.

Things were starting to pick up. The site started showing and growing for a good range of very relevant keywords in Google, at different levels depending on competition (mostly sitting around pages 3-4). Local keywords, particularly, were doing well, with a good number sitting on pages 1-2. The keywords were starting to deliver a gentle stream of relevant traffic, and user behaviour on-site looked good.

Then, as of 28th September 2015, it all went wrong. Our client's site virtually dropped from existence as far as Google was concerned. They literally lost all of their keywords. Our client even dropped hundreds of places for their own brand name. They also lost all rankings for super-low-competition, non-business terms they had been ranking for.

So, there's the problem. The keywords have not shown any sign of recovery at all yet and we're, understandably, panicking. The worst thing is that we can't identify what has caused this catastrophic drop. It looks like a Google penalty, but there's nothing we can find that would cause one.
Here's what we've checked:
- There are no messages or warnings in GWT.
- The link profile is small but high quality.
- When we started, the content was a bit on the thin side, but this doesn't really look like a Panda penalty, and the drop seems far too severe for that.
- The site is technically sound.
- There are no duplicate content issues or plagiarised content.
- The site is being indexed fine.
- Moz gives the site a spam score of 1 (out of 11, I think that's right).
- The site is on an OK server, which hasn't been blacklisted or anything.

We've tried everything we can to identify a problem. And that's where you guys come in. Any ideas? Anyone seen anything similar around the same time? Unfortunately, we can't share our client's site name/URL, but feel free to ask any questions you want and we'll do our best to provide info.
Algorithm Updates | MRSWebSolutions
-
Can I only submit a reconsideration request if I have a penalty?
Hey guys, One of the sites I'm looking after took a hit with its rankings (particularly for one keyword that went from 6/7 to 50+) post-Penguin in May. After cleaning up the link profile somewhat, we started to see some slow and steady progression in positions, and the keyword that dropped to 50+ was moving back up towards 20. However, a couple of weeks back, the keyword in question took another slide, to around 35-40. I therefore wondered whether it would be best to submit a reconsideration request, even though the site did not receive a manual penalty. The website has a DA of 40, which more than matches a lot of the competitor websites ranking on the first page for the aforementioned keyword. At this stage, four and a half months after Penguin, I would have expected the site to have returned to its original ranking, but it hasn't. So a reconsideration request seemed logical. That said, when I came to go through the process in Webmaster Tools, I was unable to find the option! Has it now been removed for sites that don't receive manual penalties?
Algorithm Updates | Webrevolve
-
Could we run into issues with duplicate content penalties if we were to borrow product descriptions?
Hello, I work for an online retailer that has the opportunity to add a lot of SKUs to our site in a relatively short amount of time by borrowing content from another site (with their permission). There are a lot of positives for us to do this, but one big question we have is what the borrowed content will do to our search rankings (we normally write our own original content in house for a couple thousand SKUs). Organic search traffic brings in a significant chunk of our business and we definitely don't want to do something that would jeopardize our rankings. Could we run into issues with duplicate content penalties if we were to use the borrowed product descriptions? Is there a rule of thumb for what proportion of the site should be original content vs. duplicate content without running into issues with our search rankings? Thank you for your help!
Algorithm Updates | airnwater
-
Specific Page Penalty?
Having trouble figuring out why one of our pages is not ranking in SERPs; on-page optimisation looks decent to me. I checked by using the gInfinity extension and searching for the page URL. Can one page be penalised by Google (.ie / .com) while the rest of the website is not? The (possibly) penalised page is showing in Google Places in SERPs; I assume it would not show there if it were penalised. Would appreciate any advice. Thanks
Algorithm Updates | notnem
-
Penalty for Mixing Microdata with Metadata
The folks that built our website have insisted on including both microdata and metadata on our pages. What we end up with is something that looks like this in the header: <meta itemprop="description" content="Come buy your shoes from us, we've got great shoes."> Seems to me that this would be a bad thing, but I can't find any info leaning one way or the other. Can anyone provide insight on this?
Algorithm Updates | markcely
-
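For context, itemprop attributes aren't inherently a problem alongside standard metadata; under the HTML microdata rules they just need to sit inside an itemscope so the property belongs to an item. A minimal sketch (the schema.org type and names are illustrative):

```html
<!-- A meta element carrying itemprop is valid when scoped to an item,
     even in the body; here the description belongs to a Store item. -->
<div itemscope itemtype="https://schema.org/Store">
  <span itemprop="name">Example Shoe Store</span>
  <meta itemprop="description"
        content="Come buy your shoes from us, we've got great shoes.">
</div>
```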
Penalty or Algorithm hit?
After the Google algorithm was updated, my site's traffic took a hit for a week. The traffic came back and was doing well a week AFTER the algorithm change, and I decided I should set up a 301 redirect to make sure I didn't have duplicate content (www vs. non-www). I called my hosting company (I won't name names, but it rhymes w/ Low Fatty) and they guided me through the supposedly simple process. Well, they had me create a new (different) IP address and do a domain forward (sorry about the bad terminology) to the www version. This was in effect for approximately two weeks before I discovered it, and it came along with a subsequent massive hit in traffic. I then corrected the problem (I hope) by restoring the old IP address and setting up the .htaccess file to redirect everything to www. A couple of weeks later, my traffic is still in the dumps: in WMT, instead of getting traffic from 10,000 keywords, I'm getting it from only 2k. Is my site the victim of some penalty (I have heard of the sandbox), or is it simply lower in traffic due to the new algorithm? (I checked analytics data and found that only US traffic is cut by 50%; it is unchanged outside the US.) Could someone please tell me what is going on?
Algorithm Updates | askthetrainer
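For reference, the usual way to collapse the non-www host onto www on an Apache host is a single mod_rewrite rule in .htaccess, rather than a hosting-panel "domain forward"; a sketch, assuming mod_rewrite is enabled and with example.com as a placeholder:

```apache
# .htaccess sketch: 301-redirect the bare domain to www in one hop.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```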