Forum software penalties
-
I'm hoping to solicit some feedback on what people feel would be SEO best practices for message board/forum software. Specifically, while healthy message boards can generate tons of unique content, they can also generate a fair share of thin content pages.
These pages include...
- Calendar pages, which can have a page for each day of each month for 10 years (that's roughly 3,650 pages of nothing but links).
- User profile pages, which depending on your setup can tend to be thin. The board I work with has 20k registered members, hence 20k user profile pages.
- User lists, which can run to several hundred pages.
I believe Google is pretty good at recognizing message board content, but there is still a good chance that one could be penalized for these harmless pages. Do people feel that the above pages should be noindexed?
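For reference, the usual mechanism for keeping thin pages like these out of the index, while still letting crawlers follow the links on them, is a robots meta tag in each template; a minimal sketch (which templates get it is of course your call):

```html
<!-- In the <head> of calendar, user profile, and user list templates:
     "noindex" keeps the page out of search results, "follow" still
     lets link equity flow through to the threads it links to. -->
<meta name="robots" content="noindex, follow">
```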
Another issue is that of unrelated content. Many forums have their off-topic areas (the Pub or Hangout or whatever). On our forum, up to 40% of the content is off-topic (when I say content, I mean number of posts, not raw word count).
What are the advantages and disadvantages of such content? On one hand, it expands the keywords you can rank for. On the other hand, it might generate Google organic traffic you might not want because of a high bounce rate.
Does too much indexable content that is unique dilute your good content?
-
If you do a bit of it on a well-established site with many highly trusted links, it may have a positive effect. If you did lots of it on a brand-new site, it may do more harm than good.
Related Questions
-
What happens when we delete all the outgoing links from a forum at once?
Hi all, our forum is filled with spammy content and external nofollow links. I am wondering if we can convert all these hyperlinks to plain text, which would effectively delete all the external links. Will this have any negative or positive impact as far as Google is concerned? Please share your ideas. Thanks
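For what it's worth, a minimal sketch of the "hyperlinks to plain text" step being asked about, assuming posts are stored as HTML and using `example-forum.com` as a stand-in for the forum's own domain (both are assumptions, not details from the question):

```python
import re

# Placeholder for your own forum's domain; internal links are left intact.
INTERNAL_DOMAIN = "example-forum.com"

# Matches an <a ... href="..."> ... </a> pair and captures the URL and anchor text.
ANCHOR_RE = re.compile(
    r'<a\s[^>]*href="(?P<href>[^"]+)"[^>]*>(?P<text>.*?)</a>',
    re.IGNORECASE | re.DOTALL,
)

def unlink_external(html: str) -> str:
    """Replace external hyperlinks in post HTML with their plain anchor text."""
    def repl(m: re.Match) -> str:
        href = m.group("href")
        # Keep links that point at our own forum; flatten everything else.
        if INTERNAL_DOMAIN in href or href.startswith("/"):
            return m.group(0)
        return m.group("text")
    return ANCHOR_RE.sub(repl, html)
```

A real forum database would more likely use a proper HTML parser and run this as a one-off migration script, but the logic is the same: external anchors become bare text, internal ones survive.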
Algorithm Updates | vtmoz
Should I move my forum to a subdomain?
My forum causes a lot of 403, 404, soft 404, and 522 errors. I worry about this dragging down the value of my domain and wonder if I should move it to a subdomain, forum.domain.com. I was forced to do this with a very similar site, and it seems not to have suffered any Google penalty (I implemented a 301 redirect from each page to its corresponding page on the subdomain).
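A sketch of the per-page 301s described above, assuming Apache with mod_rewrite and a subdomain that mirrors the forum's URL paths (the `/forum/` prefix is a hypothetical example):

```apache
# In the main domain's .htaccess: send every /forum/ URL to the
# corresponding path on the subdomain with a permanent (301) redirect.
RewriteEngine On
RewriteRule ^forum/(.*)$ https://forum.domain.com/$1 [R=301,L]
```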
Algorithm Updates | 321Chat
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat, and the geek, when you truly need them? So SEL (SearchEngineLand) said recently that there's no such thing as "duplicate content" penalties: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content"-type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand-coded. Things do very well in SEO because we designed it to do so. The site is about 6 years in its current incarnation, with a very simple e-commerce cart (again, basically hand-coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to keep the site to under 100 new "pages" using a combination of rel=canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assume we have 500 products and 100 categories; that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many.
4 - In ScreamingFrog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. ScreamingFrog has also been known to crash while spidering, and we've discovered thousands of URLs from live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation; we can see that both on our DEV site and out in the wild (in Google's Supplemental Index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might carry a link-juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal-page link juice, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
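As a point of reference, the rel=canonical plus noindex combination mentioned in point 3 is usually a per-template decision; a minimal sketch for a filtered category URL (the URLs and category names here are hypothetical):

```html
<!-- On a faceted URL like /category/widgets?color=blue&sort=price:
     point engines at the clean category page... -->
<link rel="canonical" href="https://www.example.com/category/widgets">
<!-- ...or keep the filtered variant out of the index entirely. -->
<meta name="robots" content="noindex, follow">
```

One caveat worth noting: Google has advised against combining noindex and rel=canonical on the same page, since the noindex can interfere with signal consolidation, so in practice each faceted template typically gets one directive or the other, not both.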
Algorithm Updates | seo_plus
Software to indicate search volumes/trends
Is there any software that can help provide an indication of the search volumes or trends behind keywords? Currently we use Google Trends for a YoY and WoW view of some select keywords, but the reality is that our search volumes are affected by thousands of keywords, and trying to get an accurate view of overall trends and themes within our product sector is difficult. I'm interested to find out if there is an alternative to Google Trends, or third-party software that lets us process Google Trends data in bulk.
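On the bulk question: the unofficial `pytrends` Python library can pull Google Trends data programmatically, but Trends only compares up to five terms per request, so any large keyword set has to be batched. A minimal, library-agnostic batching sketch (function and variable names are my own, not from any particular tool):

```python
from typing import Iterable, Iterator

def batch_keywords(keywords: Iterable[str], size: int = 5) -> Iterator[list[str]]:
    """Yield keyword groups of at most `size` terms each
    (Google Trends compares a maximum of five terms per request)."""
    batch: list[str] = []
    for kw in keywords:
        batch.append(kw)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # emit the final, possibly short, group
        yield batch
```

Each yielded group would then be handed to whatever Trends client you use, with the caveat that Trends values are relative within a request, so cross-batch comparisons need a shared anchor keyword.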
Algorithm Updates | SEO-SMB
Could we run into issues with duplicate content penalties if we were to borrow product descriptions?
Hello, I work for an online retailer that has the opportunity to add a lot of SKUs to our site in a relatively short amount of time by borrowing content from another site (with their permission). There are a lot of positives to doing this, but one big question we have is what the borrowed content will do to our search rankings (we normally write our own original content in-house for a couple thousand SKUs). Organic search traffic brings in a significant chunk of our business, and we definitely don't want to do something that would jeopardize our rankings. Could we run into duplicate content penalties if we were to use the borrowed product descriptions? Is there a rule of thumb for what proportion of a site should be original content versus duplicate content without running into issues with search rankings? Thank you for your help!
Algorithm Updates | airnwater
Title Tags and Over Optimization Penalty
In the past, it was always considered good practice to put your most important keyword or phrase at the beginning of the title tag, with the company name at the end. Now, according to the over-optimization penalty discussed in the Whiteboard Friday video, it seems better to read more naturally and put the company name at the beginning, with the keyword or phrase following. Am I understanding this correctly?
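To illustrate the two patterns being compared (the company and keyword below are made-up examples):

```html
<!-- Keyword-first: the older convention -->
<title>Blue Widgets | Acme Corp</title>

<!-- Brand-first: the more natural-reading alternative -->
<title>Acme Corp: Blue Widgets</title>
```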
Algorithm Updates | hfranz
Is this a possible Google penalty scenario?
In January we were banned from Google due to duplicate websites caused by a server configuration error by our previous webmaster. Around 100 of our previously inactive domain names were defaulted to the directory of our company website during a server migration, thus showing the exact same site 100 times; obviously Google was not game and banned us. At the end of February we were allowed back into the SERPs after fixing the issue, and we have since steadily regained long-tail keyword phrase rankings, but in Google we are still missing our main keyword phrase. This keyword phrase brings in the bulk of our best traffic, so obviously it's an issue. We've been unable to get above position 21 for this keyword, but in Yahoo, Bing, and Yandex (a Russian SE) we're at positions 3, 3, and 7 respectively. It seems to me there has to be a penalty in effect, as this keyword gets between 10 and 100 times as much traffic in Google as any of the ones we're ranked for. What do you think? EDIT: I should mention that in the 4-5 years prior to the ban, we had ranked between 15th and 4th in Google, 80% of the time on the first page.
Algorithm Updates | ACann
Penalty or Algorithm hit?
After the Google algorithm was updated, my site took a week-long hit in traffic. The traffic came back a week later and was doing well a week after the algorithm change, so I decided I should set up a 301 redirect to make sure I didn't have duplicate content (www vs. non-www). I called my hosting company (I won't name names, but it rhymes with Low Fatty) and they guided me through the supposedly simple process. Well, they had me create a new (different) IP address and do a domain forward (sorry about the bad terminology) to the www version. This was in effect for approximately 2 weeks before I discovered it, and it came with a subsequent massive hit in traffic. I then corrected the problem (I hope) by restoring the old IP address and setting up the .htaccess file to redirect everything to www. A couple of weeks later, my traffic is still in the dumps. In WMT, instead of getting traffic from 10,000 keywords, I'm getting it from only 2k. Is my site the victim of some penalty (I have heard of the sandbox), or is my site simply lower in traffic due to the new algorithm? (I checked analytics data and found that traffic in the US is cut by 50%; it is the same outside the US.) Could someone please tell me what is going on?
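For reference, a minimal sketch of the "redirect everything to www" .htaccess rule described above, assuming Apache with mod_rewrite (domain.com is a placeholder for the actual domain):

```apache
# Redirect non-www requests to the www hostname with a 301,
# so only one version of each URL can be indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]
```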
Algorithm Updates | askthetrainer