Content in forum signatures being spidered, does it matter?
-
Hello,
First post here; I just started with SEOmoz, so I hope this is relevant. I've searched a fair bit on this without getting a good answer either way, so I'm interested in some opinions.
The core of the site I run is a forum dedicated to collecting; for the sake of argument, let's say cars. A good percentage of the users have signatures that list their collection, for example:
1968 Car A - 1987 Car B - 1998 Car D and so on...
These signature lists can run to 20 items or more; some link each item back to the relevant post on the forum, some don't. The signatures appear on every post the user makes.
What I'm noticing is:
a) SEOmoz is reporting a LOT of links on every forum page, due mainly to these signatures, I guess.
And, of more interest:
b) The content of the signatures is being spidered. So, for example, if you search for '1968 Car A' you might get a couple of good results directly relevant to '1968 Car A' from my site, but you also get a lot of non-relevant threads as results, simply because the user happens to have posted in them. Obviously this is much more apparent in the site's Google search.
So what is the best approach?
Leave as is? Hide the signatures from the BOTs? Another approach?
-
On reflection, I've taken the suggested approach of using the nocontent tag for the CSE and ensured all signature links are nofollow.
I've also made sure bots can still see the signatures, due to a slight concern about a cloaking penalty.
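For anyone implementing the same, here's a minimal sketch of a signature link with nofollow applied (the URL and anchor text are illustrative, not from my actual site):

```html
<!-- A signature item linking back to a forum post; rel="nofollow"
     tells crawlers not to follow the link or pass equity through it -->
<a href="/forum/thread/1234" rel="nofollow">1968 Car A</a>
```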
Thanks for your feedback.
-
Rutteger,
If that forum template really removes signatures ONLY for bots, then yes, that is cloaking. I wouldn't do that.
The info above was for solving the internal site search problem only, not for Google web search.
However, the additional tips I provided should help with web search. Other than that I wouldn't be too terribly worried about it.
-
Thanks for taking the time to respond.
Although it's not 100% clear, I'm assuming this only applies to Custom Search rather than Google proper?
I've never put much time into SEO over the years; I always hoped Google would figure stuff out. The signature content is definitely being indexed.
For the moment I've taken the approach of one of the 'SEO' forum templates and am not showing signatures to the bots. We'll see how it goes. I'm a bit reluctant to do this because I'm slightly worried it might be seen as cloaking, but I hope it'll work out long term...
-
Hello Rutteger,
Regarding SiteSearch, this is from Google Support:
Exclude boilerplate content
"If your pages have regions containing boilerplate content that's not relevant to the main content of the page, you can identify it using the nocontent class attribute. When Google Custom Search sees this tag, we'll ignore any keywords it contains and won't take them into account when calculating ranking for your Custom Search engine... To use the nocontent class attribute, you'll need to tag the boilerplate content, and then modify your context file. This tells Google that you're using the nocontent class attribute."
Read the rest here:
http://support.google.com/customsearch/bin/answer.py?hl=en&answer=2631036
I think it is a great idea for you to have members use that field to showcase their collections in this way. It keeps the signature content relevant (at least at the forum level, if not for the thread) and increases internal linking.
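Per that documentation, the tagging step amounts to wrapping the boilerplate region in an element carrying the nocontent class; a minimal sketch (the div and its contents are illustrative):

```html
<!-- Google Custom Search ignores keywords inside elements
     tagged with class="nocontent" when calculating ranking -->
<div class="nocontent">
  1968 Car A - 1987 Car B - 1998 Car D
</div>
```

Note that per the quoted support article, tagging the HTML is only half the step; you also need to update your Custom Search context file to tell Google you're using the attribute.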
Additionally, I would do the following:
- Limit signature privileges to members with a certain number of posts and/or other metrics (e.g. Kudos, points...)
- Do not allow external linking from forum signatures
I will leave this question open as a discussion in case anyone else has first-hand experience with traffic / ranking changes before and after removing signatures from a forum - or with handling the situation in some other way.
Lastly, Google is pretty good at recognizing boilerplate content. They have been dealing with forum signatures for years, and since the area is highly prone to spam, I would imagine they know what the signatures are on most major forum platforms. Thus, I wouldn't fret over it too much unless it is clearly causing you problems in the SERPs.
Good luck!