Content in forum signatures being spidered, does it matter?
-
Hello,
First post here; I've just started with SEOmoz, so I hope this is relevant. I've searched a fair bit on this without getting a good answer either way, so I'm interested to get some opinions.
The core of the site I run is a forum dedicated to collecting; for the sake of argument, let's say cars. A good percentage of the users have signatures listing their collection, for example:
1968 Car A - 1987 Car B - 1998 Car D and so on...
These signature lists can run to 20 items or more; some users hotlink the signatures back to the relevant posts on the forum, some don't. The signature appears on every post the user makes.
What I'm noting is
a) SEOmoz is reporting a LOT of links on every forum page, due mainly to these signatures, I'd guess.
and of more interest
b) The content of the signatures is being spidered. So, for example, if you search for '1968 Car A' you might get a couple of good results directly relevant to '1968 Car A' from my site, but you also get a lot of non-relevant threads as results simply because the user happens to have posted in them. Obviously this is much more apparent in the site's Google search.
So what is the best approach?
Leave as is? Hide the signatures from the BOTs? Another approach?
-
On reflection I've taken the suggested approach of using the nocontent class for CSE and have ensured all signature links are nofollowed.
I've also made sure bots can once again see the signatures, due to a slight concern about a cloaking penalty.
Thanks for your feedback.
-
Rutteger,
If that forum template really removes signatures ONLY for bots, then yes, that is cloaking. I wouldn't do that.
The info above was for solving the internal site search problem only, not for Google web search.
However, the additional tips I provided should help with web search. Other than that I wouldn't be too terribly worried about it.
-
Thanks for taking the time to respond.
Although it's not 100% clear, I'm assuming this only applies to Custom Search rather than Google proper?
I've never put much time into SEO over the years; I always hoped Google would figure stuff out. Signature content is definitely being indexed.
For the moment I've taken the approach of one of the 'SEO' forum templates, which hides signatures from the bots. Will see how it goes. I'm a bit reluctant to do this because I'm slightly worried it might be seen as cloaking, but I hope it'll work out long term...
-
Hello Rutteger,
Regarding SiteSearch, this is from Google Support:
Exclude boilerplate content
"If your pages have regions containing boilerplate content that's not relevant to the main content of the page, you can identify it using the nocontent class attribute. When Google Custom Search sees this tag, we'll ignore any keywords it contains and won't take them into account when calculating ranking for your Custom Search engine... To use the nocontent class attribute, you'll need to tag the boilerplate content, and then modify your context file. This tells Google that you're using the nocontent class attribute."
Read the rest here: http://support.google.com/customsearch/bin/answer.py?hl=en&answer=2631036
I think it is a great idea for you to have members use that field to showcase their collections in this way. It keeps the signature content relevant (at least at the forum level, if not for the thread) and increases internal linking.
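The markup side of this is small. Here is a minimal sketch of how a forum template might wrap signatures, assuming a hypothetical render_signature helper; the nocontent class is the real CSE attribute described above, but the function name, URLs, and structure are purely illustrative:

```python
from html import escape

def render_signature(items):
    """Render a forum signature as boilerplate that Custom Search will skip.

    `items` is a list of (label, url) pairs, e.g. ("1968 Car A", "/threads/123").
    The wrapping div carries class="nocontent" so Google Custom Search ignores
    its keywords; each link is rel="nofollow" so the collection list doesn't
    spray followed internal links across every thread the member posts in.
    """
    links = " - ".join(
        f'<a href="{escape(url)}" rel="nofollow">{escape(label)}</a>'
        for label, url in items
    )
    return f'<div class="nocontent">{links}</div>'

sig = render_signature([("1968 Car A", "/threads/123"), ("1987 Car B", "/threads/456")])
```

Because the signature is still served to bots and users alike, there is no cloaking concern; only the CSE ranking treatment changes.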
Additionally, I would do the following:
- Limit signature privileges to members with a certain number of posts and/or other metrics (e.g. Kudos, points...)
- Do not allow external linking from forum signatures
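Both rules are easy to enforce in template logic. A minimal sketch, assuming a hypothetical post-count threshold and helper names (any real forum platform will have its own hooks for this):

```python
from urllib.parse import urlparse

MIN_POSTS_FOR_SIGNATURE = 50  # illustrative threshold; tune for your forum

def show_signature(post_count: int) -> bool:
    """Grant signature privileges only to established members."""
    return post_count >= MIN_POSTS_FOR_SIGNATURE

def is_internal_link(url: str, site_host: str = "example-forum.com") -> bool:
    """Allow only relative URLs or links to our own host in signatures."""
    host = urlparse(url).netloc
    return host in ("", site_host)
```

Gating signatures this way also cuts down on drive-by spam accounts, which is the main reason signature areas attract scrutiny in the first place.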
I will leave this question open as a discussion in case anyone else has first-hand experience with traffic / ranking changes before and after removing signatures from a forum - or with handling the situation in some other way.
Lastly, Google is pretty good at recognizing boilerplate content. They have been dealing with forum signatures for years, and since the area is highly prone to spam I would imagine they know what signatures look like on most major forum platforms. Thus, I wouldn't fret over it too much unless it is clearly causing you problems in the SERPs.
Good luck!