Content in forum signatures being spidered, does it matter?
-
Hello,
First post here; I just started with SEOmoz, so I hope this is relevant. I've searched a fair bit on this without getting a good answer either way, so I'm interested in some opinions.
The core of the site I run is a forum dedicated to collecting; for the sake of argument, let's say cars. A good percentage of the users have signatures which list their collection, for example:
1968 Car A - 1987 Car B - 1998 Car D and so on...
These signature lists can run to 20 items or more; some users link the signature entries back to the relevant posts on the forum, some don't. The signatures appear on every post the user makes.
What I'm noticing is:
a) SEOmoz is reporting a LOT of links on every forum page, due mainly to these signatures, I'd guess.
and of more interest
b) The content of the signatures is being spidered. For example, if you search for '1968 Car A' you might get a couple of good results directly relevant to '1968 Car A' from my site, but you also get a lot of non-relevant threads as results simply because the user happens to have posted in them. Obviously this is much more apparent in the site's Google search.
So what is the best approach?
Leave it as is? Hide the signatures from the bots? Another approach?
-
On reflection I've taken the suggested approach of using the nocontent class for CSE and have ensured all signature links are nofollow.
I've also made sure bots can still see the signatures, given a slight concern about a cloaking penalty.
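For anyone wanting to do the same, here's a minimal sketch of what those two template changes might look like. The surrounding markup is hypothetical; only the nocontent class and the rel="nofollow" attribute come from Google's documentation:

```html
<!-- Hypothetical forum post template. The signature block is wrapped in
     Google CSE's "nocontent" class so Custom Search ignores its keywords,
     and each signature link carries rel="nofollow". Bots can still see
     the signature, so nothing is being cloaked. -->
<div class="post-body">
  ...the member's actual post...
</div>
<div class="signature nocontent">
  1968 Car A -
  <a href="/threads/1987-car-b" rel="nofollow">1987 Car B</a> -
  1998 Car D
</div>
```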
Thanks for your feedback.
-
Rutteger,
If that forum template really removes signatures ONLY for bots, then yes that is cloaking. I wouldn't do that.
The info above was for solving the internal site search problem only, not for Google web search.
However, the additional tips I provided should help with web search. Other than that I wouldn't be too terribly worried about it.
-
Thanks for taking the time to respond.
Although it's not 100% clear, I'm assuming this only applies to Custom Search rather than Google proper?
Never put much time into SEO over the years; I always hoped Google would figure stuff out. Signature content is definitely being indexed.
For the moment I've taken the approach used by one of the 'SEO' forum templates and am not showing signatures to the bots. Will see how it goes. Bit reluctant to do this because I'm slightly worried it might be seen as cloaking, but I hope it'll work out long term...
-
Hello Rutteger,
Regarding site search, this is from Google Support:
Exclude boilerplate content
"If your pages have regions containing boilerplate content that's not relevant to the main content of the page, you can identify it using the nocontent class attribute. When Google Custom Search sees this tag, we'll ignore any keywords it contains and won't take them into account when calculating ranking for your Custom Search engine... To use the nocontent class attribute, you'll need to tag the boilerplate content, and then modify your context file. This tells Google that you're using the nocontent class attribute."
Read the rest here: http://support.google.com/customsearch/bin/answer.py?hl=en&answer=2631036
I think it is a great idea for you to have members use that field to showcase their collections in this way. It keeps the signature content relevant (at least at the forum level, if not for the thread) and increases internal linking.
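To make the quoted instructions concrete, the tagging step might look like this (illustrative markup only; the matching context-file change is described in the linked article and not reproduced here):

```html
<!-- Wrap the boilerplate region (here, a forum signature) in the
     nocontent class. Google Custom Search ignores keywords inside this
     element when ranking results; Google web search is unaffected. -->
<div class="nocontent">
  1968 Car A - 1987 Car B - 1998 Car D
</div>
```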
Additionally, I would do the following:
- Limit signature privileges to members with a certain number of posts and/or other metrics (e.g. Kudos, points...)
- Do not allow external linking from forum signatures
I will leave this question open as a discussion in case anyone else has first-hand experience with traffic / ranking changes before and after removing signatures from a forum - or with handling the situation in some other way.
Lastly, Google is pretty good at recognizing boilerplate content. They have been dealing with forum signatures for years, and since the area is highly prone to spam I would imagine they know what the signatures are on most major forum platforms. Thus, I wouldn't fret over it too much unless it is clearly causing you problems in the SERPs.
Good luck!