Content in forum signatures being spidered, does it matter?
-
Hello,
First post here; I just started with SEOmoz, so I hope this is relevant. I've searched a fair bit on this without getting a good answer either way, so I'm interested in some opinions.
The core of the site I run is a forum dedicated to collecting, for the sake of argument let's say cars. A good percentage of the users have signatures which list their collection, for example
1968 Car A - 1987 Car B - 1998 Car D and so on....
These signature lists can run to 20 items or more; some users link each item back to the relevant post on the forum, some don't. The signature appears on every post the user makes.
What I'm noting is
a) SEOmoz is reporting a LOT of links on every forum page, mainly due to these signatures, I'd guess.
and of more interest
b) The content of the signatures is being spidered. So for example, if you search for '1968 Car A' you might get a couple of good results directly relevant to '1968 Car A' from my site, but you also get a lot of other non-relevant threads as results, simply because the user happens to have posted on them. Obviously this is much more apparent in the site's Google-powered search.
So what is the best approach?
Leave it as is? Hide the signatures from bots? Another approach?
-
On reflection, I've taken the suggested approach of using the nocontent class for CSE and made sure all signature links are nofollowed.
I've also gone back to letting bots see the signatures, given the slight concern about a cloaking penalty.
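For reference, the resulting template markup might look something like this. This is a minimal sketch, not the forum's actual template; the class names, URLs, and link text are placeholders:

```html
<!-- Hypothetical forum post template: signature block tagged for Google CSE -->
<div class="post-body">
  ...post content, left fully visible to crawlers...
</div>

<!-- The nocontent class tells Google Custom Search to ignore keywords inside
     this region when ranking results; it has no effect on Google web search. -->
<div class="signature nocontent">
  <!-- rel="nofollow" keeps these repeated per-post signature links from
       passing internal link equity from every thread the member posts in -->
  <a href="/forum/thread-1968-car-a" rel="nofollow">1968 Car A</a> -
  <a href="/forum/thread-1987-car-b" rel="nofollow">1987 Car B</a>
</div>
```

Note the signature markup is still served to bots, so there is no cloaking involved; only the CSE ranking treatment and the link attributes change.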
Thanks for your feedback.
-
Rutteger,
If that forum template really removes signatures ONLY for bots, then yes, that is cloaking. I wouldn't do that.
The info above was for solving the internal site search problem only, not for Google web search.
However, the additional tips I provided should help with web search. Other than that I wouldn't be too terribly worried about it.
-
Thanks for taking the time to respond.
Although it's not 100% clear, I'm assuming this only applies to Custom Search rather than Google proper?
I've never put much time into SEO over the years; I always hoped Google would figure stuff out. Signature content is definitely being indexed.
For the moment I've taken the approach used by one of the 'SEO' forum templates: not showing signatures to bots. Will see how it goes. I'm a bit reluctant to do this because I'm slightly worried it might be seen as cloaking, but I hope it'll work out long term...
-
Hello Rutteger,
Regarding SiteSearch, this is from Google Support:
Exclude boilerplate content
"If your pages have regions containing boilerplate content that's not relevant to the main content of the page, you can identify it using the
nocontent
class attribute. When Google Custom Search sees this tag, we'll ignore any keywords it contains and won't take them into account when calculating ranking for your Custom Search engine... To use thenocontent
class attribute, you'll need to tag the boilerplate content, and then modify your context file. This tells Google that you're using thenocontent
class attribute."Read the rest here:
http://support.google.com/customsearch/bin/answer.py?hl=en&answer=2631036I think it is a great idea for you to have members use that field to showcase their collections in this way. It keeps the signature content relevant (at least at the forum level, if not for the thread) and increases internal linking.
Additionally, I would do the following:
- Limit signature privileges to members with a certain number of posts and/or other metrics (e.g. Kudos, points...)
- Do not allow external linking from forum signatures
I will leave this question open as a discussion in case anyone else has first-hand experience with traffic / ranking changes before and after removing signatures from a forum - or with handling the situation in some other way.
Lastly, Google is pretty good at recognizing boilerplate content. They have been dealing with forum signatures for years, and since the area is highly prone to spam I would imagine they know what the signatures are on most major forum platforms. So I wouldn't fret over it too much unless it is clearly causing you problems in the SERPs.
Good luck!