How to handle large numbers of comments?
-
First the good news. One site that I've been working on has seen an increase in traffic from 2k/month to 80k!
As well as lots of visitors, the site is also getting lots of comments, with one page getting more than 70 comments/day and showing no sign of slowing down! Approximately 3,000 comments in total and growing!
What is the best approach for handling this? I'm not talking about review/approval/responses, just the way these comments are presented on the website, taking both SEO and usability into account.
Does anyone have any particular recommendations? Options I've considered are:
- Just show the most recent x comments and ignore the rest. (Nobody is going to read 3000 comments!)
- Paginate comments (risk of duplicate content? Using Ajax could hide long-tail phrases in comments?)
- Show all comments (page load speed is suffering and this is likely to be causing problems for mobile visitors)
How do active comments on a page contribute to an article's freshness?
Any thoughts would be greatly appreciated.
-
Hi Paul. On many CMSs you'll find that additional comments don't change the page's Last-Modified HTTP header or indeed the posted date in the body of the article. The comments are so far down the page that their perceived importance is going to be pretty low.
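For anyone curious what that looks like in practice, here's a tiny sketch of the idea (the types and field names are made up for illustration, not any particular CMS's schema):

```typescript
// Hypothetical types, for illustration only (not a real CMS schema).
interface BlogComment {
  body: string;
  postedAt: Date;
}

interface Article {
  publishedAt: Date;
  editedAt?: Date;
  comments: BlogComment[];
}

// The value a typical CMS would send in the Last-Modified header.
// It tracks edits to the article itself; article.comments is deliberately
// ignored, which is why a stream of new comments doesn't make the page
// look any "fresher" to a crawler checking that header.
function lastModifiedHeader(article: Article): string {
  const modified = article.editedAt ?? article.publishedAt;
  return modified.toUTCString();
}
```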
That said, active comments do show that there's significant visitor engagement, which has got to be a good thing!
Interesting question about running a poll regarding the order of comments. I think, however, the order of comments can work either way depending on the content/context.
For example, "news" type articles with a relatively short shelf-life tend to work better with comments in chronological order. There tend to be fewer comments (which dry-up as the article ages) so the ability to follow disussions in the comments is greatly improved.
For "ever-green" content it doesn't work so well. It can be jarring to come to the comments and be presented with one from 5 years ago!
The other SEO issue related to comments (especially out of the box on many CMSs) is the use of links (followed or nofollowed).
If I've got a VERY popular page that's earning lots of real links, having all those links in the comments is going to eat into the page equity that's available to the other pages I'm linking to on my own site. Paginating comments might be one way to address this?
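For illustration only, something like the naive pass below is what I have in mind (the regex approach is just a sketch; most CMSs, WordPress included, can nofollow comment links out of the box, and a real implementation should use a proper HTML parser):

```typescript
// Naive sketch: add rel="nofollow" to any link in a comment's HTML that
// doesn't already carry a rel attribute, so comment links stop leaking
// page equity. A production version should use a real HTML parser.
function nofollowCommentLinks(commentHtml: string): string {
  return commentHtml.replace(/<a\s+(?![^>]*\brel=)/gi, '<a rel="nofollow" ');
}

// Example:
//   nofollowCommentLinks('Nice post! <a href="http://example.com">my site</a>')
//   -> 'Nice post! <a rel="nofollow" href="http://example.com">my site</a>'
```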
I'm hoping to get some time to make the changes to the page in question - it'll be interesting to see what (if anything) changes!
Thanks!
-
My understanding of the freshness aspect of the algorithm is that just adding or changing content on a page won't help it look more "recent" to the search engines. So new comments aren't really a benefit there.
As a user, I prefer comments that appear in chronological order, but I know many who prefer reverse chrono. That would be a really good question for an interactive poll on the site. If visitors are that engaged with comments, you'd likely get a large enough response to be statistically significant.
The big SEO issue I encounter from large numbers of comments is that all the extra content can dilute the original keyword focus of the page as you created it. Sure, there may be long-tail phrases introduced, but if they start to override the terms you were originally trying to focus on & rank for, things can get messy. Not suggesting dropping comments, obviously, but paginating them with a canonical back to the original post might at least partly help.
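Roughly what I mean, as a sketch (the ?comment-page parameter and helper name are invented for the example; your CMS's pagination scheme will differ):

```typescript
// Sketch: each paginated comment URL (?comment-page=2, 3, ...) gets a
// canonical tag pointing back at the main article, so the extra comment
// pages don't compete with or dilute the original post.
function commentPageHeadTags(articleUrl: string, page: number): string {
  const pageUrl = page > 1 ? `${articleUrl}?comment-page=${page}` : articleUrl;
  const canonical = `<link rel="canonical" href="${articleUrl}" />`;
  return `<!-- serving ${pageUrl} -->\n${canonical}`;
}

// commentPageHeadTags("https://example.com/popular-post/", 3) returns the
// head tag for page 3 of the comments, still pointing at the post itself.
```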
I'm also curious whether, if the comments all repeat the target key phrases too frequently, the page could look keyword-stuffed. I have no proof of that, unfortunately, just the suspicion.
And yeah, whatever you decide will definitely have to address the page speed issue for visitors.
Paul
-
Thanks Greg, I hadn't considered "lazy loading". While this is going to help with loading times, I'm still a little concerned about page size! At least with user-controlled pagination it's their choice to load more comments...
-
Thanks EGOL. Totally understand your point about respecting visitors who take the time to leave a comment. What makes it harder is that effort is being spent answering questions/engaging visitors in the comments, which gets lost if we arbitrarily cut off comments.
-
Thank you!
I see that now. That looks great. Visitors can get to all comments but pageload time is saved.
-
EGOL, just to clarify...
With Lazy Loading and displaying only 20 comments, more comments get displayed when you scroll down, rather than having the page load all 3000 comments at once.
In other words, the comments won't be hidden, just tucked away and loaded as needed when scrolling down the page.
http://whatis.techtarget.com/definition/lazy-loading-dynamic-function-loading
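If it helps to picture it, here's a very rough browser-side sketch of that pattern, assuming all the comments are already in the markup with everything past the first batch hidden (class names, batch size and the sentinel element are just placeholders, not code from any particular plugin):

```typescript
// All comments are present in the HTML (so crawlers reading the raw page
// see the full thread), but only the first batch is visible. A sentinel
// element after the visible comments reveals the next batch whenever the
// visitor scrolls near it.
const BATCH_SIZE = 20;

const hiddenComments = Array.from(
  document.querySelectorAll<HTMLElement>(".comment.is-hidden")
);
const sentinel = document.querySelector("#comments-sentinel");

if (sentinel) {
  const observer = new IntersectionObserver((entries) => {
    if (!entries.some((entry) => entry.isIntersecting)) return;

    // Reveal the next batch of comments.
    hiddenComments
      .splice(0, BATCH_SIZE)
      .forEach((el) => el.classList.remove("is-hidden"));

    // Nothing left to reveal, so stop watching the sentinel.
    if (hiddenComments.length === 0) observer.disconnect();
  });
  observer.observe(sentinel);
}
```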
Greg
-
I would paginate.
People who leave comments may come back a couple days later to see the comments left after theirs. I think that it would be disrespectful of these dedicated visitors to show only some of the comments.
Take care of these people. They are your most important asset.
-
I would go with your first point.
The more content on the page, the better. Even better is user-generated content!
Perhaps, for user experience, display only 20 comments and wrap the rest under "lazy loading" (suggestion from the developer sitting next to me).
In other words, let the bots see all 3000 comments on the same page, but, so the page doesn't take days to load for users, incorporate the "lazy loading" feature...
GREG