How to handle large numbers of comments?
-
First the good news. One site that I've been working on has seen an increase in traffic from 2k/month to 80k!
As well as lots of visitors, the site is also getting lots of comments, with one page getting more than 70 comments/day and showing no sign of slowing down! Approximately 3000 comments in total and growing!
What is the best approach for handling this? I'm not talking about review/approval/response, but about the way these comments are presented on the website, taking both SEO and usability into account.
Does anyone have any particular recommendations? Options I've considered are:
- Just show the most recent x comments and ignore the rest. (Nobody is going to read 3000 comments!)
- Paginate comments (risk of duplicate content? Loading comments via Ajax could hide the long-tail phrases they contain from search engines?)
- Show all comments (page load speed is suffering and this is likely to be causing problems for mobile visitors)
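For the pagination option, the underlying mechanics are just slicing and URL generation. A minimal TypeScript sketch (all names and the `?comment-page=` URL pattern are illustrative, not from any particular CMS):

```typescript
// Split a comment list into pages and build the URLs a crawler would see.
// All names here are illustrative, not from any particular CMS.

interface PostComment {
  id: number;
  body: string;
}

const PAGE_SIZE = 20;

// Number of pages needed for a given comment count (at least one).
function pageCount(total: number, pageSize: number = PAGE_SIZE): number {
  return Math.max(1, Math.ceil(total / pageSize));
}

// Comments belonging to a given 1-based page, oldest first.
function commentsForPage(
  all: PostComment[],
  page: number,
  pageSize: number = PAGE_SIZE
): PostComment[] {
  const start = (page - 1) * pageSize;
  return all.slice(start, start + pageSize);
}

// URL for a comment page; page 1 is the article itself, so paginated
// pages don't compete with it for the same query.
function commentPageUrl(articleUrl: string, page: number): string {
  return page === 1 ? articleUrl : `${articleUrl}?comment-page=${page}`;
}
```

At 20 comments per page the 3000-comment article becomes 150 pages, which is where the duplicate-content question above starts to matter.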
How do active comments on a page contribute to an article's freshness?
Any thoughts would be greatly appreciated.
-
Hi Paul. On many CMSs you'll find that additional comments don't change the page's Last-Modified HTTP header, or indeed the posted date in the body of the article. The comments are so far down the page that their perceived importance is going to be pretty low.
That said, active comments do show that there's significant visitor engagement, which has got to be a good thing!
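For anyone who does want new comments reflected in that header, the logic is just taking the later of the post date and the newest comment date. A sketch, assuming hypothetical function names (no CMS's actual API):

```typescript
// Compute a Last-Modified value that takes comments into account:
// the later of the post date and the newest comment date. Whether you
// *want* comments to bump this date is a judgment call; the function
// names are assumptions, not any CMS's actual API.

function lastModified(postDate: Date, commentDates: Date[]): Date {
  return commentDates.reduce((max, d) => (d > max ? d : max), postDate);
}

// Value for the HTTP header, e.g. "Tue, 01 Jan 2013 00:00:00 GMT".
function lastModifiedHeader(postDate: Date, commentDates: Date[]): string {
  return lastModified(postDate, commentDates).toUTCString();
}
```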
Interesting question about running a poll regarding the order of comments. I think however the order of the comments can work either way depending on the content/context.
For example, "news" type articles with a relatively short shelf-life tend to work better with comments in chronological order. There tend to be fewer comments (which dry up as the article ages), so the ability to follow discussions in the comments is greatly improved.
For "evergreen" content it doesn't work so well. It can be jarring to come to the comments and be presented with one from 5 years ago!
The other SEO issue related to comments (especially out of the box on many CMSs) is the use of links (followed or nofollowed).
If I've got a VERY popular page that's earning lots of real links, having all those links in the comments is going to eat into the page equity available to the other pages I'm linking to on my own site. Paginating comments might be one way to address this?
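Nofollowing comment links is usually a one-pass transform at render time. A rough sketch (a single regex like this is fragile; a real implementation should run the comment body through an HTML parser or sanitizer, and the function name is my own):

```typescript
// Add rel="nofollow" to links inside comment HTML so they stop leaking
// page equity. A single regex pass like this is only a sketch; a real
// implementation should use an HTML parser or sanitizer instead.

function nofollowCommentLinks(commentHtml: string): string {
  // Only touch <a> tags that don't already declare a rel attribute.
  return commentHtml.replace(/<a\s+(?![^>]*\brel=)/gi, '<a rel="nofollow" ');
}
```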
I'm hoping to get some time to make the changes to the page in question - it'll be interesting to see what (if anything) changes!
Thanks!
-
My understanding of the freshness aspect of the algorithm is that just adding or changing content on a page won't help it look more "recent" to the SE's. So new comments aren't really a benefit there.
As a user, I prefer comments that appear in chronological order, but I know many who prefer reverse chrono. That would be a really good question for an interactive poll on the site. If visitors are that engaged with comments, you'd likely get a large enough response to be statistically significant.
The big SEO issue I encounter from large numbers of comments is that all the extra content can dilute the original keyword focus of the page as you created it. Sure, there may be long-tail phrases introduced, but if they start to override the terms you were originally trying to focus on & rank for, things can get messy. Not suggesting dropping comments, obviously, but paginating them with a canonical back to the original post might at least partly help.
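Emitting that canonical on each paginated comment page is a one-liner at the template level. A sketch (tag shape only; the function name is illustrative, and whether search engines honor a canonical across paginated pages is the open question):

```typescript
// Canonical tag for a paginated comment page, pointing back at the
// original post as suggested above. Tag shape only; the function name
// is illustrative.

function canonicalTag(articleUrl: string): string {
  return `<link rel="canonical" href="${articleUrl}" />`;
}
```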
I'm also curious whether, if the comments all repeat the target key phrases too frequently, the page could look keyword stuffed. I have no proof of that, unfortunately, just the suspicion.
And yeah, whatever you decide will definitely have to address the page speed issue for visitors.
Paul
-
Thanks Greg, I'd not considered "lazy loading". While this is going to help with loading times, I'm still a little concerned about page size! At least with user-controlled pagination it's the visitor's choice to load more comments...
-
Thanks EGOL. Totally understand your point about respecting visitors who take the time to leave a comment. What makes it harder is that effort is being spent answering questions/engaging visitors in the comments, which gets lost if we arbitrarily cut off comments.
-
Thank you!
I see that now. That looks great. Visitors can get to all comments but pageload time is saved.
-
EGOL, just to clarify...
With Lazy Loading and displaying only 20 comments, more comments get displayed when you scroll down, rather than having the page load all 3000 comments at once.
In other words, the comments won't be hidden, just tucked away and loaded as needed when scrolling down the page.
http://whatis.techtarget.com/definition/lazy-loading-dynamic-function-loading
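A rough TypeScript sketch of the idea: the batching logic is a pure function, and the scroll trigger uses the browser's IntersectionObserver (guarded so the sketch is harmless outside a browser). Batch size and names are illustrative:

```typescript
// Lazy-load comments in batches as the visitor scrolls, instead of
// rendering all 3000 up front. Batch size and function names are
// illustrative.

const BATCH_SIZE = 20;

// Indexes of the comments to append once `loaded` of `total` are shown.
function nextBatch(total: number, loaded: number, batch: number = BATCH_SIZE): number[] {
  const end = Math.min(total, loaded + batch);
  const ids: number[] = [];
  for (let i = loaded; i < end; i++) ids.push(i);
  return ids;
}

// Browser-side trigger: when a sentinel element at the bottom of the
// comment list scrolls into view, load the next batch. Guarded because
// IntersectionObserver is a DOM API that only exists in browsers.
function watchSentinel(sentinel: unknown, onVisible: () => void): void {
  const IO = (globalThis as any).IntersectionObserver;
  if (!IO) return;
  const observer = new IO((entries: Array<{ isIntersecting: boolean }>) => {
    if (entries.some((e) => e.isIntersecting)) onVisible();
  });
  observer.observe(sentinel);
}
```

Each time the sentinel becomes visible, the page fetches the `nextBatch` indexes and appends those comments, so only visitors who actually scroll pay the cost of the full thread.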
Greg
-
I would paginate.
People who leave comments may come back a couple days later to see the comments left after theirs. I think that it would be disrespectful of these dedicated visitors to show only some of the comments.
Take care of these people. They are your most important asset.
-
I would go with your first point.
The more content on the page the better. Even better is user generated content!
Perhaps for user experience, display only 20 comments and wrap the rest under "lazy loading" (suggestion from the developer sitting next to me).
In other words, let the bots see all 3000 comments on the same page, but for user experience so the page doesn't take days to load, incorporate the "lazy loading" feature....
GREG