How to handle large numbers of comments?
-
First the good news. One site that I've been working on has seen an increase in traffic from 2k/month to 80k!
As well as lots of visitors, the site is also getting lots of comments with one page getting more than 70 comments/day and showing no sign of a slow down! Approximately 3000 comments in total and growing!
What is the best approach for handling this? I'm not talking about the review/approval/response workflow, just the way these comments are presented on the website, taking both SEO and usability into account.
Does anyone have any particular recommendations? Options I've considered are:
- Just show the most recent x comments and ignore the rest. (Nobody is going to read 3000 comments!)
- Paginate comments (risk of duplicate content? Using Ajax could hide long-tail phrases in comments?)
- Show all comments (page load speed is suffering and this is likely to be causing problems for mobile visitors)
How do active comments on a page contribute to an article's freshness?
Any thoughts would be greatly appreciated.
-
Hi Paul. On many CMSs you'll find that additional comments don't change the page's Last-Modified HTTP header, or indeed the posted date in the body of the article. The comments are so far down the page that their perceived importance is going to be pretty low.
That said, active comments do show that there's significant visitor engagement, which has got to be a good thing!
Interesting question about running a poll regarding the order of comments. I think however the order of the comments can work either way depending on the content/context.
For example, "news" type articles with a relatively short shelf-life tend to work better with comments in chronological order. There tend to be fewer comments (which dry up as the article ages), so the ability to follow discussions in the comments is greatly improved.
For "evergreen" content it doesn't work so well. It can be jarring to come to the comments and be presented with one from 5 years ago!
The other SEO issue related to comments (especially out of the box on many CMSs) is the use of links (followed or nofollowed).
If I've got a VERY popular page that's earning lots of real links, having all those links in the comments is going to be eating into the page equity that's available to other pages I'm linking to on my own site. Paginating comments might be one way to address this.
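As a rough way to see how much of a page's outbound linking lives in the comments, you could count followed vs. nofollowed anchors in the comment markup. A minimal sketch (the sample markup below is invented for illustration; a real audit would run over the rendered comment section):

```javascript
// Count followed vs. nofollowed links in a blob of comment HTML.
function auditCommentLinks(html) {
  const anchors = html.match(/<a\s[^>]*>/gi) || [];
  let followed = 0;
  let nofollowed = 0;
  for (const tag of anchors) {
    if (/rel\s*=\s*["'][^"']*nofollow[^"']*["']/i.test(tag)) {
      nofollowed++;
    } else {
      followed++;
    }
  }
  return { total: anchors.length, followed, nofollowed };
}

// Hypothetical comment markup for illustration only.
const sampleComments = `
  <a href="https://example.com/spam">check my site</a>
  <a href="https://example.org/useful" rel="nofollow">a useful resource</a>
`;
console.log(auditCommentLinks(sampleComments)); // → { total: 2, followed: 1, nofollowed: 1 }
```

A regex pass like this is only a heuristic; a proper audit would use a real HTML parser, but it's enough to spot whether a commenting plugin is leaving links followed by default.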
I'm hoping to get some time to make the changes to the page in question - it'll be interesting to see what (if anything) changes!
Thanks!
-
My understanding of the freshness aspect of the algorithm is that just adding or changing content on a page won't help it look more "recent" to the SE's. So new comments aren't really a benefit there.
As a user, I prefer comments that appear in chronological order, but I know many who prefer reverse chrono. That would be a really good question for an interactive poll on the site. If visitors are that engaged with comments, you'd likely get a large enough response to be statistically significant.
The big SEO issue I encounter from large numbers of comments is that all the extra content can dilute the original keyword focus of the page as you created it. Sure, there may be long-tail phrases introduced, but if they start to override the terms you were originally trying to focus on & rank for, things can get messy. Not suggesting dropping comments, obviously, but paginating them with a canonical back to the original post might at least partly help.
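A hypothetical sketch of that pagination-plus-canonical scheme (the `?comment-page=` parameter is made up for illustration; the actual URL structure depends on the CMS):

```javascript
// For any comment page, emit the page's own URL plus a canonical tag
// pointing back at the original post, as suggested above.
function commentPageTags(articleUrl, page) {
  const pageUrl = page > 1 ? `${articleUrl}?comment-page=${page}` : articleUrl;
  return {
    url: pageUrl,
    canonical: `<link rel="canonical" href="${articleUrl}">`,
  };
}

console.log(commentPageTags("https://example.com/post", 3));
// Every comment page canonicalizes to https://example.com/post
```

This keeps the long-tail comment content accessible while signalling which URL should rank, though opinions differ on canonicalizing paginated pages to page one, so treat it as one option rather than a rule.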
I'm also curious whether, if the comments all repeat the target key phrases too frequently, the page could look keyword-stuffed. I have no proof of that, unfortunately, just the suspicion.
And yes, whatever you decide will definitely have to address the page speed issue for visitors.
Paul
-
Thanks Greg, I'd not considered "lazy loading". While this is going to help with loading times, I'm still a little concerned about page size! At least with user-controlled pagination it's the visitor's choice to load more comments...
-
Thanks EGOL. Totally understand your point about respecting visitors who take the time to leave a comment. What makes it harder is that effort is being spent answering questions and engaging visitors in the comments, which gets lost if we arbitrarily cut off comments.
-
Thank you!
I see that now. That looks great. Visitors can get to all comments but pageload time is saved.
-
EGOL, just to clarify...
With Lazy Loading and displaying only 20 comments, more comments get displayed when you scroll down, rather than having the page load all 3000 comments at once.
In other words, the comments won't be hidden, just tucked away and loaded as needed, when scrolling down the page.
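The batching behind that can be sketched in a few lines of plain JavaScript (the batch size of 20 comes from the suggestion above; in a real page an IntersectionObserver or scroll handler on a sentinel element would trigger each load — here the scrolls are simulated so the logic stands alone):

```javascript
const BATCH_SIZE = 20;

// Return the next slice of comments to render.
function nextBatch(allComments, alreadyShown) {
  return allComments.slice(alreadyShown, alreadyShown + BATCH_SIZE);
}

// Simulate a page with 50 comments and three scroll-triggered loads.
const comments = Array.from({ length: 50 }, (_, i) => `comment #${i + 1}`);
let shown = nextBatch(comments, 0).length;   // initial render: 20
shown += nextBatch(comments, shown).length;  // first scroll: 40
shown += nextBatch(comments, shown).length;  // second scroll: all 50
console.log(shown); // → 50
```

Whether the full comment set is in the initial HTML (and merely revealed) or fetched on demand matters for SEO: only the first approach guarantees bots see all 3000 comments.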
http://whatis.techtarget.com/definition/lazy-loading-dynamic-function-loading
Greg
-
I would paginate.
People who leave comments may come back a couple days later to see the comments left after theirs. I think that it would be disrespectful of these dedicated visitors to show only some of the comments.
Take care of these people. They are your most important asset.
-
I would go with your first point.
The more content on the page the better. Even better is user generated content!
Perhaps for user experience, display only 20 comments and wrap the rest under "lazy loading" (suggestion from the developer sitting next to me).
In other words, let the bots see all 3000 comments on the same page, but for user experience so the page doesn't take days to load, incorporate the "lazy loading" feature....
GREG