How to handle large numbers of comments?
-
First, the good news: one site I've been working on has seen traffic increase from 2k/month to 80k!
As well as lots of visitors, the site is also getting lots of comments, with one page receiving more than 70 comments/day and showing no sign of a slowdown! There are approximately 3,000 comments in total and growing!
What is the best approach for handling this? I'm not talking about review/approval/responses, just about the way these comments are presented on the website, taking both SEO and usability into account.
Does anyone have any particular recommendations? Options I've considered are:
- Just show the most recent x comments and ignore the rest. (Nobody is going to read 3000 comments!)
- Paginate comments (risk of duplicate content? Using Ajax could hide long-tail phrases in comments?)
- Show all comments (page load speed is suffering and this is likely to be causing problems for mobile visitors)
How do active comments on a page contribute to an article's freshness?
Any thoughts would be greatly appreciated.
-
Hi Paul. On many CMSs you'll find that additional comments don't change the page's Last-Modified HTTP header, or indeed the posted date in the body of the article. The comments are so far down the page that their perceived importance is going to be pretty low.
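If you want to check what your own CMS does, one quick way (run from the browser console on your own site; the URL below is just a placeholder) is:

```html
<script>
  // Does the article's Last-Modified header move when new comments are posted?
  // Compare the logged value before and after a comment goes live.
  fetch('https://www.example.com/my-article', { method: 'HEAD' })
    .then((res) => console.log(res.headers.get('last-modified')));
</script>
```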
That said, active comments do show that there's significant visitor engagement, which has got to be a good thing!
Interesting question about running a poll regarding the order of comments. I think, however, that the order of the comments can work either way depending on the content/context.
For example, "news" type articles with a relatively short shelf-life tend to work better with comments in chronological order. There tend to be fewer comments (which dry up as the article ages), so the ability to follow discussions in the comments is greatly improved.
For "evergreen" content it doesn't work so well. It can be jarring to arrive at the comments and be presented with one from 5 years ago!
The other SEO issue related to comments (especially out of the box on many CMSs) is the use of links (followed or nofollowed).
If I've got a VERY popular page that's earning lots of real links, having all those links in the comments is going to eat into the page equity available to the other pages I'm linking to on my own site. Paginating comments might be one way to address this?
I'm hoping to get some time to make the changes to the page in question - it'll be interesting to see what (if anything) changes!
Thanks!
-
My understanding of the freshness aspect of the algorithm is that just adding or changing content on a page won't help it look more "recent" to the search engines. So new comments aren't really a benefit there.
As a user, I prefer comments that appear in chronological order, but I know many who prefer reverse chrono. That would be a really good question for an interactive poll on the site. If visitors are that engaged with comments, you'd likely get a large enough response to be statistically significant.
The big SEO issue I encounter from large numbers of comments is that all the extra content can dilute the original keyword focus of the page as you created it. Sure, there may be long-tail phrases introduced, but if they start to override the terms you were originally trying to focus on & rank for, things can get messy. Not suggesting dropping comments, obviously, but paginating them with a canonical back to the original post might at least partly help.
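To illustrate, each paginated comment URL would carry a canonical pointing back at the post, something like this (the URL pattern here is just an example):

```html
<!-- Served on a paginated comment page such as /my-article?comment-page=3 -->
<link rel="canonical" href="https://www.example.com/my-article" />
```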
I'm also curious whether, if the comments all repeat the target key phrases too frequently, the page could look keyword stuffed. I have no proof of that, unfortunately, just the suspicion.
And yeah, whatever you decide will definitely have to address the page speed issue for visitors.
Paul
-
Thanks Greg, I'd not considered "lazy loading". While this is going to help with loading times, I'm still a little concerned about page size! At least with user-controlled pagination it's their choice to load more comments...
-
Thanks EGOL. Totally understand your point about respecting visitors who take the time to leave a comment. What makes it harder is that effort is being spent answering questions/engaging visitors in the comments, which gets lost if we arbitrarily cut off comments.
-
Thank you!
I see that now. That looks great. Visitors can get to all comments but page load time is saved.
-
EGOL, just to clarify...
With lazy loading and displaying only 20 comments, more comments get displayed as you scroll down, rather than the page loading all 3000 comments at once.
In other words, the comments won't be hidden, just tucked away and loaded as needed as you scroll down the page.
http://whatis.techtarget.com/definition/lazy-loading-dynamic-function-loading
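Here's a rough sketch of the fetch-on-scroll pattern using the IntersectionObserver API; the /comments endpoint, post ID, and page size are all made up for illustration:

```html
<!-- First 20 comments are rendered by the server; the rest load on scroll. -->
<ol id="comments">
  <!-- ...first 20 comments... -->
</ol>
<div id="comments-sentinel"></div>

<script>
  const list = document.getElementById('comments');
  const sentinel = document.getElementById('comments-sentinel');
  let nextPage = 2;    // page 1 came with the initial HTML
  let loading = false; // avoid overlapping requests

  const observer = new IntersectionObserver(async ([entry]) => {
    if (!entry.isIntersecting || loading) return;
    loading = true;

    // Fetch the next batch of 20 comments as an HTML fragment.
    const res = await fetch(`/comments?post=123&page=${nextPage}&per_page=20`);
    const html = await res.text();

    if (html.trim() === '') {
      observer.disconnect(); // no more comments to load
      return;
    }

    list.insertAdjacentHTML('beforeend', html);
    nextPage += 1;
    loading = false;

    // Re-observe so the callback fires again if the sentinel
    // is still in view after the new batch is added.
    observer.unobserve(sentinel);
    observer.observe(sentinel);
  });

  observer.observe(sentinel);
</script>
```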
Greg
-
I would paginate.
People who leave comments may come back a couple days later to see the comments left after theirs. I think that it would be disrespectful of these dedicated visitors to show only some of the comments.
Take care of these people. They are your most important asset.
-
I would go with your first point.
The more content on the page, the better. Even better is user-generated content!
Perhaps, for user experience, display only 20 comments and wrap the rest under "lazy loading" (a suggestion from the developer sitting next to me).
In other words, let the bots see all 3000 comments on the same page, but incorporate the "lazy loading" feature for user experience so the page doesn't take days to load....
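One way to do that while keeping every comment in the served HTML is the CSS content-visibility property. This is just a minimal sketch of the idea, not a specific implementation; the class names and size estimate are examples:

```html
<style>
  /* The browser skips layout and paint for comments that are off-screen,
     but the text still sits in the page source for crawlers. */
  .comment {
    content-visibility: auto;
    contain-intrinsic-size: auto 120px; /* rough placeholder height */
  }
</style>

<ol class="comment-list">
  <li class="comment">First comment...</li>
  <li class="comment">Second comment...</li>
  <!-- ...all 3000 comments served in the same HTML document... -->
</ol>
```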
GREG