What is the best structure for paginating comments on a page to preserve the maximum SEO juice?
You have a full webpage with a great amount of content, images & media.
This is a social blogging site where other members can leave their comments and reactions to the article.
Over time there are, say, 1,000 comments on this page. So we set the canonical URL and use rel="prev" and rel="next" to tell the bots that each subsequent block of 100 comments is attributed to the primary URL.
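A sketch of that first option's markup, as described, on a second page of comments (URLs are hypothetical):

```html
<!-- Hypothetical head markup for example.com/article?comments=2 -->
<link rel="canonical" href="https://example.com/article" />
<link rel="prev" href="https://example.com/article" />
<link rel="next" href="https://example.com/article?comments=3" />
```

Note that Google's published guidance is for paginated pages to carry a self-referencing canonical rather than all pointing at the first page, so the canonical line above is the debatable part of this option.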
Or... we allow the newest 10 comments to live on the primary URL, with a "see all comments" link that points to a separate URL, where the rest of the comments are paginated.
Which option does the community feel is most appropriate and adheres to best practices for managing this type of dynamic comment growth?
Thanks
Related Questions
SEO Best Practices for Customer Portals
We have a customer portal which is used to display customers' serial numbers, the knowledge base, support ticketing information, and a forum. This information is behind a wall, as a user must have support in order to view it. The question is: what are the best practices for SEO with the customer portal? Should we block these sections from bots? Or is there a way to take advantage of the number of pages that are within the portal?
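If the decision is to keep bots out, a minimal sketch, assuming the portal lives under a /portal/ path (the path is hypothetical):

```
# robots.txt: block crawling of the logged-in area
User-agent: *
Disallow: /portal/
```

Keep in mind robots.txt only prevents crawling, not indexing; anything that must never appear in results would also need a meta robots noindex (or X-Robots-Tag header) on any publicly reachable version of it.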
Intermediate & Advanced SEO | ASCI-Marketing
Taxonomy question - best approach for site structure
Hi all, I'm working on a dentist's website and want some advice on the best way to lay out the navigation. I would like to know which structure will help the site work naturally. I feel the second example would be better, as it would focus the 'power' around the type of treatment and get that to rank better.

.com/assessment/whitening
.com/assessment/straightening
.com/treatment/whitening
.com/treatment/straightening

or

.com/whitening/assessment
.com/straightening/assessment
.com/whitening/treatment
.com/straightening/treatment

Please advise, thanks.
Intermediate & Advanced SEO | Bee159
Should we show (to Google) different city pages on our website, which look like the home page, as one page or as different pages? If yes, then how?
On our website we show events from different cities, and we have made different URLs for each city, like www.townscript.com/mumbai and www.townscript.com/delhi. But the pages for all the cities look similar; only the events change between them. Even our home URL, www.townscript.com, shows the visitor the city he visited last time on our website (initially we show everyone Mumbai, and the visitor then needs to choose his city). For every page visit we save the last visited page for a particular IP address, and the next time he visits www.townscript.com we show him that city only. Now we feel the content of the home page and the city pages is similar. Should we present these pages to Google as one page, i.e. townscript.com? Can we do that with rel="canonical"? Please help me! I think all of these pages are competing with each other.
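For reference, consolidating with rel="canonical" would mean each city page carries a tag like this in its head (a sketch only; whether to consolidate is the question at hand):

```html
<!-- In the <head> of www.townscript.com/mumbai -->
<link rel="canonical" href="https://www.townscript.com/" />
```

Bear in mind that a canonical to the home page asks Google to drop the city pages from the index, so they would no longer rank for city-specific searches on their own.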
Intermediate & Advanced SEO | sanchitmalik
Best practices for robots.txt: allow one page but not the others?
So, we have a page like domain.com/searchhere, but its search results are being crawled (and shouldn't be); result URLs look like domain.com/searchhere?query1. If I block /searchhere?, will it also block crawlers from the single page /searchhere (because I still want that page to be indexed)? What is the recommended best practice for this?
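Because robots.txt rules are prefix matches, blocking the query-string variants does not block the bare page; a sketch:

```
User-agent: *
# Blocks /searchhere?query1, /searchhere?anything, etc.
Disallow: /searchhere?
# /searchhere itself is NOT blocked: it doesn't begin with "/searchhere?"
```

Also worth knowing: a URL blocked from crawling can still be indexed from external links, so a meta robots noindex on the result pages is the stronger signal if that matters here.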
Intermediate & Advanced SEO | nicole.healthline
Does a Google+ personal page pass link juice?
I noticed recently that a client's Google+ business page (set up as a personal page) has a followed link pointing to their site. They have many links on the web pointing to the Google+ page; however, that page is an HTTPS page. So the question is: would a Google+ page that is HTTPS still pass authority and link juice to the site linked in the About tab?
Intermediate & Advanced SEO | iAnalyst.com
How do I create a strategy to get rid of dupe content pages but still keep the SEO juice?
We have about 30,000 pages that are variations of "<product-type> prices / <type-of-thing> / <city>-<state>". These pages are bringing us lots of free conversions, because when somebody searches for this exact phrase for their city/state they are pretty low-funnel. The problem we are running into is that the pages are showing up as dupe content.

One solution we discussed is to 301-redirect or canonical all the city-state pages back to just the "<type-of-thing>" level, and then create really solid unique content for the few hundred pages we would have at that point.

My concern is this: I still want to rank for the city-state, because as I look through our best-converting search terms they nearly always have the city-state in them, so the search is some variation of "<product-type> <type-of-thing> <city> <state>". One thing we thought about doing is dynamically changing the meta data & headers to add the city-state info there. Are there other potential solutions to this?
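As a sketch of the consolidation option, each city-state page would point its canonical at the type-level page (the URLs below are placeholders, not the real paths):

```html
<!-- On /product-type-prices/type-of-thing/austin-tx (placeholder path) -->
<link rel="canonical" href="https://example.com/product-type-prices/type-of-thing/" />
```

The trade-off is exactly the concern raised above: canonicalized pages generally stop ranking on their own, including for city-state queries.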
Intermediate & Advanced SEO | editabletext
Suggest some of the best SEO extensions for PrestaShop
Hello all, if any of you are using PrestaShop for your ecommerce portal or for your clients, please suggest some good PrestaShop add-ons for SEO. Thanks.
Intermediate & Advanced SEO | idreams
SEO-Friendly Method to Load XML Content onto Page
I have a client who has about 100 portfolio entries, each with its own HTML page. Those pages aren't getting indexed because of the way the main portfolio menu page works: it uses JavaScript to load the list of portfolio entries from an XML file, along with metadata about each entry. Because it uses JavaScript, crawlers aren't seeing anything on the portfolio menu page. Here's a sample of the JavaScript used (this is one of many more lines of code):

// load project xml
try {
  var req = new Request({
    method: 'get',
    url: '/data/projects.xml',

Normally I'd have them just manually add entries to the portfolio menu page, but part of the metadata that's getting loaded is project characteristics that are used to filter which portfolio entries are shown on the page, such as client type (government, education, industrial, residential, etc.) and project type (depending on the type of service that was provided). It's similar to the filtering you'd see on an e-commerce site. This has to stay, so the page needs to remain dynamic.

I'm trying to summarize the alternate methods they could use to load that content onto the page instead of JavaScript (I assume server-side solutions are the only ones I'd want, unless there's another option I'm unaware of). I'm aware that PHP could probably load all of their portfolio entries from the XML file on the server side. I'd like to get some recommendations on other possible solutions. Please feel free to ask any clarifying questions. Thanks!
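As a concrete sketch of the server-side route, here is one way to render the XML as crawlable HTML while exposing the filter metadata as data-* attributes that client-side JavaScript can still filter on. It's written in Python for illustration (the question mentions PHP, and the same idea applies there); the XML element and attribute names are assumptions, not the client's real schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical projects.xml structure; adjust to match the real file.
SAMPLE_XML = """
<projects>
  <project client-type="government" project-type="design">
    <title>City Hall Renovation</title>
    <url>/portfolio/city-hall.html</url>
  </project>
  <project client-type="education" project-type="build">
    <title>Campus Library</title>
    <url>/portfolio/campus-library.html</url>
  </project>
</projects>
"""

def render_portfolio(xml_text):
    """Render every portfolio entry as plain, crawlable HTML links.

    The filter characteristics are emitted as data-* attributes so
    client-side code can show/hide entries without hiding the links
    from bots: the full list is always in the served HTML.
    """
    root = ET.fromstring(xml_text)
    items = []
    for project in root.findall("project"):
        title = project.findtext("title")
        url = project.findtext("url")
        client = project.get("client-type", "")
        ptype = project.get("project-type", "")
        items.append(
            '<li data-client-type="%s" data-project-type="%s">'
            '<a href="%s">%s</a></li>' % (client, ptype, url, title)
        )
    return "<ul>%s</ul>" % "".join(items)

html = render_portfolio(SAMPLE_XML)
```

The key point is that filtering becomes a purely cosmetic client-side operation: crawlers see every entry's link in the initial HTML, while the JavaScript filters operate on the data-* attributes instead of fetching the XML themselves.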
Intermediate & Advanced SEO | KaneJamison