Best Way to Break Down Paginated Content?
-
(Sorry for my English)
I have lots of user reviews on my website and in some cases, there are more than a thousand reviews for a single product/service. I am looking for the best way to break down these reviews in several sub-pages.
Here are the options I thought of:
1. Break down reviews into multiple pages / URL
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc. In this case, each page would be indexed by search engines.
- Pros: all the reviews are getting indexed
- Cons: It will be harder to rank for "blue widget review" as there will be many similar pages competing with each other
2. Break down reviews into multiple pages / URL with noindex + canonical tag
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc. In this case, each page would be set to noindex and the canonical tag would point to the first review page.
- Pros: only one URL can potentially rank for "blue widget review"
- Cons: Subpages are not indexed
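For reference, a sketch of the head tags a subpage would carry under option 2 (URLs taken from the example above):

```html
<!-- On http://www.mysite.com/blue-widget-review-page2 -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="http://www.mysite.com/blue-widget-review-page1">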
3. Load all the reviews into one page and handle pagination using JavaScript
reviews, reviews, reviews
more reviews, more reviews, more reviews
etc. Each page of reviews would be loaded in a different <div>, which would be shown or hidden using JavaScript when browsing through the pages. Could that be considered cloaking?
- Pros: all the reviews are getting indexed
- Cons: large page size (kb) - maybe too large for search engines?
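To illustrate option 3, a minimal sketch of the show/hide logic (the PAGE_SIZE value and the .review selector are assumptions for illustration, not the asker's code):

```javascript
// Option 3 sketch: all reviews are already in the HTML, and JavaScript
// decides which ones belong on the currently selected "page".
const PAGE_SIZE = 10;

function visibleRange(page, pageSize, total) {
  // Returns the [start, end) indices of reviews to show for a 1-based page number.
  const start = (page - 1) * pageSize;
  return [start, Math.min(start + pageSize, total)];
}

// In the browser you would then toggle each review container, e.g.:
// const [start, end] = visibleRange(currentPage, PAGE_SIZE, reviews.length);
// document.querySelectorAll('.review').forEach((el, i) => {
//   el.hidden = i < start || i >= end;
// });
```

Since the hidden reviews are still present in the HTML source, this isn't cloaking in the deceptive sense, though search engines may give less weight to content that is hidden by default.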
4. Load only the first page and load sub-pages dynamically using AJAX
Display only the first review page on initial load. I would use AJAX to load additional reviews into the same <div>. It would be similar to some blog commenting systems where you have to click on "Load more comments" to see all the comments.
- Pros: Fast initial loading time + faster loading time for subpages = better user experience
- Cons: Only the first review page is indexed by search engines
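A minimal "load more" sketch for option 4 (the -pageN URL pattern comes from the question; the idea that the endpoint returns an HTML fragment of reviews is an assumption for illustration):

```javascript
// Option 4 sketch: fetch the next page of reviews and append it
// below the ones already shown.
function nextPageUrl(base, currentPage) {
  // e.g. nextPageUrl('/blue-widget-review', 1) -> '/blue-widget-review-page2'
  return `${base}-page${currentPage + 1}`;
}

async function loadMoreReviews(base, currentPage, container) {
  const res = await fetch(nextPageUrl(base, currentPage));
  const html = await res.text();
  container.insertAdjacentHTML('beforeend', html); // append below existing reviews
  return currentPage + 1; // new current page, for the next click
}
```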
=========================================================
My main competitor who's achieving great rankings (no black hat of course) is using technique #3.
What's your opinion?
-
I think you are forgetting that search engines are capable of running JavaScript just fine, and all the content that is brought in via AJAX and is viewable to the user will also be indexed.
I would certainly go with option 4; it's standard today. But also have a look at pushState and address manipulation: that way your users will be able to access an exact review page (say page 3) by just typing the address http://www.mysite.com/blue-widget-review-page3 and have page 3 loaded by default.
If you go this route, you can also put a hidden (CSS'ed) NEXT PAGE link at the end pointing to the next page. Hope that helps!
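A sketch of that pushState idea (the -pageN URL pattern and the loadReviewsForPage() helper are assumptions for illustration):

```javascript
// Keep the address bar in sync while paginating with AJAX, so that
// deep links like /blue-widget-review-page3 work.
function pageFromPath(path) {
  // '/blue-widget-review-page3' -> 3, '/blue-widget-review' -> 1
  const m = path.match(/-page(\d+)$/);
  return m ? parseInt(m[1], 10) : 1;
}

function goToPage(n) {
  loadReviewsForPage(n); // hypothetical AJAX loader
  history.pushState({ page: n }, '', `/blue-widget-review-page${n}`);
}

// Browser-only wiring (guarded so the sketch also runs outside a browser):
if (typeof window !== 'undefined') {
  // Handle Back/Forward so the right page is re-rendered
  window.addEventListener('popstate', (e) => {
    if (e.state && e.state.page) loadReviewsForPage(e.state.page);
  });
  // On initial load, honor a deep link like /blue-widget-review-page3
  loadReviewsForPage(pageFromPath(window.location.pathname));
}
```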
-
Check out these pages by Google that talk about Pagination: https://support.google.com/webmasters/answer/1663744?hl=en and http://googlewebmastercentral.blogspot.com/2012/03/video-about-pagination-with-relnext-and.html
In your case, the best way would be to use rel="next" and rel="prev" tags.
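For reference, on a middle page the tags would look something like this (page 1 would carry only rel="next", and the last page only rel="prev"):

```html
<!-- On http://www.mysite.com/blue-widget-review-page2 -->
<link rel="prev" href="http://www.mysite.com/blue-widget-review-page1">
<link rel="next" href="http://www.mysite.com/blue-widget-review-page3">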
Are you using WordPress? If so, the Yoast plugin will take care of this for you.
Howard