Best Way to Break Down Paginated Content?
-
(Sorry for my english)
I have lots of user reviews on my website, and in some cases there are more than a thousand reviews for a single product/service. I am looking for the best way to break these reviews down into several sub-pages.
Here are the options I thought of:
1. Break down reviews into multiple pages / URL
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc. In this case, each page would be indexed by search engines.
- Pros: all the reviews are getting indexed
- Cons: It will be harder to rank for "blue widget review" as there will be many similar pages
2. Break down reviews into multiple pages / URL with noindex + canonical tag
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc. In this case, each page would be set to noindex, and the canonical tag would point to the first review page.
- Pros: only one URL can potentially rank for "blue widget review"
- Cons: Subpages are not indexed
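For illustration, here is a minimal sketch of what the head of a sub-page could contain under option 2 (URLs are the example ones above; this assumes the first page itself stays indexable, otherwise nothing could rank):

```html
<!-- head of http://www.mysite.com/blue-widget-review-page2 -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="http://www.mysite.com/blue-widget-review-page1">
```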
3. Load all the reviews into one page and handle pagination using Javascript
reviews, reviews, reviews
more reviews, more reviews, more reviews
etc. Each page would be loaded in a different <div>, which would be shown or hidden using Javascript when browsing through the pages. Could that be considered cloaking?!?
- Pros: all the reviews are getting indexed
- Cons: large page size (kb) - maybe too large for search engines?
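As a rough sketch of option 3 (the class name and page size are made up for illustration), the show/hide logic could look like this:

```javascript
// Option 3 sketch: all reviews are already in the HTML, grouped into
// page-sized <div> containers; JavaScript only toggles their visibility.
// The ".review-page" class is hypothetical.

// Pure helper: given a requested page number, the page size, and the total
// review count, return the index of the container that should be visible.
function containerIndexForPage(page, pageSize, totalReviews) {
  const pageCount = Math.max(1, Math.ceil(totalReviews / pageSize));
  const clamped = Math.min(Math.max(page, 1), pageCount); // keep in range
  return clamped - 1; // containers are 0-indexed
}

// Browser-only wiring: show one container, hide the rest.
function showPage(page, pageSize, totalReviews) {
  const containers = document.querySelectorAll('.review-page');
  const visible = containerIndexForPage(page, pageSize, totalReviews);
  containers.forEach((el, i) => {
    el.style.display = i === visible ? '' : 'none';
  });
}
```

Because every review is in the initial HTML, search engines see the full content whether or not they run the script.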
4. Load only the first page and load sub-pages dynamically using AJAX
Display only the first review page on initial load. I would use AJAX to load additional reviews into the <div>. It would be similar to some blog commenting systems where you have to click on "Load more comments" to see all the comments.
- Pros: Fast initial loading time + faster loading time for subpages = better user experience
- Cons: Only the first review page is indexed by search engines
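A minimal sketch of option 4 (the /reviews endpoint, its JSON shape, and the element IDs are all hypothetical):

```javascript
// Option 4 sketch: only page 1 is in the initial HTML; clicking
// "Load more" fetches the next page of reviews and appends it.

// Pure helper: build the URL for the next page of reviews.
function nextPageUrl(baseUrl, currentPage) {
  return `${baseUrl}?page=${currentPage + 1}`;
}

let currentPage = 1;

// Browser-only wiring: fetch the next page and append each review.
async function loadMoreReviews() {
  const res = await fetch(nextPageUrl('/reviews/blue-widget', currentPage));
  const data = await res.json(); // e.g. { reviews: ["...", "..."] }
  const list = document.querySelector('#review-list');
  for (const review of data.reviews) {
    const div = document.createElement('div');
    div.textContent = review;
    list.appendChild(div);
  }
  currentPage += 1;
}
```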
=========================================================
My main competitor, who's achieving great rankings (no black hat, of course), is using technique #3.
What's your opinion?
-
I think you are forgetting that search engines are capable of running Javascript just fine, and all the content that is brought in via AJAX and viewable to the user will also be indexed by search engines.
I would certainly go with option 4; it's pretty standard today. Also have a look at pushState for address manipulation: that way, your users will be able to access an exact review page (say page 3) by just typing the address http://www.mysite.com/blue-widget-review-page3 and have page 3 loaded by default.
If you go this route, you can also put a hidden (CSS'ed) NEXT PAGE link at the end pointing to the next page. Hope that helps!
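To make the pushState suggestion concrete, here is a rough sketch assuming the ...-pageN URL pattern from the question (function names are made up):

```javascript
// Sketch of the pushState approach: when the user pages through reviews,
// update the address bar so every page has its own shareable URL.

// Pure helper: build the URL for a given review page.
function reviewPageUrl(slug, page) {
  return `http://www.mysite.com/${slug}-page${page}`;
}

// Pure helper: read the page number back out of a URL (default to 1).
function pageFromUrl(url) {
  const match = url.match(/-page(\d+)$/);
  return match ? parseInt(match[1], 10) : 1;
}

// Browser-only wiring: called after AJAX-loading a page of reviews.
function onPageLoaded(slug, page) {
  history.pushState({ page }, '', reviewPageUrl(slug, page));
}

// On initial load you would honor a deep link like ...-page3, e.g.:
// const page = pageFromUrl(location.href); loadReviews(page);
```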
-
Check out these pages from Google that talk about pagination: https://support.google.com/webmasters/answer/1663744?hl=en and http://googlewebmastercentral.blogspot.com/2012/03/video-about-pagination-with-relnext-and.html
In your case, the best way would be to use rel="next" and rel="prev" tags.
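As a sketch, the head of a middle page would look something like this (URLs are the question's examples; page 1 would have only rel="next", and the last page only rel="prev"):

```html
<!-- head of http://www.mysite.com/blue-widget-review-page2 -->
<link rel="prev" href="http://www.mysite.com/blue-widget-review-page1">
<link rel="next" href="http://www.mysite.com/blue-widget-review-page3">
```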
Are you using WordPress? If so, the Yoast plugin will take care of this for you.
Howard