Best way to Handle Pagination?
-
At the moment my blog is paginated like so:
/blogs > /blogs/page/2 > /blogs/page/3 etc
What are the benefits of paginating with dynamic URLs, like here on SEOmoz with /blog?page=3?
-
No, I meant use /blogs/ as the first page and /blogs/pX for the subsequent pages, X being the page number. These pages are valid and are not 301s, of course.
BUT /blogs/p1 is the same page as /blogs/, so you should 301 it.
AND you must watch out for nonexistent pages in the pagination (/blogs/p10000, because you don't have 10,000 pages of paginated results; /blogs/p01 or /blogs/p02, because those zero-padded URLs should not exist).
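To make those rules concrete, here's a minimal sketch in Python (the function name, return shape, and page count are all illustrative, not from any particular platform — on a real blog this logic would live in your router or rewrite rules):

```python
import re

TOTAL_PAGES = 25  # hypothetical: however many pages of results actually exist

def resolve_blog_url(path, total_pages=TOTAL_PAGES):
    """Map a request path to (HTTP status, redirect target or None)."""
    if path == "/blogs/":
        return 200, None          # page 1 lives at the bare /blogs/ URL
    m = re.fullmatch(r"/blogs/p(\d+)", path)
    if m is None:
        return 404, None          # not a pagination URL at all
    digits = m.group(1)
    if digits != str(int(digits)):
        return 404, None          # zero-padded duplicates like p01, p002
    n = int(digits)
    if n == 1:
        return 301, "/blogs/"     # /blogs/p1 duplicates /blogs/, so redirect
    if n < 1 or n > total_pages:
        return 404, None          # p0, p10000, etc. should not resolve
    return 200, None
```

So /blogs/p2 returns a 200, /blogs/p1 returns a 301 to /blogs/, and /blogs/p01 or /blogs/p10000 return a 404.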
-
I wouldn't be comfortable 301'ing those pages as you suggest. I want my users to be able to navigate through earlier posts rather than just being redirected to the blog homepage. Perhaps you mean rel=canonical rather than a redirect?
-
This won't make much difference; I usually use these URLs, though:
/blogs/
/blogs/p2
Remember to 301 /blogs/p1 to /blogs/, and to 404 pages with a page number that's too big (/blogs/p10000) or strange URLs (/blogs/p01).
-
I was wondering the same thing and tried it multiple ways. I got similar results SEO-wise, so I don't think it ultimately matters. What matters is how it looks to your users, plus having a proper sitemap.
-
Thanks Dan. I would robots.txt those pages, but I still want the links on them to flow PageRank - so I really need to noindex,follow them, but WordPress is being difficult!
As for SEO differences, you're probably right. I think I just prefer the look of dynamic URLs to /dir/page.
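For reference, the directive being discussed is a meta robots tag in the head of each paginated page (how you emit it from WordPress depends on your theme or SEO plugin):

```html
<!-- On /blogs/page/2, /blogs/page/3, etc. -->
<meta name="robots" content="noindex,follow">
```

This tells search engines to keep the paginated page itself out of the index while still crawling and passing equity through the links on it.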
-
I don't think it makes a difference either way. One advantage of your current pagination is that if you ever want to block those pages, you can robots.txt-block the /page/ directory and that handles it. I'm not sure how SEOmoz would manage that with those dynamic URLs.
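For reference, blocking the /page/ directory described above is a two-line robots.txt rule (assuming the /blogs/page/N structure from the original question) - though note the earlier reply's point that blocked pages won't pass link equity:

```
User-agent: *
Disallow: /blogs/page/
```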