Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be possible to view), we have locked both new posts and new replies.
Category URL Pagination where URLs don't change between pages
-
Hello,
I am working on an e-commerce site where there are categories with multiple pages. To avoid pagination issues, I was thinking of using rel=next/prev and canonical tags. I noticed a site where the URL doesn't change between pages, so whether you're on page 1, 2, or 3 of the same category, the URL stays the same. Would this be a cleaner way of dealing with pagination?
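For reference, here is a minimal sketch of the rel=prev/next plus canonical markup described above, as it might appear on page 2 of a paginated category. The example.com domain, the /category/widgets path, and the ?page= parameter are hypothetical placeholders, not taken from any site mentioned in this thread:

```html
<!-- Head of page 2 of a paginated category (hypothetical URLs) -->
<link rel="canonical" href="https://example.com/category/widgets?page=2">
<link rel="prev" href="https://example.com/category/widgets">
<link rel="next" href="https://example.com/category/widgets?page=3">
```

One common pattern is for each paginated URL to declare a canonical pointing at itself rather than at page 1, so the deeper pages remain indexable while rel=prev/next describes the sequence.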
-
Thank you, Oleg, for your response, and thanks, Matt, for jumping in.
The response was definitely very informative, but I'm still on the fence. Oleg confirmed what I had thought, that #2 would be the better choice, but I now want to ensure that all the products on pages past page 1 will be crawlable by search engines. Are there any scenarios you can think of in which search engines would not be able to read the subsequent pages? Assuming this is done using Ajax, should we be okay?
(Please bear with me, my specialty is link building and content, not the technical stuff!)
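One common safeguard for the Ajax concern above is to keep ordinary anchor links to the paginated URLs in the server-rendered HTML, so a crawler that doesn't execute the script can still reach page 2 and beyond. A minimal sketch, again with hypothetical URLs:

```html
<!-- Plain, crawlable pagination links kept in the markup; a script can
     enhance these into a "load more" or infinite-scroll experience. -->
<nav class="pagination">
  <a href="https://example.com/category/widgets">1</a>
  <a href="https://example.com/category/widgets?page=2">2</a>
  <a href="https://example.com/category/widgets?page=3">3</a>
</nav>
```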
-
To add some info... if you want infinite scrolling + paginated pages, check out Google's demo.
-
Hi Michelle!
Did Oleg settle this for you?
-
Use #2 if search engines can see the content that would be on the other pages. If they can't see it, then use rel=next/prev.
-
Thank you! Just to clarify, this is a site that is being built from scratch so my question really is, what would be the best way to go about this?
-
1. Use rel=next/prev and canonical tags if necessary.
2. Ensure that the URL doesn't change between pages 1, 2, and 3 within the category.
-
If the paginated pages still exist (and can be indexed) but you want to load more products dynamically, without changing the URL, using rel=next/prev would work fine.
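To make that concrete, below is a minimal sketch of a "load more" button that pulls each extra batch of products from a real paginated URL while the visitor stays on the same page. The /category/widgets path, the ?page= parameter, and the element IDs are hypothetical, and it assumes the paginated pages reuse the same product-grid container:

```html
<div id="product-grid">
  <!-- products for page 1, rendered server-side -->
</div>
<button id="load-more" data-next-page="2">Load more</button>

<script>
  document.getElementById('load-more').addEventListener('click', async (event) => {
    const button = event.currentTarget;
    const page = Number(button.dataset.nextPage);
    const url = '/category/widgets?page=' + page; // hypothetical paginated URL

    // Fetch the real paginated page and pull out just its product tiles.
    const response = await fetch(url);
    const doc = new DOMParser().parseFromString(await response.text(), 'text/html');
    const tiles = doc.querySelectorAll('#product-grid > *');

    document.getElementById('product-grid').append(...tiles);
    button.dataset.nextPage = String(page + 1);
  });
</script>
```

Because every batch comes from a normal paginated URL, those URLs (and their rel=prev/next tags) remain available to crawlers that never run the script.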
Related Questions
-
Crawl solutions for landing pages that don't contain a robots.txt file?
My site (www.nomader.com) is currently built on Instapage, which does not offer the ability to add a robots.txt file. I plan to migrate to a Shopify site in the coming months, but for now the Instapage site is my primary website. In the interim, would you suggest that I manually request a Google crawl through the search console tool? If so, how often? Any other suggestions for countering this Meta Noindex issue?
Technical SEO | | Nomader1 -
Getting rid of pagination - redirect all paginated pages or leave them to 404?
Hi all, We're currently in the process of updating our website and we've agreed that one of the things we want to do is get rid of all our pagination (currently used on the blog and product review areas) and instead implement load more on scroll. The question I have is... should we redirect all of the paginated pages and if so, where to? (My initial thoughts were either to the blog homepage or to the archive page) OR do we leave them to just 404? Bear in mind we have thousands of paginated pages 😕 Here's our blog area btw - https://www.ihasco.co.uk/blog Any help would be appreciated, thanks!
Technical SEO | | iHasco0 -
Finding websites that don't have meta descriptions
Hi everyone, as a way to find new business leads I thought about targeting websites that have poor meta descriptions or where they are simply missing. A quick look at SERPs shows this is still a major issue for many businesses. Is there any way I can quickly find pages for which meta description is lacking? Thank you! Best regards, Florian
Technical SEO | | agencepicnic0 -
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and I embedded it on our site. However, when I ran the fetch and render through Search Console, it said it was blocked by our robots.txt file. I read in the Search Console Help section that: "For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot." I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
Any assistance would be greatly appreciated. Thanks, Ruben
Technical SEO | | KempRugeLawGroup1 -
Should I disavow links from pages that don't exist any more
Hi. I'm doing a backlink audit on two sites, one with 48k and the other with 2M backlinks. Both are very old sites and both have tons of backlinks from old pages and websites that don't exist any more, but these backlinks still exist in the Majestic Historic index. I cleaned up the obvious useless links and passed the rest through Screaming Frog to check if those old pages/sites even exist. There are tons of link-sending pages that return 0, 301, 302, 307, 404, etc. errors. Should I consider all of these pages as being bad backlinks and add them to the disavow file? Just a clarification: I'm not talking about 301-ing a backlink to a new target page. I'm talking about the origin page generating an error at ping, e.g. originpage.com/page-gone sends me a link to mysite.com/product1. Screaming Frog pings originpage.com/page-gone and returns a status error. Do I add originpage.com/page-gone to the disavow file or not? Hope I'm making sense 🙂
Technical SEO | | IgorMateski0 -
When creating parent and child pages should key words be repeated in url and page title?
We are in the direct mail advertising business: PrintLabelAndMail.com. Example: Parent: Postcard Direct Mail. Children: Postcard Mailings, Postcard Design, Postcard Samples, Postcard Pricing, Postcard Advantages. Should "postcard" be repeated in the URL and Page Title? And in this example, should each of the 5 children link back directly to the parent, or would it be better to "daisy chain" them, using each as parent for the next?
Technical SEO | | JimDirectMailCoach0 -
Product Pages Outranking Category Pages
Hi, We are noticing an issue where some product pages are outranking our relevant category pages for certain keywords. For a made-up example, a "heavy duty widgets" product page might rank for the keyword phrase Heavy Duty Widgets, instead of our Heavy Duty Widgets category page appearing in the SERPs. We've noticed this happening primarily in cases where the name of the product page contains an at least partial match for the desired keyword phrase we want the category page to rank for. However, we've also found isolated cases where the specified keyword points to a completely irrelevant page instead of the relevant category page. Has anyone encountered a similar issue before, or have any ideas as to what may cause this to happen? Let me know if more clarification of the question is needed. Thanks!
Technical SEO | | ShawnHerrick0 -
Google counting numbers of products on category pages - what about pagination ?
Hi there, Whilst checking out the SERPs, as you do, I noticed that where our category page appears, Google now seems to be counting the number of products (what it calls items) on the page and displaying this in the first part of the description (see image attached). My problem is we employ pagination, so that our category page will have 15 items on it, then there are paginated results for the rest, with either ?page=2 or page-2/ etc. appended to the URL. Although this is only a minor issue, I was just wondering if there is a way to change the number of products reported for that page to be the entire number of products in that category. Is there a microformat markup or something that can override what Google has detected? Furthermore, is this system of pagination effective? I have considered using JavaScript pagination, such that all products would be loaded onto the one page but hidden until 'paginated', but I was worried about having hidden elements on the page, and also the impact on load times. Although I think this may solve the problem and display the true number of products in a section! Any help much appreciated, Stuart
Technical SEO | | stukerr0