Product search URLs with parameters and pagination issues - how should I deal with them?
-
Hello Mozzers - I am looking at a site that generates parameterized URLs (sadly unavoidable in the case of this website, with the resources they have available - none for redevelopment). They deal with the URLs that include parameters via robots.txt - e.g. Disallow: /red-wines/?
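For reference, the robots.txt rule in question would look roughly like this (the /red-wines/ path is from the example above; the site-wide wildcard variant is an assumption about how a site might catch every parameter, not something confirmed here):

```
# Block crawling of any /red-wines/ URL that carries a query string
User-agent: *
Disallow: /red-wines/?

# A broader variant some sites use to catch parameters site-wide:
# Disallow: /*?
```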
Beyond that, they use rel=canonical on every paginated parameter page (such as https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2) in search results.
I have never used this method on paginated product-results pages. Surely this is an incorrect use of canonical, because these parameter pages are not simply duplicates of the main /red-wines/ page? Perhaps they are using it in case the robots.txt directive isn't followed - as sometimes happens - to guard against the indexing of some of the parameter pages?
I note that Rand Fishkin has warned against "a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL," because "you'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages)." Yet I see this time and again on ecommerce sites, on paginated results - any idea why?
Now the way I'd deal with this is:
Meta robots tags on the parameter pages I don't want indexed (noindex, nofollow - though this is not duplicate content, so perhaps it should be noindex, follow instead?)
Use rel="next" and rel="prev" links on paginated pages - that should be enough. I look forward to feedback, and thanks in advance. Luke
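A minimal sketch of the markup described above, for a hypothetical page 2 of the filtered results (the URL and parameters are taken from the example earlier in the question; whether noindex, follow is the right choice here is exactly the point under debate):

```html
<!-- In the <head> of /red-wines/?region=rhone&minprice=10&pIndex=2 -->

<!-- Declare the page's position in the paginated series -->
<link rel="prev" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=1">
<link rel="next" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=3">

<!-- Keep the page out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```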
-
Hi Luke,
Have you configured your parameters in Search Console? Looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running a risk of getting dinged.
-
Hi Logan,
I've seen your responses on several threads now on pagination and they are spot on so I wanted to ask you my question. We're an eCommerce site and we're using the rel=next and rel=prev tags to avoid duplicate content issues. We've gotten rid of a lot of duplicate issues in the past this way but we recently changed our site. We now have the option to view 60 or 180 items at a time on a landing page which is causing more duplicate content issues.
For example, page 2 of the 180-item view is similar to page 4 of the 60-item view (URL examples below). Each view version has its own rel=next and prev tags. Wondering what we can do to resolve this issue, besides just getting rid of the 180- and 60-item view options.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
Thoughts, ideas or suggestions are welcome. Thanks!
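One way to keep the two view sizes from colliding is to treat each page-size setting as its own paginated series, with next/prev links that preserve the n parameter. This is a sketch built from the example URLs above, not a confirmed fix for this site:

```html
<!-- In the <head> of /gifts/for-the-couple?view=all&n=60&p=4 -->
<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=3">
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=5">

<!-- In the <head> of /gifts/for-the-couple?view=all&n=180&p=2 -->
<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=1">
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=3">
```

Keeping each chain internally consistent avoids mixing the two series; the n parameter itself can then be described in Search Console's parameter settings.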
-
I've been having endless conversations about this over the last few days and in conclusion I agree with everything you say - thanks for your excellent advice. On this particular site next/prev was not set up correctly, so I'm working on that right now.
-
Yes I agree totally - some wise words of caution - thanks.
-
Thanks for the feedback - it is Umbraco.
-
To touch on your question about whether to follow or nofollow links: if the pages in question could help with crawling in any fashion at all - that is, if they are useless in their own right but still useful to other pages for crawling and internal PageRank distribution - then I would "follow" them. Only if they are utterly useless to other pages as well, and turn up excessively in a crawl of the site, would I "nofollow" them. Ideally, these URLs wouldn't be found at all, as they dilute internal PageRank.
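In markup terms, the distinction above comes down to the content value of the robots meta tag (a sketch; which variant applies depends on the site in question):

```html
<!-- Page is useless itself, but its links help crawling and PageRank flow -->
<meta name="robots" content="noindex, follow">

<!-- Page is useless and its links lead nowhere worthwhile -->
<meta name="robots" content="noindex, nofollow">
```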
-
Luke,
Here's what I'd recommend doing:
- Lose the canonical tags, that's not the appropriate way to handle pagination
- Remove the disallow in the robots.txt file
- Add rel next/prev tags if you can; since parameterized URLs are not separate pages, some CMSs are awkward about adding tags to only certain parameter versions of a URL
- Configure those parameters in Search Console (the last item under the Crawl menu) - you can specify each parameter on the site and its purpose. You might find that some of these have already been detected by Google; you can go in and edit those. You should configure your filtering parameters as well.
- You don't want to noindex these pages, for the same reason that you might not be able to add rel next/prev. You could risk that noindex tag applying to the root version of the URL instead of just the parameter version.
Google has gotten really good at identifying types of duplicate content due to things like paginated parameters, so they don't generally ding you for this kind of dupe.