Product search URLs with parameters and pagination issues - how should I deal with them?
-
Hello Mozzers - I am looking at a site that generates URLs with parameters (sadly unavoidable in the case of this website, with the resources they have available - none for redevelopment). They deal with the URLs that include parameters via robots.txt, e.g. Disallow: /red-wines/?
Beyond that, they use rel=canonical on every paginated parameter page in search results (such as https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2), pointing back to the main /red-wines/ page.
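To make the current setup concrete, my understanding is that it boils down to something like this (URLs illustrative). In robots.txt:

User-agent: *
Disallow: /red-wines/?

and on a filtered/paginated page such as /red-wines/?region=rhone&minprice=10&pIndex=2:

<link rel="canonical" href="https://wine****.com/red-wines/" />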
I have never used this method on paginated product-results pages. Surely this is an incorrect use of canonical, because these parameter pages are not simply duplicates of the main /red-wines/ page? Perhaps they are using it in case the robots.txt directive isn't followed (as sometimes it isn't), to guard against some of the parameter pages being indexed?
I note that Rand Fishkin has warned against "a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL", because "you'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages)". Yet I see this time and again on ecommerce sites, on paginated results - any idea why?
Now the way I'd deal with this is:
Meta robots tags on the parameter pages I don't want indexed (noindex, nofollow - this is not duplicate content, so I would nofollow, but perhaps I should follow instead?)
Use rel="next" and rel="prev" links on paginated pages - that should be enough.Look forward to feedback and thanks in advance, Luke
-
Hi Zack,
Have you configured your parameters in Search Console? Looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running a risk of getting dinged.
-
Hi Logan,
I've seen your responses on several pagination threads now and they are spot on, so I wanted to ask you my question. We're an eCommerce site and we're using the rel=next and rel=prev tags to avoid duplicate content issues. We've gotten rid of a lot of duplicate issues this way in the past, but we recently changed our site. We now have the option to view 60 or 180 items at a time on a landing page, which is causing more duplicate content issues.
For example, page 2 of the 180-item view is similar to page 4 of the 60-item view (URL examples below, with a rough sketch of their tags). Each view version has its own rel=next and prev tags. Wondering what we can do to get rid of this issue, besides just removing the 180- and 60-item view options.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
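To illustrate, each view currently declares its own pagination chain, roughly like this (hypothetical tags - the real ones follow the same pattern). On ?view=all&n=180&p=2:

<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=1" />
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=3" />

and on ?view=all&n=60&p=4:

<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=3" />
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=5" />

So the two chains never reference each other, even though they cover overlapping items.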
Thoughts, ideas or suggestions are welcome. Thanks!
-
I've been having endless conversations about this over the last few days and in conclusion I agree with everything you say - thanks for your excellent advice. On this particular site next/prev was not set up correctly, so I'm working on that right now.
-
Yes I agree totally - some wise words of caution - thanks.
-
Thanks for the feedback - it is Umbraco.
-
To touch on your question about whether you should follow or nofollow links: if the pages in question could help with crawling in any fashion at all - even if they are useless in their own right, they can still be purposeful for other pages in terms of crawling and internal PageRank distribution - then I would keep them "follow". Only if they are utterly useless for other pages too, and turn up excessively throughout a crawl of the site, would I "nofollow" them. Ideally, these URLs wouldn't be found at all, as they dilute internal PageRank.
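In practice, a parameter page you want kept out of the index but still crawlable for the links it contains would carry something like this (illustrative):

<meta name="robots" content="noindex, follow">

while content="noindex, nofollow" would be reserved for pages whose links genuinely lead nowhere useful.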
-
Luke,
Here's what I'd recommend doing:
- Lose the canonical tags; that's not the appropriate way to handle pagination
- Remove the disallow in the robots.txt file
- Add rel next/prev tags if you can (a sketch is below this list); since parameterized URLs are not separate pages, some CMSs are awkward about adding tags to only certain parameter versions of a page
- Configure those parameters in Search Console (the last item under the Crawl menu) - you can specify each parameter on the site and its purpose. You might find that some of them have already been detected by Google; you can go in and edit those. You should configure your filtering parameters as well.
- Don't noindex these pages, for the same reason you might not be able to add rel next/prev: you risk the noindex tag applying to the root version of the URL instead of just the parameter version.
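As a rough sketch of the next/prev markup, using your example URLs (and assuming pIndex simply increments), page 2 would carry:

<link rel="prev" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=1" />
<link rel="next" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=3" />

with the first page carrying only a rel="next" and the last page only a rel="prev".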
Google has gotten really good at identifying types of duplicate content due to things like paginated parameters, so they don't generally ding you for this kind of dupe.
-
Related Questions
-
Can I remove certain parameters from the canonical URL?
For example, https://www.jamestowndistributors.com/product/epoxy-and-adhesives?page=2&resultsPerPage=16 is the paginated URL of the category https://www.jamestowndistributors.com/product/epoxy-and-adhesives/. Can I remove the &resultsPerPage= variation from the canonical without it causing an issue, even though the actual page URL has that parameter? I was thinking of using a canonical that keeps the page parameter but drops resultsPerPage, rather than one that includes both (roughly sketched below). What is the best practice?
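Roughly, the choice I'm weighing is between something like this (illustrative tags, not my exact markup):

<link rel="canonical" href="https://www.jamestowndistributors.com/product/epoxy-and-adhesives?page=2" />

and

<link rel="canonical" href="https://www.jamestowndistributors.com/product/epoxy-and-adhesives?page=2&resultsPerPage=16" />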
-
SEO effect of URL with subfolder versus parameters?
I'll make this quick and simple. Let's say you have a business located in several cities. You've built individual pages for each city (linked to from a master list of your locations). For SEO purposes is it better to have the URL be a subfolder, or a parameter off of the home page URL: https://www.mysite.com/dallas which is essentially https://www.mysite.com/dallas/index.php or http://www.mysite.com/?city=dallas which is essentially https://www.mysite.com/index.php?city=dallas
-
Working to start a shopping-idea site based entirely on scraping products from ecommerce sites - how quickly should I add products and categories to this new domain?
We are going to start promoting it via Google AdWords and Facebook. I'm worried about the tens of thousands of product pages - kindly guide me.
-
Do local search efforts (citations, NAP, reviews) have an impact on traditional organic search listings (the ones without the A, B, C map icons)?
Do citations, NAP, reviews, and other local search efforts impact traditional SEO listings? Can someone elaborate?
-
Should we use URL parameters or plain URLs?
Hi, The development team and I are having a heated discussion about one of the more important things in life, i.e. URL structures on our site. Let's say we are creating an AirBNB clone, and we want to be found when people search for "apartments new york". As we have both houses and apartments in all cities in the U.S., it would make sense for our URL to at least include these, so clone.com/Apartments/New-York - but users are also able to filter on price and size. This isn't really relevant for Google, and we all agree that clone.com/Apartments/New-York should be canonical for all apartment/New York searches. But how should the URL look for people filtering on a max price of $300 and 100 sqft? clone.com/Apartments/New-York?price=30&size=100 or (we are using Node.js, so no problem) clone.com/Apartments/New-York/Price/30/Size/100 The developers hate URL parameters with a vengeance, think the latter version is preferable and most readable, and say that as long as we use a canonical pointing everything to clone.com/Apartments/New-York it won't matter to good old Google. I think URL parameters are the way to go for two reasons. One is that Google might figure out by itself that the price parameter doesn't matter (https://support.google.com/webmasters/answer/1235687?hl=en), and the other is that it is possible in Webmaster Tools to tell Google not to worry about a parameter. We have agreed to disagree on this point, and let the wisdom of Moz decide what we ought to do. What do you all think?
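For concreteness, the canonical we all agree on would look something like this on every filtered page (illustrative markup, protocol assumed):

<link rel="canonical" href="https://clone.com/Apartments/New-York" />

regardless of whether the page itself lives at the parameter URL or the path-style URL.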
-
Spaces in URL line
Hi Gurus, I recently made the mistake of putting a space into a URL between the two words that make up my primary keyword. Think www.example.com/Jelly Donuts/mmmNice.php instead of www.example.com/JellyDonuts/mmmNice.php. The mistake now needs fixing (to www.example.com/JellyDonuts/mmmNice.php) to pass W3 validation, but it has been in place for a while, and most articles/documents under 'Jelly Donuts' are not ranking well (which is probably the obvious outcome of the mistake). I am wondering whether the best solution from an SEO ranking viewpoint is to: 1. Change the article directory immediately to www.example.com/JellyDonuts/mmmNice.php, rel=canonical each article to the new correct URL, and take out the 'trash' using robots.txt; 2. 301 www.example.com/Jelly Donuts/ to the www.example.com/JellyDonuts/ directory (sketched below); or perhaps something else? Thanks in advance for your help with this sticky (but tasty) conundrum, Brad
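P.S. If the site is on Apache (an assumption on my part - adjust for whatever server is actually in use), I gather option 2 could be a single rule along these lines:

RedirectMatch 301 "^/Jelly Donuts/(.*)$" "/JellyDonuts/$1"

the idea being that anything under the old space-in-path directory gets permanently redirected to its counterpart in the clean directory.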
-
Duplicate Content Issue
Why are URLs ending in .html or index.php a problem for search engines? I've heard they can create duplicate content, but I have no idea why. Could someone explain why that is? Thank you.
-
How to deal with 1 product in 1 country and 3 languages?
After reading multiple posts on dealing with multilanguage sites (also checked http://www.google.com/support/forum/p/Webmasters/thread?tid=12a5507889c20461&hl=en), I still haven't got an answer to a very specific question I have. Please allow me to give some background:
I'm working for the official Belgian Yellow Pages (part of Truvo), and as you might know, in Belgium we have to deal with 3 official languages (BE-nl, BE-fr, BE-de - the latter is out of scope for this question), and on top of that we also have a large international audience (BE-en). Furthermore, Belgium is very small, meaning that someone living in the French part of Belgium (e.g. Liège) might easily look for information in the Dutch part of Belgium (e.g. Antwerpen) without having to switch websites/languages. Since 1968 (http://info.truvo.be/en/our-company/) we have established 3 different brands, each adapted to a language and each with a clear language-specific connotation:
for the BE-nl market: we have the brand "gouden gids"
for the BE-fr market: we have the brand "pages dor"
for the BE-en market: we have the brand "golden pages"
Logically, this results in 3 websites - www.goudengids.be, www.pagesdor.be, www.goldenpages.be - each serving a specific language and containing language-specific messages and functionality, but, of course, also serving content that is similar across all three websites regardless of the language.
So we have, for example, the following links:
http://www.goudengids.be/united-consultants-nv-antwerpen-2000/
http://www.pagesdor.be/united-consultants-nv-antwerpen-2000/
http://www.goldenpages.be/united-consultants-nv-antwerpen-2000/
When I want to stick with the separate brands for the same content, how do I make sure that Google shows the desired URL when searching on google.be in Dutch, French, and English respectively?
Kind regards