Product search URLs with parameters and pagination issues - how should I deal with them?
-
Hello Mozzers - I am looking at a site that generates parameterised URLs (sadly unavoidable in the case of this website, with the resources they have available - none for redevelopment). They deal with the URLs that include parameters via robots.txt - e.g. Disallow: /red-wines/?
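For clarity, that rule would sit in a robots.txt file roughly like the sketch below (illustrative only, not the site's actual file; Disallow rules are prefix matches, so this one catches every parameterised variant of the listing page):

```
# Illustrative sketch only - not the site's actual robots.txt
User-agent: *
# Prefix match: blocks crawling of any URL starting with /red-wines/?
# i.e. every parameterised variant of the red wines listing
Disallow: /red-wines/?
```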
Beyond that, they use rel=canonical on every paginated parameter page in search results (such as https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2), pointing back to the main /red-wines/ page.
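In other words, each paginated parameter page carries a tag along these lines in its <head> (illustrative markup based on the example URL above):

```html
<!-- Illustrative only: a canonical on the parameter page pointing back to the top category page -->
<!-- On https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2 -->
<link rel="canonical" href="https://wine****.com/red-wines/" />
```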
I have never used this method on paginated "product results" pages - surely this is an incorrect use of canonical, because these parameter pages are not simply duplicates of the main /red-wines/ page? Perhaps they are using it in case the robots.txt directive isn't followed (as sometimes it isn't), to guard against some of the parameter pages being indexed?
I note that Rand Fishkin has cautioned against "a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL", because "you'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages)." Yet I see this time and again on ecommerce sites, on paginated results - any idea why?
Now the way I'd deal with this is:
Meta robots tags on the parameter pages I don't want indexed (noindex, nofollow - this is not duplicate content, so I would nofollow, but perhaps I should follow?)
Use rel="next" and rel="prev" links on paginated pages - that should be enough.
Look forward to feedback and thanks in advance,
Luke
-
Hi Zack,
Have you configured your parameters in Search Console? Looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running a risk of getting dinged.
-
Hi Logan,
I've seen your responses on several pagination threads now and they are spot on, so I wanted to ask you my question. We're an eCommerce site and we're using rel=next and rel=prev tags to avoid duplicate content issues. We've gotten rid of a lot of duplicate issues this way in the past, but we recently changed our site. We now have the option to view 60 or 180 items at a time on a landing page, which is causing more duplicate content issues.
For example, page 2 of the 180-item view is similar to page 4 of the 60-item view (URL examples below, followed by an illustrative sketch of the tags). Each view version has its own rel=next and rel=prev tags. Wondering what we can do to get rid of this issue besides just removing the 180- and 60-item view options.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
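For reference, each view size currently carries its own prev/next chain that keeps its n parameter, roughly like this (hypothetical markup based on the URLs above):

```html
<!-- Hypothetical markup based on the example URLs above -->
<!-- On https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2 -->
<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=1" />
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=3" />

<!-- On https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4 -->
<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=3" />
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=5" />
```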
Thoughts, ideas or suggestions are welcome. Thanks!
-
I've been having endless conversations about this over the last few days and in conclusion I agree with everything you say - thanks for your excellent advice. On this particular site next/prev was not set up correctly, so I'm working on that right now.
-
Yes I agree totally - some wise words of caution - thanks.
-
Thanks for the feedback - it is Umbraco.
-
To touch on your question about whether you should follow or nofollow links: if the pages in question could help with crawling in any fashion at all - even if they are useless in their own right, but serve other pages in terms of crawling and internal PageRank distribution - then I would "follow" them. Only if they are utterly useless for other pages too, and turn up excessively throughout a crawl of the site, would I "nofollow" them. Ideally, these URLs wouldn't be found at all, as they dilute internal PageRank.
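As a concrete sketch of the two options being weighed (illustrative tag values, not taken from the site in question):

```html
<!-- Illustrative only: keep the page out of the index but let its links be followed -->
<meta name="robots" content="noindex, follow">

<!-- Versus: keep it out of the index and don't follow its links -->
<meta name="robots" content="noindex, nofollow">
```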
-
Luke,
Here's what I'd recommend doing:
- Lose the canonical tags; that's not the appropriate way to handle pagination
- Remove the disallow in the robots.txt file
- Add rel=next/prev tags if you can; since parameterised URLs are not separate pages, some CMSs make it awkward to add tags to only certain parameter versions of a page
- Configure those parameters in Search Console (the last item under the Crawl menu) - you can specify each parameter on the site and its purpose. You might find that some of these have already been identified by Google; you can go in and edit those. You should configure your filtering parameters as well.
- You don't want to noindex these pages, for the same reason that you might not be able to add rel=next/prev: you risk the noindex tag applying to the root version of the URL instead of just the parameter version.
Google has gotten really good at identifying types of duplicate content due to things like paginated parameters, so they don't generally ding you for this kind of dupe.
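For illustration, a minimal sketch of what the next/prev tags would look like on one of the paginated parameter URLs from the original question (assumed URLs - the real series will differ):

```html
<!-- Illustrative only, using the example URL from the question -->
<!-- On https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2 -->
<link rel="prev" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=1" />
<link rel="next" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=3" />
```

The first page of the series would carry only the rel=next tag, and the last page only the rel=prev tag.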