Product search URLs with parameters and pagination issues - how should I deal with them?
-
Hello Mozzers - I am looking at a site whose URLs generate parameters (sadly unavoidable in the case of this website, with the resources they have available - none for redevelopment). They deal with the URLs that include parameters via robots.txt - e.g. Disallow: /red-wines/?
Beyond that, they use rel=canonical on every paginated parameter page (such as https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2) in search results.
I have never used this method on paginated "product results" pages. Surely this is an incorrect use of canonical, because these parameter pages are not simply duplicates of the main /red-wines/ page? Perhaps they are using it in case the robots.txt directive isn't followed (as sometimes it isn't), to guard against the indexing of some of the parameter pages?
I note that Rand Fishkin has commented against "a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL, because you'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages)." Yet I see this time and again on ecommerce sites, on paginated results - any idea why?
Now the way I'd deal with this is:
- Meta robots tags on the parameter pages I don't want indexed (noindex, nofollow - though this is not duplicate content, so I would nofollow, but perhaps I should follow?)
- Use rel="next" and rel="prev" links on paginated pages - that should be enough.
Look forward to feedback, and thanks in advance, Luke
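For reference, a minimal sketch of what those rel="next"/"prev" tags might look like in the head of page 2 of the paginated results (reusing the example URL above; the exact parameter order and neighbouring pIndex values are assumptions about this particular site):

```html
<!-- Hypothetical <head> of /red-wines/?region=rhone&minprice=10&pIndex=2 -->
<link rel="prev" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=1">
<link rel="next" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=3">
```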
-
Hi Luke,
Have you configured your parameters in Search Console? Looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running a risk of getting dinged.
-
Hi Logan,
I've seen your responses on several threads now on pagination and they are spot on so I wanted to ask you my question. We're an eCommerce site and we're using the rel=next and rel=prev tags to avoid duplicate content issues. We've gotten rid of a lot of duplicate issues in the past this way but we recently changed our site. We now have the option to view 60 or 180 items at a time on a landing page which is causing more duplicate content issues.
For example, page 2 of the 180-item view is similar to page 4 of the 60-item view (URL examples below). Each view version has its own rel=next and prev tags. Wondering what we can do to get rid of this issue besides just removing the 180- and 60-item view option.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
Thoughts, ideas or suggestions are welcome. Thanks!
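To make the overlap concrete (assuming 1-indexed pages and items filled in order, which is an assumption about your listing logic): page 2 of the 180-item view covers items 181-360, while page 4 of the 60-item view covers items 181-240, so every item on the smaller page also appears on the larger one. A quick sketch:

```python
def page_items(page_size, page_number):
    """Return the (1-indexed) item positions shown on a given page."""
    start = (page_number - 1) * page_size + 1
    return range(start, start + page_size)

view_180_p2 = page_items(180, 2)  # items 181..360
view_60_p4 = page_items(60, 4)    # items 181..240

# Every item on page 4 of the 60-view also appears on page 2 of the 180-view
print(set(view_60_p4) <= set(view_180_p2))  # True
```

This is why the two URLs read as near-duplicates to a crawler even though each view has its own valid prev/next chain.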
-
I've been having endless conversations about this over the last few days and in conclusion I agree with everything you say - thanks for your excellent advice. On this particular site next/prev was not set up correctly, so I'm working on that right now.
-
Yes I agree totally - some wise words of caution - thanks.
-
thanks for the feedback - it is Umbraco.
-
To touch on your question about whether you should follow or nofollow links: if the pages in question could help with crawling in any fashion at all - despite being useless for their own sake - if they can be purposeful for the sake of other pages in terms of crawling and internal PageRank distribution, then I would "follow" them. Only if they are utterly useless for other pages too, and are excessively found throughout a crawl of the site, would I "nofollow" them. Ideally, these URLs wouldn't be found at all, as they are diluting internal PageRank.
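For reference, the "keep it out of the index but still pass crawling and link value" option discussed above is just a meta robots value; a minimal sketch:

```html
<!-- Keep the page out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```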
-
Luke,
Here's what I'd recommend doing:
- Lose the canonical tags; that's not the appropriate way to handle pagination
- Remove the disallow in the robots.txt file
- Add rel next/prev tags if you can; since parameter'd URLs are not separate pages, some CMSs are weird about adding tags to only certain parameter versions of a URL
- Configure those parameters in Search Console (the last item under the Crawl menu) - you can specify each parameter on the site and its purpose. You might find that some of these have already been established by Google; you can go in and edit those ones. You should configure your filtering parameters as well.
- You don't want to noindex these pages, for the same reason that you might not be able to add rel next/prev: you could risk that noindex tag applying to the root version of the URL instead of just the parameter version.
Google has gotten really good at identifying types of duplicate content due to things like paginated parameters, so they don't generally ding you for this kind of dupe.
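On the robots.txt point: the line to remove is the pattern from the original question. While it's in place, Google can't crawl those parameter'd URLs at all, so it never sees any canonical or rel next/prev tags placed on them:

```text
User-agent: *
# Remove this line so the parameter'd pages can be crawled
# and the rel tags on them actually read:
Disallow: /red-wines/?
```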
Related Questions
-
After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019; in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report. Many of them are still hack-related URLs listed as indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but I have no way of finding out from the Search Console reports why the jump happened or which new URLs were added - the only sort mechanism is "last crawled," and they don't show up there. How long can I expect it to take for these remaining URLs to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why/how the number of valid URLs in the index doubled over one weekend?
Intermediate & Advanced SEO | | rickyporco0 -
Remove Product & Category from URLS in Wordpress
Does anyone have experience removing /product/ and /product-category/, etc. from URLs in WordPress? I found this WooCommerce doc which explains that this shouldn't be done, but I would like some opinions from those who have tried it, please. https://docs.woocommerce.com/document/removing-product-product-category-or-shop-from-the-urls/
Intermediate & Advanced SEO | | moon-boots0 -
Pagination new pages vs parameters
I'm working on a site that currently handles pagination like this: cars-page?p=1, cars-page?p=2. In Webmaster Tools I can then tell that ?p= designates pagination. However, I have a plugin I want to add to fix other SEO issues; among those, it adds rel="prev" and rel="next", and it modifies the pagination to this: cars-page-1.html, cars-page-2.html. Notice I lost the parameter here, and now each page is a different page URL; pagination is no longer a parameter. I will no longer be able to specify the pagination parameter in Webmaster Tools. Would this confuse Google, as the pagination is no longer a parameter and there will now be multiple URLs instead of one page with parameters? My gut says this would be bad, as I haven't seen this approach often on ecommerce sites, but I wanted to see what the community thought?
Intermediate & Advanced SEO | | K-WINTER0 -
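If the switch to page-per-URL pagination described above goes ahead, one precaution worth considering (an assumption about the setup - the plugin may or may not handle this itself) is 301-redirecting the old ?p= URLs to their new equivalents, so any equity on the parameter URLs isn't lost. A minimal sketch of that mapping:

```python
from urllib.parse import urlparse, parse_qs

def new_pagination_url(old_url):
    """Map an old ?p=N pagination URL to its page-per-URL equivalent."""
    parts = urlparse(old_url)
    page = parse_qs(parts.query).get("p", ["1"])[0]  # default to page 1
    return f"{parts.path}-{page}.html"

print(new_pagination_url("/cars-page?p=2"))  # /cars-page-2.html
```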
Migrating to WooCommerce, similar product descriptions but with different urls, cant use variations.
Hi! I'm quite new to SEO and to WooCommerce, so please help out with this one. We are migrating from iThemes Exchange over to WooCommerce and I have come up with some issues. We are selling adhesives, and some of the products have the same name and description; the only thing that separates them is sometimes the width, or the length of the roll. As we have it now, we have a separate product page for each width and length. For example here http://siga-sverige.se/siga/fentrim-2-100/ and here http://siga-sverige.se/siga/fentrim-2-150/ The above product pages are for a product called Fentrim 2. It's available in widths from 75 to 300mm, so it's six different product pages with more or less the same description. I get that this will create duplicate content, because the descriptions on the pages are similar. We can't use variations in WooCommerce, because they can't be set up to exactly match our shipping needs, so we need them on separate pages. So, my plan is to set up a new product page for Fentrim 2, e.g. http://siga-sverige.se/siga/fentrim-2, and then set that URL as the canonical URL for the variations of the product. Am I on the right track? Grateful for any help on this one! / Jonas
Intermediate & Advanced SEO | | knubbz1 -
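For illustration, the canonical setup described in the plan above would be a tag on each width-variant page pointing at the consolidated page (a sketch, assuming the /fentrim-2 URL exists once created):

```html
<!-- On http://siga-sverige.se/siga/fentrim-2-100/, /fentrim-2-150/, etc. -->
<link rel="canonical" href="http://siga-sverige.se/siga/fentrim-2">
```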
Ranking on google search
Hello Mozzers - Moz's On-Page Grader shows an A grade for the particular URL, but my page is not ranking in the top 100 Google search results. Any help is appreciated. Thanks
Intermediate & Advanced SEO | | sobanadevi0 -
Weird 404 URL Problem - domain name being placed at end of urls
Hey there. For some reason, when doing crawl tests I'm finding pages with the domain name being tacked onto the end, causing 404 errors.
For example: http://domainname.com/page-name/http://domainname.com This is happening to all pages, posts, and even category-type pages. 1. Site is in WordPress 2. Using Yoast SEO plugin. Any suggestions? Thanks!
Intermediate & Advanced SEO | | Jay3280 -
To index search results or not?
In its webmaster guidelines, Google says not to index search results "that don't add much value for users coming from search engines." I've noticed several big brands index search results, and am wondering if it is generally OK to index search results with high engagement metrics (high PVPV, time on site, etc.). We have a database of content, and it seems one of the best ways to get this content into search engines would be to allow indexing of search results (to capture the long tail) rather than building thousands of static URLs. Have any smaller brands had success with allowing indexing of search results? Any best practices or recommendations?
Intermediate & Advanced SEO | | nicole.healthline0 -
Domain Issue
Starting a new local travel guide site. I would like to buy a domain and have found one with decent Domain Authority and Trust, but they want $2500 for the domain, which I feel is a bit steep since I will not be using any of the content and it is generating hardly any revenue now. I would rather not start from scratch with no links and no trust. I have a few questions... Any suggestions on sites to look for domains, or strategy for finding and offering to buy? Any guidelines on how to value domains? If I buy it and change registration, do I risk losing all the value? Could I just change technical contact info? Any other suggestions are welcome. Thanks.
Intermediate & Advanced SEO | | Reportcard0