Product search URLs with parameters and pagination issues - how should I deal with them?
-
Hello Mozzers - I am looking at a site that generates parameterized URLs (sadly unavoidable in the case of this website, given the resources available - none for redevelopment). They handle the URLs that include parameters via robots.txt, e.g. Disallow: /red-wines/?
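For clarity, the relevant robots.txt entry would look something like this (the wildcard form shown is one common way to catch every query string under that path - the exact syntax used on the site is an assumption on my part):

```text
User-agent: *
Disallow: /red-wines/?*
```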
Beyond that, they use rel=canonical on every paginated parameter page in search results (such as https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2).
I have never used this method on paginated "product results" pages. Surely this is an incorrect use of canonical, because these parameter pages are not simply duplicates of the main /red-wines/ page? Perhaps they are using it in case the robots.txt directive isn't followed (as sometimes happens), to guard against the indexing of some of the parameter pages?
I note that Rand Fishkin has commented on using "a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL," because "you'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages)." Yet I see this time and again on ecommerce sites, on paginated results - any idea why?
Now the way I'd deal with this is:
- Meta robots tags on the parameter pages I don't want indexed (noindex, nofollow - though this is not duplicate content, so perhaps I should use follow instead?)
- Use rel="next" and rel="prev" links on paginated pages - that should be enough.

Look forward to feedback, and thanks in advance, Luke
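For reference, the two approaches above would look something like this in the head of a paginated page (URLs are illustrative, borrowed from the example earlier in the thread; the pIndex=1 and pIndex=3 neighbours are assumptions):

```html
<!-- Option 1: keep the parameter page out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: declare the page's place in the paginated series (shown here for page 2) -->
<link rel="prev" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=1">
<link rel="next" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=3">
```

Note that Option 2 only works if the robots.txt disallow is lifted - blocked pages can't be crawled, so any tags on them go unseen.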
-
Hi Luke,
Have you configured your parameters in Search Console? Looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running a risk of getting dinged.
-
Hi Logan,
I've seen your responses on several threads now on pagination and they are spot on so I wanted to ask you my question. We're an eCommerce site and we're using the rel=next and rel=prev tags to avoid duplicate content issues. We've gotten rid of a lot of duplicate issues in the past this way but we recently changed our site. We now have the option to view 60 or 180 items at a time on a landing page which is causing more duplicate content issues.
For example, page 2 of the 180-item view is similar to page 4 of the 60-item view (URL examples below). Each view version has its own rel=next and prev tags. Wondering what we can do to resolve this issue, besides simply removing the 180- and 60-item view options.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
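To illustrate the two parallel series (the neighbouring page numbers here are assumptions for the example), each view size carries its own prev/next chain:

```html
<!-- 60-per-page series, page 4 -->
<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=3">
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=5">

<!-- 180-per-page series, page 2 -->
<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=1">
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=3">
```

The two chains never reference each other, which is why the engines treat them as independent series even though their content overlaps.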
Thoughts, ideas or suggestions are welcome. Thanks!
-
I've been having endless conversations about this over the last few days and in conclusion I agree with everything you say - thanks for your excellent advice. On this particular site next/prev was not set up correctly, so I'm working on that right now.
-
Yes I agree totally - some wise words of caution - thanks.
-
Thanks for the feedback - it is Umbraco.
-
To touch on your question about whether to follow or nofollow links: if the pages in question could help with crawling in any fashion at all - even if they are useless in their own right, they may still serve other pages in terms of crawling and internal PageRank distribution - then I would "follow" them. Only if they are utterly useless to other pages too, and appear excessively throughout a crawl of the site, would I "nofollow" them. Ideally, these URLs wouldn't be found at all, as they dilute internal PageRank.
-
Luke,
Here's what I'd recommend doing:
- Lose the canonical tags, that's not the appropriate way to handle pagination
- Remove the disallow in the robots.txt file
- Add rel next/prev tags if you can; since parameterized URLs are not separate pages, some CMSs are awkward about adding tags to only certain parameter versions of a URL
- Configure those parameters in Search Console (the last item under the Crawl menu) - you can specify each parameter on the site and its purpose. You might find that some of these have already been detected by Google; you can go in and edit those. You should configure your filtering parameters as well.
- Don't noindex these pages, for the same reason you might not be able to add rel next/prev: you risk the noindex tag applying to the root version of the URL instead of just the parameter version.
Google has gotten really good at identifying types of duplicate content due to things like paginated parameters, so they don't generally ding you for this kind of dupe.