Product search URLs with parameters and pagination issues - how should I deal with them?
-
Hello Mozzers - I am looking at a site that generates parameterised URLs (sadly unavoidable in the case of this website, given the resources they have available - none for redevelopment). They deal with the URLs that include parameters via robots.txt - e.g. Disallow: /red-wines/?
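To illustrate, the robots.txt rule they use looks roughly like this (my reconstruction of the pattern, not their exact file):

```
User-agent: *
# Block crawling of any /red-wines/ URL that carries a query string
Disallow: /red-wines/?
```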
Beyond that, they use rel=canonical on every paginated parameter page in search results (such as https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2).
I have never used this method on paginated "product results" pages. Surely this is an incorrect use of canonical, because these parameter pages are not simply duplicates of the main /red-wines/ page? Perhaps they are using it in case the robots.txt directive isn't followed - as it sometimes isn't - to guard against the indexing of some of the parameter pages?
I note that Rand Fishkin has warned against "a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL", because "you'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages)." Yet I see this time and again on ecommerce sites, on paginated results - any idea why?
Now the way I'd deal with this is:
Meta robots tags on the parameter pages I don't want indexed (noindex, nofollow - although this is not duplicate content, so perhaps I should use follow rather than nofollow?)
Use rel="next" and rel="prev" links on paginated pages - that should be enough.
Look forward to feedback and thanks in advance, Luke
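For context, the next/prev markup I have in mind would sit in the head of each paginated page - e.g. for page 2 of the results (domain and parameter names here are purely illustrative):

```html
<!-- In the <head> of /red-wines/?pIndex=2 -->
<link rel="prev" href="https://www.example.com/red-wines/?pIndex=1">
<link rel="next" href="https://www.example.com/red-wines/?pIndex=3">
```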
-
Hi Zack,
Have you configured your parameters in Search Console? Looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running a risk of getting dinged.
-
Hi Logan,
I've seen your responses on several threads now on pagination and they are spot on so I wanted to ask you my question. We're an eCommerce site and we're using the rel=next and rel=prev tags to avoid duplicate content issues. We've gotten rid of a lot of duplicate issues in the past this way but we recently changed our site. We now have the option to view 60 or 180 items at a time on a landing page which is causing more duplicate content issues.
For example, page 2 of the 180-item view is similar to page 4 of the 60-item view (URL examples below). Each view version has its own rel=next and rel=prev tags. Wondering what we can do to get rid of this issue, short of just removing the 60- and 180-item view options.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
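To illustrate the overlap, each view size currently carries its own prev/next chain - so the head of the first URL above looks something like this (reconstructed for illustration, not our exact markup):

```html
<!-- <head> of /gifts/for-the-couple?view=all&n=180&p=2 -->
<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=1">
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=3">
```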
Thoughts, ideas or suggestions are welcome. Thanks!
-
I've been having endless conversations about this over the last few days and in conclusion I agree with everything you say - thanks for your excellent advice. On this particular site next/prev was not set up correctly, so I'm working on that right now.
-
Yes I agree totally - some wise words of caution - thanks.
-
Thanks for the feedback - it is Umbraco.
-
To touch on your question about whether you should follow or nofollow links: if the pages in question could help with crawling in any fashion at all - even if they're useless in their own right - and can serve other pages in terms of crawling and internal PageRank distribution, then I would "follow" them. Only if they are utterly useless to other pages too, and turn up excessively throughout a crawl of the site, would I "nofollow" them. Ideally, these URLs wouldn't be found at all, as they dilute internal PageRank.
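To sketch the distinction (tag values only - where each belongs depends on the site in question):

```html
<!-- Kept out of the index, but its links still pass equity and aid crawling -->
<meta name="robots" content="noindex, follow">

<!-- Kept out of the index AND its links are not followed -->
<meta name="robots" content="noindex, nofollow">
```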
-
Luke,
Here's what I'd recommend doing:
- Lose the canonical tags, that's not the appropriate way to handle pagination
- Remove the disallow in the robots.txt file
- Add rel next/prev tags if you can; since parameterised URLs are not separate pages, some CMSs make it awkward to add tags to only certain parameter versions of a URL
- Configure those parameters in Search Console (the last item under the Crawl menu) - you can specify each parameter on the site and its purpose. You might find that some of these have already been identified by Google; you can go in and edit those ones. You should configure your filtering parameters as well.
- You don't want to noindex these pages, for the same reason that you might not be able to add rel next/prev: you risk the noindex tag applying to the root version of the URL instead of just the parameter version.
Google has gotten really good at identifying types of duplicate content due to things like paginated parameters, so they don't generally ding you for this kind of dupe.