Canonical pagination content
-
Hello
We have a large ecommerce site. As you're aware, ecommerce sites often have canonical issues. I've read various sources on canonical best practices for ecommerce sites, but I'm still not sure what to do.
My concern is pagination on category product listing pages. Each paginated page lists different products, but the meta data is the same. Should I canonicalize, say, page 2 or 3 to the main category page, or keep them as they are and let those pages be indexed?
Another issue is filters: when I'm on any page and filter by price or manufacturer, the page is basically the same, so it looks like a duplicate content issue. Should I canonicalize those filtered results to the category page?
So, would the best practice be to let Google crawl my paginated content and only canonicalize the pages produced by filtered searches? And would the parameter handling feature in Google Webmaster Tools be helpful in this scenario?
Please feel free to ask if you have any queries.
regards
Carl -
Google just announced some tags to help support pagination better. They say if you have a view all option that doesn't take too long to load, searchers generally prefer that, so you can rel=canonical to that page from your series pages. However, if you don't have a view all page, then you can put these nifty rel="next" and rel="prev" tags in to let Google know your page has pagination, and where the next and previous pages are.
View all: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
next/prev: http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
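To make that concrete, here is a sketch of the head markup for page 2 of a three-page series (the URLs are hypothetical examples, not from your site):

```html
<!-- <head> of page 2 in a hypothetical 3-page series -->
<link rel="prev" href="http://www.example.com/category?page=1">
<link rel="next" href="http://www.example.com/category?page=3">
```

Page 1 would carry only rel="next" and the last page only rel="prev". If you have a fast-loading view-all page instead, each page in the series would point rel="canonical" at that page rather than using next/prev.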
-
I checked your site, and I don't know whether you've already changed it or not, but it looks pretty good. I've dealt with much more hardcore cases: tons of products in each category, several filters that could be freely combined, and pagination on top of all that. Those had a lot of canonical issues, so your case is an easy ride, believe me.
Here are a few tips, along with the reasoning behind each:
1) cutting back your navigation on deeper pages
I just quickly checked how many links are included in your site-wide navigation with a Google Spreadsheet:
=ImportXML("http://www.cnmonline.co.uk/Dehumidifiers-c-778.html","//h6/a/@href")
And it got back 142 links. Whoa, that's a lot. That many links are included on every one of your pages, and the navigation is placed BEFORE your content. I had this very same issue with a client; they were hesitant to change the navigation, but eventually it helped them a lot.
The suggested solution:
- wipe out the drop-down menu links from deeper pages
- only link to the big categories: "Air Treatment", "Bathroom", ... "Cleaning Products"
- in the category you're in, link to its subcategories (without any JavaScript/CSS drop-down menu; just list them beneath the main category with a different background than dark blue). For example, if you are in the Bathroom category, your left navigation would look like:
- Air Treatment
- Bathroom
- Electric Showers
- Mirror Demister
- Bathroom Heaters
- Heated Towel Rails
- Catering Equipment
- ...
- Cleaning Products
This way you don't have to change much in your navigation, and it will make your internal linking more consistent. Furthermore, if a user wants to find another category, there is the search box, the main categories, and the breadcrumb. Which leads to the next suggestion:
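A minimal sketch of what that left navigation could look like in markup (the class names and URLs here are made up for illustration):

```html
<!-- Left navigation on a Bathroom page: top-level categories only,
     with the current category's subcategories listed inline -->
<ul class="side-nav">
  <li><a href="/Air-Treatment-c-1.html">Air Treatment</a></li>
  <li class="current">
    <a href="/Bathroom-c-2.html">Bathroom</a>
    <!-- plain nested list, no JavaScript/CSS drop-down -->
    <ul class="subcats">
      <li><a href="/Electric-Showers-c-3.html">Electric Showers</a></li>
      <li><a href="/Mirror-Demister-c-4.html">Mirror Demister</a></li>
      <li><a href="/Bathroom-Heaters-c-2320.html">Bathroom Heaters</a></li>
      <li><a href="/Heated-Towel-Rails-c-5.html">Heated Towel Rails</a></li>
    </ul>
  </li>
  <li><a href="/Catering-Equipment-c-6.html">Catering Equipment</a></li>
  <li><a href="/Cleaning-Products-c-7.html">Cleaning Products</a></li>
</ul>
```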
2) Make the breadcrumb look like a breadcrumb, not like a tab.
This is just a personal impression, but right now it looks more like a tab than a breadcrumb. A few things add up to that impression: the "item1 | item2 | item3" layout, the links not being underlined (so they don't look or feel like links), and the breadcrumb starting next to the left navigation rather than at the left edge of the site.
Suggested solution:
- move your breadcrumb to the far left side of your site, above your navigation box; you can align it with the left edge of the navigation box (it looks like there's 15px of padding from the left side of the white background)
- the text can be smaller, but make the links underlined so they look like links
- change the pipe ("|") character to a greater-than character (">"); that reads much more like a breadcrumb
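Something along these lines (a rough sketch; the class name and CSS values are illustrative only):

```html
<!-- Breadcrumb: left-aligned, smaller text, underlined links, ">" separators -->
<p class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/Bathroom-c-2.html">Bathroom</a> &gt;
  Bathroom Heaters
</p>

<style>
  .breadcrumb { padding-left: 15px; font-size: 0.85em; }
  .breadcrumb a { text-decoration: underline; }
</style>
```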
3) Make your pagination links followable, and set the paginated pages' meta robots to "noindex,follow"
At the moment you have nofollowed your pagination links, which results in lower indexation of your product pages than would otherwise be possible.
Eg:
- this is cached: http://webcache.googleusercontent.com/search?q=cache:www.cnmonline.co.uk/Bathroom-Products-c-2278.html&hl=en&strip=1
- but the 2nd page isn't: http://webcache.googleusercontent.com/search?q=cache%3Awww.cnmonline.co.uk%2FBathroom-Products-c-2278-p-2.html
- and what's even worse, though not surprising, this item on the second page isn't indexed: http://webcache.googleusercontent.com/search?q=cache%3Awww.cnmonline.co.uk%2FSavoy-Shawl-Collar-Bath-Robe-Box-of-5-pr-36295.html
Suggested solution:
- let Googlebot follow your pagination links: remove the rel="nofollow" attribute from them
- set the paginated pages' meta robots to "noindex,follow"
With this change Googlebot can follow through to your product pages, but won't index the paginated pages themselves. That's ideal, since you don't have to hassle with unique titles and descriptions, and the paginated pages are just lists; they don't add any value or give any reason to be indexed.
Of course, if you had pagination issues with reviews, it would be a whole different story, because then each paginated page would be valuable: they would be listing valuable user-generated content, not just linking to product pages. In that case, you might create unique titles and descriptions, at least by appending "page X".
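Concretely, the paginated pages would look something like this (a sketch; the second-page URL is the one from the cache example above):

```html
<!-- In the <head> of a paginated page such as
     /Bathroom-Products-c-2278-p-2.html -->
<meta name="robots" content="noindex,follow">

<!-- And in the pagination links, drop the nofollow: -->
<a href="/Bathroom-Products-c-2278-p-2.html">2</a>
```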
4) Your filters aren't causing duplicate content / canonical issues, since they work via AJAX and don't create any new URLs.
So you shouldn't change anything here, though I guess that doesn't surprise you. You can always verify this by using the 'cache:' operator in Google and selecting the text-only version. For example, search for "cache:http://www.cnmonline.co.uk/Bathroom-Heaters-c-2320.html", click the text-only version, and you'll see that Price Range and Manufacturer contain no links Google could follow, so there is no canonical problem.
Hope this helps.
-
Is the best method to canonicalize paginated pages to a "view all" page, including all the URLs produced by price and sorting filters? Would any other members like to share their opinions?
regards
Carl
-
View all! Of course... how did I not think of that before? Thank you.
-
Concerning pagination:
I would create a "view all" page where all the products in the category are listed, then add a rel=canonical on each paginated page pointing to the "View All" page.
This can help with both your first question and the filter issue.