Set Canonical for Paginated Content
-
Hi Guys,
This is a follow-up to this thread: http://moz.com/community/q/dynamic-url-parameters-woocommerce-create-404-errors#
I would like to know how I can set a canonical link in WordPress/WooCommerce that points to "View All" on the category pages of our webshop.
The categories on my website can be viewed as 24, 48 or All products, but because the quantity constantly changes, viewing 24 or 48 products isn't always possible. To point Google in the right direction I want to let them know that "View All" is the best way to go.
I've read that Google's crawler tries to do this automatically, but I'm not sure if this is the case on my website. Here is some more info on the issue: https://support.google.com/webmasters/answer/1663744?hl=en
Thanks for the help!
Joost
-
Joost - that's correct! Yes, I assume WooCommerce, since they are product pages.
-
Hi Dan,
Thanks for the explanation.
OK, so I block the 24 and 48 views for Google, but users can still use them to navigate through the site.
I assume this is WooCommerce-related because WooCommerce creates the output for the product pages, right? Thanks again!
Joost
-
Joost
I think you'll need to get a developer or someone involved to help execute, but here's the ideal scenario:
- Add meta "noindex" tags to ?show_products=24 and 48
- Ideally, make your 'view all' URL just /product-category/t-shirts/ - with no parameter - or, if you have to, maybe /t-shirts/all/. Your goal here is to keep it consistent and NOT to use the same parameter as the other pages
- Then, whatever consistent URL you have for the 'all' - don't add "noindex" to that (keep it indexable).
- Wait for Google to remove the 24/48 URLs from the index (you just have to check every week or two with site: searches)
- Once they have dropped out of the index, block crawling with this line in robots.txt:
Disallow: /*?show_products= <--- but ONLY use that if you've changed your 'view all' URLs to something else! You ideally want a different URL structure for 'view all' vs. not-view-all so you can control crawling and indexation more easily.
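To give whoever implements this a starting point, here's a minimal, untested sketch of the "noindex" step. It assumes a standard WordPress/WooCommerce setup where the 24/48 views are still reached via the ?show_products= parameter and the 'view all' page has already been moved to a parameter-free URL; if you're running Yoast, its wpseo_robots filter is another place to hook this in.

```php
// Illustrative sketch only - adjust to your own theme/plugin setup.
// Assumes the 24/48 views use the ?show_products= query parameter and
// that the "view all" page no longer uses that parameter.
add_action( 'wp_head', function () {
    if ( is_product_category() && isset( $_GET['show_products'] ) ) {
        // Keep the paginated 24/48 views out of the index, but let
        // Google still follow the product links on them.
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
} );
```

And remember: only add the robots.txt Disallow once those URLs have actually dropped out of the index - if you block crawling first, Googlebot can never see the noindex tag.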
-
Hi Dan,
Thanks for your reply.
For the category t-shirts I've got this:
/product-category/t-shirts/?show_products=24 (24)
/product-category/t-shirts/?show_products=48 (48)
/product-category/t-shirts/?show_products=41 (when All is selected)
Let me know! And thanks again for your time! Really appreciate it!
Joost
-
Hi Joost
Can you provide examples of how all your URLs are set up? What do the URLs look like for view all, 24 items, etc.?
-
Wow Dan!
Thanks for looking into this!
I assume you are totally right, but I have no idea how I should implement this strategy on my site. It's just a plain WordPress install with WooCommerce. I use Yoast (of course) but have never gone in-depth with robots.txt.
How can I provide you with more info? Or, better yet, how can I sort it out myself?
Thanks again,
Joost
-
Hi Joost
It would be better to just "noindex" anything except view all. Then, once those pages are gone from the index, set a block in robots.txt so they can't be crawled anymore. That fixes the issue at the source; the canonical is more of a band-aid. So:
1. Add a meta "noindex" tag to everything except view all (I'm not 100% sure how in your WordPress setup - there's no one way, it depends on your configuration). The example tag below shows what it looks like.
2. Monitor the indexation of these pages in Google and wait for them to be removed (you can check by just searching for the URL in the search bar).
3. Once they are all gone from the index, block crawlers from accessing them by adding a line to your robots.txt file that blocks the 24/48 URLs - again, I don't know the exact directive for your robots.txt because I'm unsure of your URL setup, but a dev or someone can help - or feel free to write back with these details and I'll try to help further.
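For reference, the tag in step 1 is just a single line in the <head> of each page you want kept out of the index, something like the line below (the "follow" part lets Google keep crawling through to the products on those pages):

```html
<meta name="robots" content="noindex,follow">
```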
-
Hi Patrick,
Thanks for helping out. I've read a lot about the theory behind View All and why and when it's better to set canonicals on pages 2 and 3 to View All.
But I can't seem to find any information on how to implement the rel canonical in WordPress/WooCommerce. I know that Google will try to sort it out by itself (if View All is available), but helping them with a canonical will solve a lot of 404 crawls on our site.
Any ideas?
Joost
-
Hi Joost
Did you happen to take a look at SEO Guide to Google Webmaster Recommendations for Pagination? There are some great tips in there that can help you implement this.
Google's own posts, View-all in search results and 5 common mistakes with rel=canonical, also have some tips.
Hope these help a bit! Let me know if you have any questions or comments! Good luck!
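If you do end up going the canonical route and are already running Yoast, here's a rough, untested sketch of one way it could be wired up. It assumes the 'view all' version is simply the category URL without the ?show_products= parameter mentioned above, and that Yoast's wpseo_canonical filter is available in your version.

```php
// Rough sketch only - assumes Yoast SEO is active (it provides the
// wpseo_canonical filter) and that "view all" is the plain category URL
// without the ?show_products= parameter.
add_filter( 'wpseo_canonical', function ( $canonical ) {
    if ( is_product_category() && isset( $_GET['show_products'] ) ) {
        $view_all = get_term_link( get_queried_object() );
        if ( ! is_wp_error( $view_all ) ) {
            // Point the 24/48 views at the parameter-free "view all" URL.
            $canonical = $view_all;
        }
    }
    return $canonical;
} );
```

One caveat: don't combine a noindex tag and a canonical to 'view all' on the same page - they send Google conflicting signals, so pick one approach per URL.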