Set Canonical for Paginated Content
-
Hi Guys,
This is a follow up on this thread: http://moz.com/community/q/dynamic-url-parameters-woocommerce-create-404-errors#
I would like to know how I can set a canonical link in WordPress/WooCommerce which points to "View All" on category pages on our webshop.
The categories on my website can be viewed as 24/48 or All products, but because the quantity constantly changes, viewing 24 or 48 products isn't always possible. To point Google in the right direction I want to let them know that "View All" is the best way to go.
I've read that Google's crawler tries to do this automatically, but I'm not sure if this is the case on my website. Here is some more info on the issue: https://support.google.com/webmasters/answer/1663744?hl=en
Thanks for the help!
Joost
-
Joost - that's correct! Yes, I assume it's WooCommerce, since those are product pages.
-
Hi Dan,
Thanks for the explanation.
Ok so I block 24 and 48 for Google but users can still use them to navigate through the site.
I assume this is WooCommerce related because WooCommerce creates the output for the product pages, right?
Thanks again!
Joost
-
Joost
I think you'll need to get a developer or someone involved to help execute, but here's the ideal scenario:
- Add meta "noindex" tags to ?show_products=24 and 48
- Make your 'view all' URL ideally just /product-category/t-shirts/ - with no parameter - or if you have to, maybe /t-shirts/all/ - your goal here is to keep it consistent and NOT the same parameter as the other pages
- Then, whatever consistent URL you have for the 'all' - don't add "noindex" to that (keep it indexable).
- Wait for Google to remove 24/48 URLs from the index (you have to just check every week or two with site: searches)
- Once they are noindexed, block crawling by adding this line to robots.txt:
Disallow: /?show_products=
...but ONLY use that if you've changed your 'view all' URLs to something else! You ideally want a different URL structure for 'view all' vs. not view all to control crawling and indexation more easily.
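For reference, a full robots.txt along those lines might look like the sketch below. One assumption worth flagging: since ?show_products= appears after the category path rather than at the site root, Google's pattern matching needs a * wildcard to catch it on every category URL.

```text
User-agent: *
# Block the paginated 24/48 views once they have dropped out of the index.
# The leading * wildcard matches any path before the query string,
# e.g. /product-category/t-shirts/?show_products=24
Disallow: /*?show_products=
```

Again, only do this after the 'view all' pages live at a different, parameter-free URL, or you'd block those too.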
-
Hi Dan,
Thanks for your reply.
For the category t-shirts I've got this:
/product-category/t-shirts/?show_products=24 (24)
/product-category/t-shirts/?show_products=48 (48)
/product-category/t-shirts/?show_products=41 (when All is selected)
Let me know! And thanks again for your time! Really appreciate it!
Joost
-
Hi Joost
Can you provide examples of how all your URLs are set up? What do the URLs look like for view all, 24 items, etc.?
-
Wow Dan!
Thanks for looking into this!
I assume you are totally right, but I have no idea how I should implement this strategy on my site. It's just a plain WordPress install with WooCommerce. I use Yoast (of course) but never went in-depth with robots.txt.
How can I provide you with more info? Or better: where can I find this out myself?
Thanks again,
Joost
-
Hi Joost
It would be better to just "noindex" anything except view all. Then, once they are gone from the index, set a block in robots.txt so they can't be crawled anymore. That fixes the issue at the source; the canonical is more of a band-aid. So:
1. Add a meta "noindex" tag to everything except view all (I am not 100% sure how in your WordPress setup - there's no one way, it depends on your setup).
2. Monitor the indexation of these pages in Google and wait for them to be removed (you can check by just searching for the URL in the search bar).
3. Once they are all gone from the index, block crawlers from accessing them by adding a line to your robots.txt file blocking the 24/48 URLs. Again, I don't know the exact code for your robots.txt because I am unsure of your URL setup, but a dev or someone can help - or feel free to write back with these details and I'll try to help further.
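To make step 1 a bit more concrete, here's one minimal sketch of how the noindex tag could be output in a stock WordPress/WooCommerce setup. This is an assumption, not a drop-in fix: it presumes the pagination parameter is ?show_products= (per the URLs in this thread) and that 'view all' has been moved to the bare category URL without that parameter, as recommended elsewhere in the thread. It would go in the active theme's functions.php.

```php
<?php
// Hypothetical sketch for functions.php: output a robots noindex tag
// on WooCommerce category views that carry the ?show_products= parameter.
// Assumes "view all" is the parameter-free category URL, so it stays indexable.
add_action( 'wp_head', function () {
    if ( is_product_category() && isset( $_GET['show_products'] ) ) {
        echo '<meta name="robots" content="noindex, follow">' . "\n";
    }
} );
```

"noindex, follow" keeps link equity flowing through the paginated pages while they drop out of the index; a developer should verify no other plugin (e.g. Yoast) is already emitting a competing robots tag.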
-
Hi Patrick,
Thanks for helping out. I've read a lot about the theory behind View All and why & when it's better to set canonicals on page 2 and 3 to View All.
But I can't seem to find any information on how to implement the rel canonical in WordPress/WooCommerce. I know that Google will try to sort it out by itself (if View All is available), but helping them with a canonical will solve a lot of 404 crawls on our site.
Any ideas?
Joost
-
Hi Joost
Did you happen to take a look at SEO Guide to Google Webmaster Recommendations for Pagination? There are some great tips in there that can help you implement this.
Also, View-all in search results & 5 common mistakes with rel=canonical from Google have some tips.
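On the original implementation question: since Joost mentions running Yoast, one hedged sketch is to filter the canonical Yoast outputs so paginated category views point at the parameter-free category URL. Assumptions here: 'wpseo_canonical' is Yoast SEO's filter for its canonical tag, remove_query_arg() is WordPress core, and 'view all' is assumed to be the bare category URL (which is not quite Joost's current setup, where All is ?show_products=41, so the URLs would need restructuring first).

```php
<?php
// Hypothetical sketch: point paginated WooCommerce category views at the
// parameter-free category URL as their canonical.
// Assumes Yoast SEO is active ('wpseo_canonical' is its filter) and that
// "view all" is the bare category URL with no ?show_products= parameter.
add_filter( 'wpseo_canonical', function ( $canonical ) {
    if ( is_product_category() && isset( $_GET['show_products'] ) ) {
        // Strip the pagination parameter so the canonical is the clean URL.
        $canonical = remove_query_arg( 'show_products', $canonical );
    }
    return $canonical;
} );
```

Worth noting that Dan's noindex-then-block approach above addresses the same problem at the source; this canonical sketch is the band-aid variant Joost originally asked about.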
Hope these help a bit! Let me know if you have any questions or comments! Good luck!