Set Canonical for Paginated Content
-
Hi Guys,
This is a follow up on this thread: http://moz.com/community/q/dynamic-url-parameters-woocommerce-create-404-errors#
I would like to know how I can set a canonical link in WordPress/WooCommerce that points to "View All" on the category pages of our webshop.
The categories on my website can be viewed as 24/48 or All products, but because the quantity constantly changes, viewing 24 or 48 products isn't always possible. To point Google in the right direction I want to let them know that "View All" is the best way to go.
I've read that Google's crawler tries to do this automatically, but I'm not sure if this is the case on my website. Here is some more info on the issue: https://support.google.com/webmasters/answer/1663744?hl=en
Thanks for the help!
Joost
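For reference, the tag I mean is the standard canonical link in the page head - something like this (the URL here is just an illustration, not from my actual site):

```html
<!-- On /product-category/t-shirts/?show_products=24 (and the 48 version),
     point search engines at the "View All" version of the category -->
<link rel="canonical" href="https://example.com/product-category/t-shirts/?show_products=all">
```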
-
Joost - that's correct! Yes, I assume WooCommerce, since they are product pages.
-
Hi Dan,
Thanks for the explanation.
OK, so I block 24 and 48 for Google, but users can still use them to navigate through the site.
I assume this is WooCommerce related because WooCommerce creates the output for the product pages, right? Thanks again!
Joost
-
Joost
I think you'll need to get a developer or someone involved to help execute, but here's the ideal scenario:
- Add meta "noindex" tags to ?show_products=24 and 48
- Make your 'view all' URL ideally just /product-category/t-shirts/ - with no parameter - or if you have to, maybe /t-shirts/all/ - your goal here is to keep it consistent and NOT the same parameter as the other pages
- Then, whatever consistent URL you have for the 'all' - don't add "noindex" to that (keep it indexable).
- Wait for Google to remove 24/48 URLs from the index (you have to just check every week or two with site: searches)
- Once they are noindexed, block crawling with robots.txt with this line:
Disallow: /?show_products= <---but ONLY use that if you've changed your 'view all' URLs to something else! You ideally want a different URL structure for 'view all' vs. not view all to control crawling and indexation more easily.
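As a rough sketch, assuming you've moved 'view all' off the show_products parameter, the robots.txt could look like this (note the * wildcard - a plain /?show_products= would only match the parameter on your homepage, since Disallow rules match from the start of the URL path):

```
# Sketch only - adjust to your final URL structure before using.
User-agent: *
Disallow: /*?show_products=
```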
-
Hi Dan,
Thanks for your reply.
For the t-shirt category I've got this:
/product-category/t-shirts/?show_products=24 (24)
/product-category/t-shirts/?show_products=48 (48)
/product-category/t-shirts/?show_products=41 (when All is selected)
Let me know! And thanks again for your time! Really appreciate it!
Joost
-
Hi Joost
Can you provide examples of how all your URLs are set up? What do the URLs look like for view all, 24 items, etc.?
-
Wow Dan!
Thanks for looking in to this!
I assume you are totally right, but I have no idea how I should implement this strategy on my site. It's just a plain WordPress install with WooCommerce. I use Yoast (of course) but never went in-depth with robots.txt.
How can I provide you with more info? Or better: how could I do it myself?
Thanks again,
Joost
-
Hi Joost
It would be better to just "noindex" anything except view all. Then, once they are gone from the index, set a block in robots.txt so they can't be crawled anymore. That fixes the issue at the source; the canonical is more of a band-aid. So:
1. Add a meta "noindex" tag to everything except view all (I am not 100% sure how in your WordPress setup - there's no one way, it depends on your configuration).
2. Monitor the indexation of these pages in Google and wait for them to be removed (you can check with just searching for the URL in the search bar).
3. Once they are all gone from the index, block crawlers from accessing them by adding a line to your robots.txt file blocking the 24/48 URLs - again, I don't know the exact code for your robots.txt because I am unsure of your URL setup, but a dev or someone can help - or feel free to write back with these details and I'll try to help further.
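To make step 1 concrete, here's a rough, untested sketch of what that could look like in a theme's functions.php - I'm assuming your pagination uses a show_products URL parameter and that WooCommerce's is_product_category() conditional applies to these pages, so treat this as a starting point for your developer, not drop-in code:

```php
<?php
// Sketch: output a noindex tag on the 24/48 category views only.
// The 'show_products' parameter and its values are assumptions
// about this particular shop's setup.
add_action( 'wp_head', function () {
    if ( function_exists( 'is_product_category' ) && is_product_category()
        && isset( $_GET['show_products'] )
        && in_array( $_GET['show_products'], array( '24', '48' ), true ) ) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
}, 1 );
```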
-
Hi Patrick,
Thanks for helping out. I've read a lot about the theory behind View All and why & when it's better to set canonicals on page 2 and 3 to View All.
But I can't seem to find any information on how to implement rel canonical in WordPress/WooCommerce. I know that Google will try to sort it out by itself (if View All is available), but helping them with a canonical will solve a lot of 404 crawls on our site.
Any ideas?
Joost
-
Hi Joost
Did you happen to take a look at SEO Guide to Google Webmaster Recommendations for Pagination? There are some great tips in there that can help you implement this.
Also, View-all in search results & 5 common mistakes with rel=canonical from Google also has some tips.
Hope these help a bit! Let me know if you have any questions or comments! Good luck!
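If you end up setting the canonical in code, Yoast SEO also exposes a wpseo_canonical filter you can hook into. A rough, untested sketch (the show_products parameter is assumed from your other posts - check it against your own setup before relying on it):

```php
<?php
// Sketch: strip the pagination parameter so the canonical for the
// 24/48 views becomes the plain category URL.
add_filter( 'wpseo_canonical', function ( $canonical ) {
    if ( function_exists( 'is_product_category' ) && is_product_category() ) {
        $canonical = remove_query_arg( 'show_products', $canonical );
    }
    return $canonical;
} );
```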