Is it best practice to have canonical tags on all pages?
-
The website I'm working on has no canonical tags. There is duplicate content, so rel=canonical tags need to be added to certain pages, but is it best practice to have a tag on every page?
-
ColesNathan,
Have you seen what Google has to say about canonicals? https://support.google.com/webmasters/answer/139066?hl=en
You might find it helpful. They list reasons why you might want to use a canonical tag, including those identified above and a few others, such as letting Google know your priorities when it comes to crawl budget and SERP display.
Canonicals can also help counter plagiarism. If scrapers leave your self-referencing canonical intact, it tells Google that you are the originator of that content and consolidates link signals into your URL.
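As a minimal sketch (the URL here is made up), a self-referencing canonical for a page living at https://www.example.com/blue-widgets/ is just a link element in that page's head pointing at its own address:
<head>
  <link rel="canonical" href="https://www.example.com/blue-widgets/" />
</head>
If a scraper lifts your markup wholesale, that tag still points back to your URL.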
-
Correct, I would usually advise adding a self-referencing canonical tag to make it easier for audits and for search engines to understand what the actual content of the page is.
-
Hi, thanks for getting back to me. OK, so you don't need a canonical on every page unless it's required.
However, for tracking purposes, it is good practice to have one set up on all pages?
Have I got that right?
Nathan
-
Hi there!
It is a best practice as long as you have a CMS or any other system that allows you to control them. In the case of WordPress with Yoast, it is pretty easy to set up.
Why would you want a canonical on every page? It is useful when you have different campaigns and you use URL parameters for tracking (the common utm_ parameters from Google Analytics), and that is correctly taken care of by WordPress + Yoast. If you find that you don't need canonicals on any page, then don't use them. Just be sure that whenever parameters or duplicate content are generated, there is a canonical in place.
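As a rough illustration (the URLs here are invented), a page reached through a tracked campaign link such as https://www.example.com/summer-offer/?utm_source=newsletter&utm_medium=email would keep a canonical in its head pointing back to the clean URL, so the parameterised variant never competes with it in the index:
<head>
  <link rel="canonical" href="https://www.example.com/summer-offer/" />
</head>
Yoast and most other SEO plugins output this kind of self-referencing tag automatically.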
Hope I've helped.
Best of luck.
GR