Is your live site supposed to have rel canonical tags?
-
I recently started working for a company and got them to use Moz, and I have found that our secure site and our live site are creating "duplicate content" according to the Crawl Diagnostics feature. On our secure site we have rel canonical tags pointing to our live site. I'm not super familiar with rel canonical tags, but our developer says we're doing the right thing. I would love any insight you guys may have on whether this is actually duplicate content or not. Thanks so much!
-
Agree with Dave's comments. 1) Get the syntax updated on your canonical links at a minimum. 2) Yes, your canonical solution will "work," but it is not best practice. This "solution" is really a last resort. I would try to push to move away from using canonicals this way. Optimally, you want one URL per piece of content.
Just to add some color, a great, classic video on this was made by Matt Cutts. He gives all kinds of examples where you could end up with duplicate URLs: www vs. non-www subdomains, sorting parameters appended to the URL, different file extensions, capitalization changes, etc. He then gives three options for fixing them.
-
Best practice: Fix your site so that you have only one URL per content item, and link to it consistently (best solution)
-
Use 301 redirects to consolidate to one URL (Next best solution)
-
Use a canonical link, if you cannot do 1 or 2. (Last resort)
Note that Matt says Google treats a canonical as a strong suggestion (it is handled similarly to a 301), but they do not always have to follow it. He repeatedly says to use the first two options, and would NOT recommend a canonical as your best or first option.
My favorite quote is at 2:24 in the video, "Developers keep SEOs in business"
What your developer may notice is that Matt does say that using a canonical link to consolidate http and https will work. No one here would say that it would not; it is just not optimal. Sure, you can use a pair of scissors to cut your lawn, and "it will work," but that doesn't mean it's the best idea. I would think any developer worth his/her salt would want "clean code," and having duplicate URLs is not "clean" by SEO standards.
Ok, so now you need to go back to the developer or your manager with an argument that is stronger than, "Well, some random dude on the Moz forum said that Matt Cutts from Google said it was preferred not to use a canonical link even though it would work." I would never want to leave you in such a position. Here is what can happen over time if you stay with your current setup.
-
Report consolidation issues. When you look at GA for traffic, OSE for links, any spidering tool for technical issues, or social sharing counts, you now potentially have split data for any given page. Sure, there are ways around this, but now you have to spend your time "fixing" reports that should not be broken to start with. Trust me, this will come back to bite you on the bum and will cripple your efforts to show the efficacy of your SEO work. Who really wants that?
-
Link juice consolidation issues. With any redirect you lose a bit of link juice. If you have links pointing to both sets of URLs, no single page is getting as much credit as it should.
-
Down-the-line 301 redirect bloat. If you ever change anything and need to set up a 301 redirect, you now have to set up two of them, and having too many 301s can negatively impact server performance.
One last thing: if you can get the URLs consolidated into one using 301s etc., go with https. That is the way the web is headed, so you might as well get going in that direction.
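If you do go the 301 route, here is a minimal sketch of what the server-side rules could look like, assuming an Apache server with mod_rewrite enabled (www.example.com is a placeholder for your live host, not a URL from this thread):

```apache
# Permanently (301) redirect all http:// traffic to the https:// equivalent
# on the canonical www host.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

# Collapse the bare https domain onto the www host as well, so every
# piece of content lives at exactly one URL.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

You can check a rule is working with `curl -I http://example.com/some-page` and looking for a `301 Moved Permanently` status plus a `Location:` header pointing at the single https URL.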
Good luck!
-
-
I really appreciate the response and the added information. I guess we will see if anyone else responds!
-
I'd be interested in hearing what someone else has to say about the way the canonicals are coded. You're doing yours similar to the way I do DNS prefetching, with a double slash (a protocol-relative URL) at the start of the href, something like this (placeholder domain):

`<link rel="canonical" href="//www.example.com/page" />`

That works fine with prefetching, as all the browser needs to do is find the IP of the domain, but I'm not sure how it'll handle subdirectories, including www, and I hate variables even when they're "it should work." The more common way to canonicalize your secured page would be a full absolute URL:

`<link rel="canonical" href="https://www.example.com/page" />`
I'd be interested to hear if anyone has direct experience with this, but on core technical SEO issues I always lean toward "most common usage" and "how Google shows it in their examples," just to make sure there is minimal chance of hiccups or issues.
That aside, the developer is right, though I'd always still prefer to just see the pages at a single URL. Since that can't be done, however... canonicals are the way to go.
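If you want to sanity-check which canonical a given page is actually serving, here is a quick sketch using only Python's standard library (the URL and helper names are illustrative, not from this thread):

```python
from html.parser import HTMLParser


class CanonicalExtractor(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        # <link ... /> self-closing tags are routed here by HTMLParser too.
        if tag == "link" and self.canonical is None:
            attr_map = dict(attrs)
            if (attr_map.get("rel") or "").lower() == "canonical":
                self.canonical = attr_map.get("href")


def extract_canonical(html):
    """Return the canonical href from an HTML document, or None if absent."""
    parser = CanonicalExtractor()
    parser.feed(html)
    return parser.canonical


# The secure page's markup should point back at the single live URL.
secure_page = (
    '<html><head>'
    '<link rel="canonical" href="http://www.example.com/blog/post" />'
    '</head><body>...</body></html>'
)
print(extract_canonical(secure_page))  # -> http://www.example.com/blog/post
```

In practice you would fetch both the secure and live versions of a page (e.g. with urllib) and run each body through this; if both report the same canonical href, the crawler's duplicate-content warnings are the expected, handled kind.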
-
That is correct! Here is an example of two URLs showing what I'm talking about:

http://www.agroup.com/blog/5-signs-of-a-good-clientagency-relatoinship

https://agrouptt4.secure2.agroup.com/blog/5-signs-of-a-good-clientagency-relatoinship

Does this help clarify my question? I hope so!
-
I'm not sure I entirely understand the scenario so let me note how I'm hearing it to make sure my understanding is correct to put the answer into context. Please do let me know if my understanding of the scenario is wrong as that may well change my thoughts on it.
You note that your secure site and live site are creating duplicate content. Of course a secure site can be live, but I'm taking this to mean you have an area behind a login. The fact that it's creating duplicate content makes me think that a lot of the core information is the same, and I'm guessing many of the pages are too.
If this is all correct and you can't put the duplicated pages onto one URL only then the canonicals are the way to go and your developer is correct.