How far can I push rel=canonical?
-
My plan: three sites sharing identical content, with a twist. For every article whose topic is A, the pages on all three sites publishing that article will carry a rel=canonical tag pointing to the copy on Site A. For every article whose topic is B, the rel=canonical on all three copies will point to Site B.
So Site A will have articles about topics A, B, and C. For its pages with articles about A, the rel=canonical will be self-referencing, pointing to the page it's on. But for its pages with articles about B, the rel=canonical will point to the version of that article on Site B. And so on.
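To make the setup concrete, here is roughly what the two copies of a topic-B article would carry in their &lt;head&gt; (domain names and paths are placeholders, not from the actual sites):

```html
<!-- On https://site-a.example/topic-b-article (a non-canonical copy):
     the canonical points cross-domain at Site B's copy of the same article. -->
<link rel="canonical" href="https://site-b.example/topic-b-article" />

<!-- On https://site-b.example/topic-b-article (the canonical copy):
     a self-referencing canonical. -->
<link rel="canonical" href="https://site-b.example/topic-b-article" />
```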
I have my reasons for planning this, but you can see more or less that I want each site to rank for its niche, yet I want the users at each site to have access to the full spectrum of articles in the shared articles database without having to leave a given site.
These would be distinct brands with distinct Whois, directory listings, etc. etc.
The content is quality and unique to our company.
-
I think I'd start slowly in that case. Keep the relationship aspect in mind, too. Even if all three companies know the writer/client and are aware of the relationship, sooner or later one of these articles is going to take off. If one site gets the SEO credit and the other two sites aren't ranking, there may be friction. Even if the work is spread out evenly and all high-quality, you don't control (ultimately) what content finally sticks and is successful. I just think things could get weird all-around if you send every article three places and only one gets credit.
-
These are technically different companies with different products, all in the securities industry. Each was founded by a different group of individuals; however, my client is common among them, and he happens to be a fantastic writer. Many of the articles would add value to the readers of some of the other sites. I am hoping to develop a common command center so that, in the editor for a given article, he can simply check off which of his sites the article will be published on and which version is to be considered canonical. So the sites will have different aesthetics, navigation, product pages, and other company-specific content, and not every article will show up on every site, but many will show up on multiple sites.
The idea of phasing in common articles with the cross-domain canonical strikes me as wise, and then just noindexing the non-canonical versions if I run into trouble.
-
Ah, understood. So, yes, in theory cross-domain canonical does handle this. I know major newspapers that use it for true syndication. There is risk, though, depending on the sites and content, and there is a chance Google will ignore it (more so than an in-domain canonical). So, I mostly wanted you to be aware of those risks.
META NOINDEX is safer, in some respects (Google is more likely to honor it), but if people start linking to multiple versions of the content, then you may lose the value of those inbound links on the NOINDEX'ed content. Since it's not showing up in search results, that's less likely (in other words, people are going to be most inclined to link to the canonical version), but it's a consideration.
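For reference, the META NOINDEX option being discussed is a robots meta tag placed in the &lt;head&gt; of each non-canonical copy. The "follow" value matters here: it keeps the links on the page crawlable even though the page itself stays out of the index.

```html
<!-- Non-canonical copy: keep it out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```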
It's really tough to give a recommendation without understanding the business model, but if you absolutely have to have separate sites and you feel that this content is valuable to the visitors of all three sites, then cross-domain canonical is an option. It's just not risk-free. Personally, I'd probably start with unique content across the three domains, then phase in the most useful pieces as duplicates with canonical. Measure and see how it goes. Don't launch 1,000 duplicates on three sites in one day.
-
Budget not an issue, although skilled labor is.
-
Very helpful, thank you!
There is in fact a legal reason why the sites must be distinct from each other and strong marketing reasons why we do need more than one site.
I should mention that although the pages hosting the shared articles will be 99% identical, each site will have other content distinct from the others.
I am open to dropping my idea to share an article database between the sites and just having unique content on each, although I have to wonder what the use of cross-domain canonical is, if not to support this kind of article syndication.
-
Completely agree with Dr. Pete. If you really need to separate those domains, there should be a really good reason.
In the past I ran many EMD (exact-match domain) sites to get easy traffic thanks to the domain-name boost in the SERPs, so those sites ranked without much effort, but since Google shifted toward favoring brands this kind of strategy has become really time- and money-consuming.
It depends on how much budget you can spend on those sites, but normally consolidating the value into one bigger site is the best way to build a brand and earn links and rankings nowadays.
-
I tend to agree - you always run the risk with cross-domain canonical that Google might not honor it, and then you've got a major duplicate content problem on your hands.
I think there's a simpler reason, in most cases, though. Three unique sites/brands take 3X (or more, in practice) the time and energy to promote, build links to, build social accounts for, etc. That split effort, especially on the SEO side, can far outweigh the brand benefits, unless you have solid resources to invest (read that "$$$").
To be fair, I don't know your strategy/niche, but I've just found that to be true 95% of the time in these cases. Most of the time, I think building sub-brands on sub-folders within the main site and only having one of each product page is a better bet. The other advantage is that users can see the larger brand (it lends credibility) and can move between brands if one isn't a good match.
The exception would be if there's some clear legal or competitive reason the brands can't be publicly associated. In most cases, though, that's going to come with a lot of headaches.
-
Hi all, I think your alternatives would be:
- One big site with all the topics. That way all users can access all content without leaving the site, and you need neither noindex nor canonicals since you won't have duplicate content.
- Three sites with specialized articles on each. You could vary the design slightly to give users the feeling that each site is different but part of the same network, and then interlink the sites as useful resources. Not optimal, since they'll have heavy interlinking.
- As you said, noindex the non-canonical articles. Remember that the noindex tag prevents indexing, not crawling: Google still has to crawl the page to discover that it shouldn't index it. So you could add meta "noindex,noarchive,follow" in the head and be sure the juice keeps flowing through your site.
-
Hmm, ok that's helpful.
The content would be identical with the possible exceptions of a very slightly different meta title and site footer.
What's my alternative to a setup like this? One site, one brand? Noindex the non-canonical article versions?
What I dislike about noindex is that it means inbound links to the non-canonical article versions bring me no benefit.
-
I believe you are playing with fire here... to me this looks like you are trying to manipulate search engines.
If you read the article "About rel=canonical" on Google Webmasters Support, you will see they say the rel="canonical" link element is treated as a hint, not an absolute directive.
Also, in the same article they specify that rel="canonical" should be used on pages with identical content. Are you sure that in your case the pages have identical content overall, or just identical articles?
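For what it's worth, a rollout like this is easy to sanity-check with a small script. The sketch below is my own illustration, not something from the thread: given a page's HTML, it reports which URL the page declares as canonical, so you can spot-check that each syndicated copy points where you intended. It uses only the Python standard library; the example URL is made up.

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> in the markup."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if a.get("rel", "").lower() == "canonical" and a.get("href"):
                self.canonicals.append(a["href"])


def find_canonical(html):
    """Return the declared canonical URL, or None if it's absent or ambiguous."""
    parser = CanonicalFinder()
    parser.feed(html)
    # Zero or multiple canonical tags are both problems worth flagging,
    # so only a single unambiguous declaration is returned.
    return parser.canonicals[0] if len(parser.canonicals) == 1 else None
```

You'd feed this the fetched HTML of each copy of an article across the three sites and confirm they all agree on one canonical URL.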