Cross-Domain Canonical and Duplicate Content
-
Hi Moz fans!
I'm working on SEO for one of my new clients, a job site (I'll call it Site A).
The thing is, the client has about three sites with the same jobs on them. I've pointed out the duplicate content problem, but the jobs on the other sites must stay where they are; the client doesn't want to remove them, for another (non-ranking) reason.
Can I solve the duplicate content problem with a cross-domain canonical?
The client wants to rank well with the site I'm working on (Site A). Thanks!
Rand did a Whiteboard Friday about the cross-domain canonical:
http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
-
Every document I have seen agrees that canonical tags are followed when the tag is used appropriately.
The tag could be misused, either intentionally or unintentionally, in which case it would not be honored. The tag is meant to connect pages which offer identical information, very similar information, or the same information presented in a different format, such as a modified sort order or a print version. I have never seen nor even heard of an instance where a properly used canonical tag was not respected by Google or Bing.
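For instance, a print-friendly version of a page might point back to the primary version with something like this in its <head> (the URL is just a placeholder):
```
<!-- On the print-friendly version of the page -->
<link rel="canonical" href="http://www.example.com/blue-widgets" />
```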
-
Thanks Ryan, I hadn't noticed that about the reply sequencing, and you're right, I read them in the wrong order. It makes much more sense now.
By "some" support, I meant that even Google, via Matt Cutts, says they don't take the cross-domain canonical as "a directive" but rather as a "hint" (and even that assumes Google agrees with you that your pages are duplicates).
So the magic question is: how much authority do Bing and Google give rel="canonical", and is it similar between the two engines?
-
One aspect of the SEOmoz Q&A structure I dislike is the ordering of responses. Rather than maintaining a timeline order, the responses are re-ordered based on other factors such as thumbs-up and staff endorsements. I understand the concept that replies which are liked more are probably more helpful and should be seen first, but it causes confusion such as in this case.
Dr. Pete's response on the Bing cross-domain canonical topic appears first, but it was offered second-to-last chronologically. We originally agreed there was no evidence indicating Bing supported the cross-domain canonical tag; then he located such evidence, and therefore we now agree Bing does support the tag.
The statement Dr. Pete shared was that "Bing does support cross-domain canonical". There was no limiting factor. I mention this because you said they offered "some" support, and I am not sure why you used that qualifier.
-
Ryan, at the end of the thread you linked to, it seems like both Dr. Pete and you agreed that there wasn't much evidence of Bing support. Have you learned something that changed your mind?
I know a rep from Bing told Dr. Pete there was "some" support, but what does that mean? i.e., do exactly identical sites pass a little juice/authority, or do similar sites pass **a lot** of juice/authority?
Take a product that has different brands in different parts of the country: Hellmann's and Best Foods, for example. They have two sites which are the same except for logos. Here is a recipe from each site:
http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1
http://www.hellmanns.com/recipe_detail.aspx?RecipeID=12497&version=1
The sites are nearly identical except for logos/product names.
For the (very) long tail keyword "Mayonnaise Bobby Flay Waldorf salad wrap", Best Foods ranks #5 and Hellmann's ranks #11.
I doubt they have an SEO looking very closely at the sites, because in addition to their duplicate content problem, neither page has a meta description.
If the Hellmann's page had a
<link rel="canonical" href="http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1" />
I'd expect to see the Best Foods page move up and the Hellmann's page move down in Google. But Bing appears not to like the duplicate pages as much: currently the Best Foods version ranks #12 and the Hellmann's version doesn't rank at all. My own (imperfect) tests lead me to believe that adding the rel="canonical" would help in Google but not in Bing.
Obviously, the site owner would probably like one of those two pages to rank very high for the unbranded keyword, but they would want both pages to rank well if I added a branded term. My experience with the cross-domain canonical in Google leads me to believe that even the non-canonical version would rank for branded keywords in Google, but what would Bing do?
I'd be very cautious about relying on the cross-domain canonical in Bing until I see some PUBLIC announcement that it's supported.
-
I was a bit confused when I read that. You put my mind at rest!
-
My apologies Atul. I am not sure what I was thinking when I wrote that. Please disregard.
-
Thanks Ryan!
So it will be a canonical tag.
-
I would advise NOT using the robots.txt file if at all possible. In general, the robots.txt file is a means of absolute last resort. The main reason I use the robots.txt file is when I am working with a CMS or shopping cart that does not have the SEO flexibility to noindex pages. Otherwise, the best robots.txt file is a blank one.
When you block a page in robots.txt, you are not only preventing the content from being crawled and indexed, you are blocking the natural flow of PageRank throughout your site. The link juice which flows to the blocked page dies there, because crawlers cannot access the page to follow its links.
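If the CMS allows it, the usual alternative is a meta robots tag. A rough sketch (exact placement depends on your templates):
```
<!-- In the <head> of each page to keep out of the index. Unlike a
     robots.txt Disallow, crawlers can still fetch this page and follow
     its links, so link juice is not trapped here. -->
<meta name="robots" content="noindex, follow" />
```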
-
That is correct. If you choose to read the information directly from Google, it can be found here:
-
Thanks!
It's for a site in the Netherlands, and Google has about 98% of the market. Bing is coming up, though, so it's a thing to check.
No-roboting is an option I hadn't thought about! Thanks for that. I will check with the client.
-
Thanks Ryan!
So the tag will be something like the sketch below:
On the other sites I will use the canonical to point everything to Site A.
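Roughly like this on each duplicate job page, with the href pointing at the matching listing on Site A (these URLs are made-up placeholders):
```
<!-- In the <head> of the duplicate job listing on the other sites -->
<link rel="canonical" href="http://www.site-a.example/jobs/account-manager-amsterdam" />
```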
-
You mean rel=author on Site A? How does it help? Where should rel=author point to?
-
According to Dr. Pete, Bing does support the cross-domain canonical.
If you disagreed, I would first recommend using rel=author to establish that "Site A" is the source of the article.
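Authorship is commonly marked up with a rel=author link from the article to an author page; a rough sketch with made-up names and URLs:
```
<!-- At the bottom of each job article on Site A, linking to its author page -->
Written by <a href="http://www.site-a.example/about/jan-jansen" rel="author">Jan Jansen</a>
```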
-
A cross-domain canonical will help with Google (make sure the pages truly are duplicates or very close). However, I haven't found any confirmation yet that Bing supports the cross-domain canonical.
If the other sites don't need to rank at all, you could also consider no-roboting the job pages on the other sites, so that only Site A's job listings get indexed.
-
Yes. A cross-domain canonical would solve the duplicate content issue and concentrate ranking power on the main site.
Related Questions
-
When domain A buys domain B (whose links redirect to C), does domain A have links redirecting to domain C?
Hi, I really need to know what happens when a company or domain (A) acquires another company with domain (B) whose links point to yet another location (C). Does company A then have redirects to C?
Intermediate & Advanced SEO | Yeshourun
-
"Duplicate without user-selected canonical" excluded
We have PDF files uploaded in the WordPress media library and used on our website. As these PDFs are duplicate content of the original publishers, we have marked links to these PDF URLs as nofollow. These pages are also disallowed in robots.txt.
Now, Google Search Console has shown these pages excluded as "Duplicate without user-selected canonical". As it turns out, we cannot use a canonical tag with PDF pages to point to the original PDF source.
If we embed a PDF viewer in our website and fetch the PDFs by passing the URLs of the original publisher, would the PDFs still be read as text by Google and again create a duplicate content issue?
Another thing: when a PDF expires and is removed, it would lead to a 404 error. If we direct our users to the third-party website, it would add to our bounce rate.
What would be the appropriate way to handle duplicate PDFs? Thanks
Intermediate & Advanced SEO | dailynaukri
-
Can cross-domain canonicals help with international SEO when using ccTLDs?
Hello. My question is: **Can cross-domain canonicals help with international SEO when using ccTLDs and a gTLD, when the gTLD is much more authoritative to begin with?** I appreciate this is a very nuanced subject, so below is a detailed explanation of my current approach, my problem, and the proposed solutions I am considering testing. Thanks for taking the time to read this far!

The current setup

Multiple ccTLDs such as mysite.com (US), mysite.fr (FR), and mysite.de (DE). Each TLD can have multiple languages; indeed, each site has content in English as well as the native language, so mysite.fr (defaults to French) and mysite.fr/en-fr are the same page, the latter in English. Mysite.com is an older and more established domain with existing organic traffic.

Each language variant of each domain has a sitemap that is individually submitted to Google Search Console and is linked from the <head> of each page. So: mysite.fr/a-propos (about us) links to mysite.com/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in French. Each of these URL blocks contains hreflang info for that content on every ccTLD in every language (en-us, en-fr, de-de, en-de, etc.). Likewise, mysite.fr/en-fr/about-us links to mysite.com/en-fr/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in English, again with hreflang info for every ccTLD in every language. There is more English content on the site as a whole, so the English version of the sitemap is always bigger at the moment.

Every page on every site has two lists of links in the footer. The first is a list of links to every other ccTLD available, so a user can easily switch between, say, the French site and the German site. Where possible this links directly to the corresponding piece of content on the alternative ccTLD; where that isn't possible it just links to the homepage. The second list is essentially just links to the same piece of content in the other languages available on that domain. Mysite.com has its international targeting in Google Search Console set to the US.

The problems

The biggest problem is that we didn't properly consider how we would need to start from scratch with each new ccTLD, so although each domain has a reasonable amount of content, they only receive a tiny proportion of the traffic that mysite.com achieves. Presumably this is because of a standing start with regards to domain authority.

The second problem is that, despite hreflang, mysite.com still outranks the other ccTLDs for brand-name keywords. I guess this is understandable given the mismatch in DA. This is based on looking at search results via the Google AdWords Ad Preview tool and changing language, location, and domain.

Solutions

The first solution is probably the most obvious: move all the ccTLDs into a subfolder structure on mysite.com and 301 all the old ccTLD links. This isn't really an ideal solution for a number of reasons, so I'm trying to explore some alternative routes that might help the situation.

The first thing that came to mind was cross-domain canonicals. Essentially this would mean creating locale-specific subfolders on mysite.com and duplicating the ccTLD sites in there, but using a cross-domain canonical to tell Google to index the ccTLD URL instead of the locale-subfolder URL. For example:

mysite.com/fr-fr has a canonical of mysite.fr
mysite.com/fr-fr/a-propos has a canonical of mysite.fr/a-propos

Then I would change the links in the mysite.com footer so that they wouldn't point at the ccTLD URLs but at the subfolder URLs, so that Google would crawl the content on the stronger domain before indexing the ccTLD version of each URL. Is this worth exploring with a test, or am I mad for even considering it? The alternative that came to my mind was to do essentially the same thing but use a 301 to redirect from mysite.com/fr-fr to mysite.fr.

My question is whether either of these suggestions might be worth testing, or am I completely barking up the wrong tree and liable to do more harm than good?
Intermediate & Advanced SEO | danatello
-
Opinion on Duplicate Content Scenario
So there are two pest control companies owned by the same person: Sovereign and Southern. (The two companies serve different markets.) They have two different website URLs, but the website code is actually all the same; the code is hosted in one place and just uses an if/else structure in dynamic PHP to determine whether the user sees the Sovereign site or the Southern site, know what I am saying? Here are the two sites: www.sovereignpestcontrol.com and www.southernpestcontrol.com. This is a duplicate content SEO nightmare, right?
Intermediate & Advanced SEO | MeridianGroup
-
Are all duplicate content issues bad? (Blog article tags)
If so, how bad? We use tags on our blog and this causes duplicate content issues. We don't use WordPress, but with such a highly used CMS having the same issue, it seems quite plausible that Google would be smart enough to deal with duplicate content caused by blog article tags and not penalise at all. It has been discussed here, and I'm ready to remove tags from our blog articles or monitor them closely to see how it affects our rankings. Before I do, can you give me some advice around this? Thanks,
Daniel
Intermediate & Advanced SEO | Daniel_B
-
How to Fix Duplicate Page Content?
Our latest SEOmoz crawl reports 1138 instances of "duplicate page content." I have long been aware that our duplicate page content is likely a major reason Google has devalued our web store. Our duplicate page content is the result of the following:
1. We sell audio books and use the publisher's description (narrative) of the title. Google is likely recognizing the publisher as the owner/author of the description and our copy as duplicate content.
2. Many audio book titles are published in more than one format (abridged, unabridged CD, and/or unabridged MP3) by the same publisher, so the basic description at our web store would be the same for each format = more duplicate content at our web store.
Here are two examples (one abridged, one unabridged) of one title at our web store:
Kill Shot - abridged
Kill Shot - unabridged
How much would the body content of one of the above pages have to change so that an SEOmoz crawl does NOT say the content is duplicate?
Intermediate & Advanced SEO | lbohen
-
Is an RSS feed considered duplicate content?
I have a large client with satellite sites. The large site produces many news articles, and they want to put an RSS feed on the satellite sites that will display the articles from the large site. My question is: will the RSS feeds on the satellite sites be considered duplicate content? If yes, do you have a suggestion for utilizing the data from the large site without being penalized? If no, do you have suggestions on what tags should be used on the satellite pages? EX: wrapped in tags? THANKS for the help. Darlene
Intermediate & Advanced SEO | gXeSEO
-
Include Cross-Domain Canonical URLs in Sitemap - Yes or No?
I have several sites that have cross-domain canonical tags set up on similar pages. I am unsure whether these pages that are canonicalized to a different domain should be included in the sitemap.
My first thought is no, because I should only include pages in the sitemap that I want indexed. On the other hand, if I include ALL pages on my site in the sitemap, once Google gets to a page that has a cross-domain canonical tag, I'm assuming it will just note that and determine whether the canonicalized page is the better version.
I have yet to see any errors in GWT about this. I have seen errors where I included a 301 redirect in my sitemap file. I suspect it's OK, but it seems to me that Google would rather not find these URLs in a sitemap and have to crawl them time and time again to determine if they are the best page, even though I'm indicating that each has a similar page that I'd rather have indexed.
Intermediate & Advanced SEO | WEB-IRS