Cross-Domain Canonical and duplicate content
-
Hi Mozfans!
I'm working on SEO for one of my new clients, a job site (I'll call it Site A).
The thing is that the client has about three sites with the same jobs on them. I'm facing a duplicate content problem, but the jobs on the other sites must stay there, so the client doesn't want to remove them. There is another (non-ranking) reason for that.
Can I solve the duplicate content problem with a cross-domain canonical?
The client wants to rank well with the site I'm working on (Site A). Thanks!
Rand did a Whiteboard Friday about the cross-domain canonical:
http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
-
Every document I have seen agrees that canonical tags are followed when the tag is used appropriately.
The tag could be misused either intentionally or unintentionally in which case it would not be honored. The tag is meant to connect pages which offer identical information, very similar information, or the same information presented in a different format such as a modified sort order, or a print version. I have never seen nor even heard of an instance where a properly used canonical tag was not respected by Google or Bing.
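To make that concrete, here is a hedged sketch (hypothetical domains and path, not from this thread) of a properly used cross-domain canonical: the duplicate job listing on another of the client's sites points search engines at the preferred version on Site A.

```html
<!-- In the <head> of the duplicate job page on www.site-b.example -->
<!-- Tells Google/Bing that the preferred version of this content lives on Site A -->
<link rel="canonical" href="http://www.site-a.example/jobs/senior-developer-amsterdam" />
```

The tag goes on the duplicate, not on Site A itself; Site A's pages would carry either no canonical or a self-referencing one.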
-
Thanks Ryan, I hadn't noticed that about the reply sequencing, and you're right, I read them in the wrong order. It makes much more sense now.
By "some" support, I meant that even Google, via Matt Cutts, says they don't take the cross-domain canonical as "a directive" but rather as "a hint" (and even that assumes Google agrees with you that your pages are duplicates).
So the magic question is: how much authority do Bing and Google give to rel="canonical", and is it similar between the two engines?
-
One aspect of the SEOmoz Q&A structure I dislike is the ordering of responses. Rather than maintaining a timeline order, the responses are re-ordered based on other factors such as "thumbs-up" votes and staff endorsements. I understand the concept that replies which are liked more are probably more helpful and should be seen first, but it causes confusion such as in this case.
Dr. Pete's response on the Bing cross-canonical topic appears first, but it was offered second-to-last chronologically speaking. We originally agreed there was not evidence indicating Bing supported the cross-canonical tag, then he located such evidence and therefore we agree Bing does support the tag.
The statement Dr. Pete shared was that "Bing does support cross-domain canonical". There was no limiting factor. I mention this because you said they offered "some" support and I am not sure why you used that qualifier.
-
Ryan, at the end of the thread you linked to, it seems like both Dr. Pete and yourself agreed that there wasn't much evidence of Bing support. Have you learned something that changed your mind?
I know a rep from Bing told Dr. Pete there was "some" support, but what does that mean? i.e., do exactly identical sites pass a little juice/authority, or do similar sites pass **a lot** of juice/authority?
Take a product that has different brands in different parts of the country, Hellmann's and Best Foods for example. They have two sites which are the same except for logos. Here is a recipe from each site.
http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1
http://www.hellmanns.com/recipe_detail.aspx?RecipeID=12497&version=1
The sites are nearly identical except for logos/product names.
For the (very) long tail keyword "Mayonnaise Bobby Flay Waldorf salad wrap" Best Foods ranks #5 and Hellmann's ranks #11.
I doubt they have an SEO looking very closely at the sites, because in addition to their duplicate content problem, neither page has a meta description.
If the Hellmann's page had a canonical tag pointing to the Best Foods version, such as:
<link rel="canonical" href="http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1" />
I'd expect to see the Best Foods page move up and the Hellmann's page move down in Google. Bing, however, appears not to like the duplicate pages as much: currently the Best Foods version ranks #12 and the Hellmann's version doesn't rank at all. My own (imperfect) tests lead me to believe that adding the rel="canonical" would help in Google but not in Bing.
Obviously, the site owner would probably like one of those two pages to rank very high for the unbranded keyword, but they would want both pages to rank well if I added a branded term. My experience with cross-domain canonical in Google leads me to believe that even the non-canonical version would rank for branded keywords in Google, but what would Bing do?
I'd be very cautious about relying on the cross-domain canonical in Bing until I see some PUBLIC announcement that it's supported.
-
I was a bit confused when I read that. You put my mind at rest!
-
My apologies Atul. I am not sure what I was thinking when I wrote that. Please disregard.
-
Thanks Ryan!
So it will be a canonical tag.
-
I would advise NOT using the robots.txt file if at all possible. In general, the robots.txt file is a means of absolute last resort. The main reason I use the robots.txt file is because I am working with a CMS or shopping cart that does not have the SEO flexibility to noindex pages. Otherwise, the best robots.txt file is a blank one.
When you block a page in robots.txt, you are not only preventing content from being indexed, but you are blocking the natural flow of page rank throughout your site. The link juice which flows to the blocked page dies on the page as crawlers cannot access it.
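As an illustration (hypothetical /jobs/ path, not from this thread), this is the kind of robots.txt rule being advised against here:

```
# robots.txt on one of the duplicate job sites (hypothetical path)
# Blocks crawling of the job pages entirely. Any link juice flowing
# to them dies there, because crawlers can never reach the pages to
# follow their outgoing links.
User-agent: *
Disallow: /jobs/
```

A noindex meta tag on the pages themselves avoids that side effect, since crawlers can still follow the links on a noindexed page.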
-
That is correct. If you choose to read the information directly from Google it can be found here:
-
Thanks!
It's for a site in the Netherlands, where Google is about 98% of the market. Bing is coming up, though, so it's a thing to check.
No-roboting is a way to do it that I didn't think about! Thanks for that. I will check with the client.
-
Thanks Ryan!
So the link will be something like the canonical tag:
On the other sites I will use the canonical to point everything to Site A.
-
You mean rel=author on Site A? How does it help? Where should rel=author point to?
-
According to Dr. Pete, Bing does support the cross-domain canonical.
If you disagree, I would first recommend using rel=author to establish that "Site A" is the source of the article.
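As a hedged sketch of the authorship markup of that era (hypothetical author page and name; Google has since dropped authorship display), rel=author linked a piece of content to its author's profile, which could help signal the original source:

```html
<!-- On the job listing / article on Site A, as a visible link: -->
<a rel="author" href="http://www.site-a.example/about/jan-jansen">About the author</a>

<!-- or, alternatively, in the page's <head>: -->
<link rel="author" href="http://www.site-a.example/about/jan-jansen" />
```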
-
A cross-domain canonical will help with Google (make sure the pages truly are duplicates or very close); however, I haven't found any confirmation yet that Bing supports the cross-domain canonical.
If the other sites don't need to rank at all, you could also consider no-roboting the job pages on the other sites, so that only Site A's job listings get indexed.
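A minimal sketch of that "no-roboting" option (hypothetical pages): a robots meta tag on each duplicate job page on the other sites. Using "noindex, follow" keeps the pages out of the index while still letting crawlers follow their links, so link equity isn't trapped the way it is with a robots.txt block.

```html
<!-- In the <head> of each job page on the other sites -->
<meta name="robots" content="noindex, follow">
```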
-
Yes. A cross-domain canonical would solve the duplicate content issue and focus ranking signals on the main site.