Cross Domain duplicate content...
-
Does anyone have any experience with this situation?
We have 2 ecommerce websites that carry 90% of the same products, with mostly duplicate product descriptions across domains. We will be running some tests shortly.
Question 1:
If we deindex a group of product pages on Site A, should we see an increase in ranking for the same products on Site B? I know nothing is certain, just curious to hear your input.
The same 2 domains have different niche authorities: one is healthcare products, the other is general merchandise. We can see this because different products rank higher on one domain or the other. Both sites have the same Moz Domain Authority (42, go figure). We are strongly considering cross-domain canonicals.
Question 2
Does niche authority transfer with a cross-domain canonical? In other words, for a particular product, will it rank the same regardless of which direction we canonical? Ex:
Site A: Healthcare Products, Site B: General Merchandise. I have a health product that ranks #15 on Site A and #30 on Site B. If I use rel=canonical for this product on Site B pointing at the same product on Site A, will the ranking be the same as if I use rel=canonical from Site A to Site B? Again, a best guess is fine.
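For reference, a cross-domain canonical is just an ordinary rel=canonical link element whose href points at a page on the other domain. A minimal sketch of the two directions being weighed, using placeholder URLs (sitea.example and siteb.example stand in for the real domains):

```html
<!-- Direction 1: in the <head> of the product page on Site B (general merchandise), -->
<!-- pointing search engines at the same product on Site A (healthcare): -->
<link rel="canonical" href="https://www.sitea.example/products/health-widget" />

<!-- Direction 2 is the mirror image, placed on Site A's product page instead: -->
<link rel="canonical" href="https://www.siteb.example/products/health-widget" />
```

Only one direction should exist for a given product pair; pointing both pages at each other gives Google conflicting signals.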
Question 3:
These domains have similar category page structures, URLs, etc, but feature different products for a particular category. Since the pages are different, will cross domain canonicals be honored by Google?
-
If the alternative is just de-indexing those duplicate pages on one website, then I'd definitely recommend the cross-domain canonicals, yes.
-
Brady:
Thanks for your advice. We are de-indexing as a test to see if our rankings are somehow being constrained because of duplicate content. Our rankings are not reacting as they should to our link building efforts, and I believe that duplicate content is the issue. With this test, I am trying to understand how Google sees and connects these 2 sites. If Google does connect the sites, then we must canonical since Google won't let us have 2 slots on page 1 for the same KW. If Google doesn't connect the sites, then we can theoretically get 2 listings on page 1 if our content is unique.
My hypothesis is that our massive duplicate content is having a negative impact on rankings. Google might be hitting us with a minor Panda slap, or the competing content is somehow hurting us algorithmically. If we deindex a competing page and the algorithm is what's hurting us, the remaining page should get a bump.
I am pretty certain that canonicals will have a positive impact on our rankings. The question I am testing for is "do we have to canonical"? If we don't, then we have a decision to make - do we try to rank both sites for a KW, or canonical and focus on 1.
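For anyone following along, de-indexing a group of pages is typically done with a robots noindex meta tag (or an equivalent X-Robots-Tag header), not robots.txt: a robots.txt block stops crawling, so Google may never see the noindex. A minimal sketch, assuming the meta tag approach:

```html
<!-- In the <head> of each product page being removed from the index. -->
<!-- The pages must remain crawlable so Googlebot can actually see this tag. -->
<meta name="robots" content="noindex, follow" />
```

The "follow" keeps link equity flowing through the page even after it drops out of the index.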
-
First of all, these are great questions.
My first question would be: are the sites hosted on the same server or on nearly the same IP address? If they are, and given that much of the content is duplicate, chances are Google/search engines already understand these websites are somehow related. This is just something to consider...
Answer #1: If you de-index a group of products on one website, then yes, chances are the other site would see some improvement just from there being one less competitor. But I would do some extensive competitive research first to see how the other sites are ranking next to your two.
Ultimately, I would side with a cross-domain canonical over de-indexing; that way you're passing some value from one site to the other. I would do this on a product-by-product basis, however, making sure the product niche you keep indexed matches the site's overall niche theme, and not vice versa.
Answer #2: My second paragraph sort of addresses your second question. Think from a semantic and topical understanding perspective here: if it's a healthcare product, make sure the site with the healthcare niche is the one being indexed, not the general merchandise website. Even simply from a branding and general marketing perspective, that makes more sense, IMO.
Answer #3: It sounds like, given the duplicate descriptions (and, I'm guessing, images, headers, and other content pieces), the canonicals would likely be honored. Even across domains, duplicate content can be a concern (especially if the sites are hosted together). Remember, though, that canonical tags are merely a suggestion, so it could take some time for Google to honor them, but from the information you've given, I think that's your best shot.
Another thing to take into consideration when using canonical tags: make sure you're placing the tag on the page/website that's the worse performer of the two. There may be exceptions based on the niche and from a semantic perspective, but in general, you don't want to hurt your own performance by pointing the canonical at the less authoritative page.
Good luck! Hope my advice helps.
Related Questions
-
SEO for video content that is duplicated across a larger network
I have a website with lots of content (high-quality video clips for a particular niche). All the content gets fed out to 100+ other sites on various domains/subdomains which are reskinned for a given city, so the content on these other sites is 100% duplicate. I still want to generate SEO traffic, though. So my thought is that we: a) need to have canonical tags on all the other domains/subdomains that point back to the original post on the main site, and b) probably need to disallow search engine crawlers on all the other domains/subdomains. Is this on the right track? Am I missing anything important related to duplicate content? The idea is that after we get search engines crawling the content correctly, from there we'd use the IP address to redirect the visitor to the best-suited domain/subdomain. Any thoughts on that approach? Thanks for your help!
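One caveat on combining (a) and (b) above: if robots.txt disallows crawling on the reskinned domains, search engines can never fetch those pages to see the canonical tag, so the two measures work against each other. A minimal sketch of the canonical-only approach, using placeholder hostnames (city1.example.com and www.example.com stand in for the real network):

```html
<!-- In the <head> of the duplicate clip page on each city site, -->
<!-- e.g. https://city1.example.com/video/123, pointing back to the original: -->
<link rel="canonical" href="https://www.example.com/video/123" />
```

With the canonical in place, the city pages should stay crawlable; the geo-redirect for human visitors can then be handled separately without blocking bots.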
-
301ing Pages & Moving Content To Many Other Domains
Recently started working with a large site that, for reasons way beyond organic search, wants to forward internal pages to a variety of external sites. Some of these external sites that would receive the content from the old site are owned, admin'd and/or hosted by the old site; most are not. All of the sites receiving content would be a better topic fit for that content than the original site. The process is not all at once, but gradual over time. No internal links on the old site to the old page or the new site/URL would exist after the content move and 301ing. The forwarding is mostly to help Google realize the host site of this content is not hosting duplicate content, but is the one true copy. Also, to pick up external links to the old pages for the new host site. It's a little like a domain name change, but not really, since the old site will continue to exist and the new sites are a variety of new/previously existing sites that may or may not share ownership/admin etc. In most cases, we won't be able to change any external link pointing to the original site and will just be 301ing the old URL to the content's new home on another site. Since this is pretty unusual (like I wouldn't get up in the morning and choose to do this for the heck of it), here are my three questions: 1) Is there any organic search risk to the old site or the sites receiving the old content/301 in this maneuver? 2) Will the new sites pick up the link equity benefit on pages that had third-party/followed links continuing to point to the old site but resolving via the 301 to this totally different domain? 3) Any other considerations? Thanks! Best... Mike
-
Do search engines consider this duplicate or thin content?
I operate an eCommerce site selling various equipment. We get product descriptions and various info from the manufacturers' websites offered to the dealers. Part of that info is in the form of User Guides and Operational Manuals written by the manufacturer, downloaded in PDF format and then uploaded to our site. We also embed and link to videos that are hosted on the manufacturers' respective YouTube or Vimeo channels. This is useful content for our customers. My questions are: Does this type of content help our site by offering useful info, or does it hurt our SEO due to it being thin and/or duplicate content? Or do the original content publishers get all the benefit? Is there any benefit to us publishing this stuff? What exactly is considered "thin content"?
-
Pages with Duplicate Page Content (with and without www)
How can we resolve pages with duplicate page content? With and without www?
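The usual fix for www/non-www duplication is a site-wide 301 redirect to a single canonical hostname. A minimal sketch for Apache, assuming mod_rewrite is enabled and using example.com as a placeholder domain:

```apache
# .htaccess: permanently redirect non-www requests to the www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Setting the preferred domain in Google Search Console and using self-referencing canonicals on the surviving hostname reinforce the same signal.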
Thanks in advance.
-
Cross Domain Rel Canonical for Affiliates?
Hi, We use the cross-domain rel canonical for duplicate content between our own websites, but what about affiliate sites who want our XML feed (descriptions of our products)? We don't mind being credited, but would this present a danger for us? Who controls the use of that cross-domain rel canonical, us in our feed or them? Is there another way around it?
-
Duplicate content clarity required
Hi, I have access to a massive resource of journals that we have been given the all-clear to use the abstracts from on our site, linking back to the journal. These will be really useful links for our visitors. E.g. http://www.springerlink.com/content/59210832213382K2 Simply put: if we copy the abstract and then link back to the journal source, will this be treated as duplicate content and damage the site, or is the link to the source enough for search engines to realise that we aren't trying anything untoward? Would it help if we added an introduction, so in effect we are sort of following the curated-content model? We are thinking of linking back internally to a relevant page using a keyword too. Will this approach give any benefit to our site at all, or will the content be ignored due to it being duplicate, and thus render the internal links useless? Thanks Jason
-
Duplicate Content on Press Release?
Hi, We recently held a charity night in store and had a few local celebs turn up. We created a press release to send out to various media outlets; within the press release were hyperlinks to our site and links on certain keywords to specific brands on our site. My question is: should we be sending a different press release to each outlet to avoid the duplicate content issue, or is sending the same release out to everyone OK? We will be sending approx 20 of these out, some going online and some not. So far we've had one local paper website, a massive football website, and a local magazine site, all with pretty much the same content and a few pics. Any help, hints, or tips on how to go about this if I am going to be sending out to a load of other sites/blogs? Cheers
-
Pop Up Pages Being Indexed, Seen As Duplicate Content
I offer users the opportunity to email and embed images from my website. (See this page http://www.andertoons.com/cartoon/6246/ and look under the large image for "Email to a Friend" and "Get Embed HTML" links.) But I'm seeing the ensuing pop-up pages (Ex: http://www.andertoons.com/embed/5231/?KeepThis=true&TB_iframe=true&height=370&width=700&modal=true and http://www.andertoons.com/email/6246/?KeepThis=true&TB_iframe=true&height=432&width=700&modal=true) showing up in Google. Even worse, I think they're seen as duplicate content. How should I deal with this?