Cross-domain duplicate content...
-
Does anyone have any experience with this situation?
We have 2 ecommerce websites that carry 90% of the same products, with mostly duplicate product descriptions across domains. We will be running some tests shortly.
Question 1:
If we deindex a group of product pages on Site A, should we see an increase in ranking for the same products on Site B? I know nothing is certain, just curious to hear your input.
The same 2 domains have different niche authorities. One is healthcare products, the other is general merchandise. We've seen evidence of this because different products rank higher on one domain or the other. Both sites have the same Moz Domain Authority (42, go figure). We are strongly considering cross-domain canonicals.
Question 2
Does niche authority transfer with a cross domain canonical? In other words, for a particular product, will it rank the same on both domains regardless of which direction we canonical? Ex:
Site A: Healthcare Products, Site B: General Merchandise. I have a health product that ranks #15 on Site A and #30 on Site B. If I use rel=canonical for this product on Site B pointing at the same product on Site A, will the ranking be the same as if I used rel=canonical from Site A to Site B? Again, best guess is fine.
Question 3:
These domains have similar category page structures, URLs, etc, but feature different products for a particular category. Since the pages are different, will cross domain canonicals be honored by Google?
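For reference, a cross-domain canonical is just an ordinary rel=canonical link element in the head of the duplicate page, pointing at the full URL on the other domain. A minimal sketch, with hypothetical domains and product URLs standing in for the two sites:

```html
<!-- On Site B's duplicate product page, telling Google that
     Site A's version is the one to index and rank.
     (site-a-example.com and the product path are made up.) -->
<head>
  <link rel="canonical" href="https://www.site-a-example.com/products/health-widget" />
</head>
```

The tag goes on the page you want consolidated away; the href points at the version you want kept in the index.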
-
If the alternative is just de-indexing those duplicate pages on one website, then I'd definitely recommend the cross-domain canonicals, yes.
-
Brady:
Thanks for your advice. We are de-indexing as a test to see if our rankings are somehow being constrained because of duplicate content. Our rankings are not reacting as they should to our link building efforts, and I believe that duplicate content is the issue. With this test, I am trying to understand how Google sees and connects these 2 sites. If Google does connect the sites, then we must canonical since Google won't let us have 2 slots on page 1 for the same KW. If Google doesn't connect the sites, then we can theoretically get 2 listings on page 1 if our content is unique.
My hypothesis is that our massive duplicate content is having a negative impact on rankings. Google might be hitting us with a minor Panda slap, or the competing content is somehow hurting us algorithmically. If we are being hurt by the algorithm, then deindexing one of the competing pages should give the remaining page a bump.
I am pretty certain that canonicals will have a positive impact on our rankings. The question I am testing is "do we have to canonical?" If we don't, then we have a decision to make: do we try to rank both sites for a keyword, or canonicalize and focus on one?
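For what it's worth, the usual mechanism for a de-indexing test like this is a robots noindex meta tag on the duplicate pages, which keeps the page live for users while asking Google to drop it from the index. A minimal sketch:

```html
<!-- On each product page being removed from the index.
     "follow" asks Google to keep following the page's links
     even though the page itself should not be indexed. -->
<head>
  <meta name="robots" content="noindex, follow" />
</head>
```

Note that the page must remain crawlable (not blocked in robots.txt), or Google will never see the tag and the page can linger in the index.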
-
First of all, these are great questions.
My first question would be: are the sites hosted on the same server or on near-identical IP addresses? If they are, and given that much of the content is duplicated, chances are Google and other search engines already understand these websites are somehow related. This is just something to consider...
Answer #1: If you de-index a group of products on one website, chances are, yes, the other site would see some improvement just from there being one less competitor. But I would do some extensive competitive research first to see how the other sites are ranking next to your two sites.
Ultimately, I would side with a cross-domain canonical over de-indexing; that way you're passing some value from one site to the other. I would do this on a product-by-product basis, however, making sure the product niche you keep indexed matches the site's overall niche theme and not vice versa.
Answer #2: My second paragraph sort of addresses your second question. Think from a semantic and topical-understanding perspective here: if it's a healthcare product, make sure the site with the healthcare niche is the one being indexed, not the general merchandise website. Even simply from a branding and general marketing perspective, that makes more sense, IMO.
Answer #3: It sounds like, with duplicate descriptions (and, I'm guessing, duplicate images, headers, and other content pieces), the canonicals would likely be honored. Even across domains, duplicate content can be a concern (especially if the sites are hosted together). Remember, though, that canonical tags are just a suggestion, so it could take some time for Google to honor them, but from the information you've given, I think that's your best shot.
Another thing to take into consideration when using canonical tags: make sure you're placing the canonical tag on the page that's performing worse of the two, pointing to the stronger page. There may be exceptions based on the niche and from a semantic perspective, but in general, you don't want to hurt your own performance by canonicalizing to the less authoritative page.
Good luck! Hope my advice helps.