Cross-domain duplicate content...
-
Does anyone have any experience with this situation?
We have 2 ecommerce websites that carry 90% of the same products, with mostly duplicate product descriptions across domains. We will be running some tests shortly.
Question 1:
If we deindex a group of product pages on Site A, should we see an increase in ranking for the same products on Site B? I know nothing is certain, just curious to hear your input.
The same two domains have different niche authorities: one is healthcare products, the other is general merchandise. We can see this because different products rank higher on one domain or the other. Both sites have the same Moz Domain Authority (42, go figure). We are strongly considering cross-domain canonicals.
Question 2:
Does niche authority transfer with a cross-domain canonical? In other words, for a particular product, will it rank the same on both domains regardless of which direction we canonical? Ex:
Site A: Healthcare Products; Site B: General Merchandise. I have a health product that ranks #15 on Site A and #30 on Site B. If I use rel=canonical for this product on Site B pointing at the same product on Site A, will the ranking be the same as if I used rel=canonical from Site A to Site B? Again, a best guess is fine.
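To make the two directions concrete, here is a sketch with made-up URLs (assume the same product lives at both addresses):

<!-- Direction 1: tag placed in the <head> of Site B's page, pointing at Site A -->
<link rel="canonical" href="https://site-a.example.com/products/health-product" />

<!-- Direction 2: tag placed in the <head> of Site A's page, pointing at Site B -->
<link rel="canonical" href="https://site-b.example.com/products/health-product" />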
Question 3:
These domains have similar category page structures, URLs, etc., but feature different products for a particular category. Since the pages are different, will cross-domain canonicals be honored by Google?
-
If the alternative is just de-indexing those duplicate pages on one website, then I'd definitely recommend the cross-domain canonicals, yes.
-
Brady:
Thanks for your advice. We are de-indexing as a test to see if our rankings are somehow being constrained because of duplicate content. Our rankings are not reacting as they should to our link building efforts, and I believe that duplicate content is the issue. With this test, I am trying to understand how Google sees and connects these 2 sites. If Google does connect the sites, then we must canonical since Google won't let us have 2 slots on page 1 for the same KW. If Google doesn't connect the sites, then we can theoretically get 2 listings on page 1 if our content is unique.
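For anyone reading along: de-indexing a page generally means serving a robots noindex directive, either as a meta tag in the page's <head> or as the equivalent X-Robots-Tag HTTP header. A minimal sketch:

<!-- In the <head> of each product page being pulled from the index -->
<meta name="robots" content="noindex" />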
My hypothesis is that our massive duplicate content is having a negative impact on rankings. Google might be hitting us with a minor Panda slap, or the competing content is somehow hurting us algorithmically. If we deindex a competing page and the algorithm is what's hurting us, the remaining page should get a bump up.
I am pretty certain that canonicals will have a positive impact on our rankings. The question I am testing for is "do we have to canonical?" If we don't, then we have a decision to make: do we try to rank both sites for a KW, or canonical and focus on one?
-
First of all, these are great questions.
My first question would be: are the sites hosted on the same server or on near-identical IP addresses? If they are, and given that much of the content is duplicated, chances are Google and other search engines already understand these websites are somehow related. This is just something to consider...
Answer #1: If you de-index a group of products on one website, chances are, yes, the other site would see some improvement simply because there is one less competitor. But I would do some extensive competitive research first to see how the other sites are ranking next to your two sites.
Ultimately, I would side with a cross-domain canonical over de-indexing, since that way you're passing some value from one site to the other. I would do this on a product-by-product basis, however, making sure the product you keep indexed matches the site's overall niche theme, and not vice versa.
Answer #2: My second paragraph partly addresses your second question. Think from a semantic and topical-understanding perspective here: if it's a healthcare product, make sure the site with the healthcare niche is the one that stays indexed, not the general merchandise website. Even simply from a branding and general marketing perspective, that makes more sense, IMO.
Answer #3: It sounds like, if there are duplicate descriptions (and, I'm guessing, duplicate images, headers, and other content pieces), the canonicals would likely be honored. Even across domains, duplicate content can be a concern (especially if the sites are hosted together). Remember, though, that canonical tags are merely a suggestion, so it could take some time for Google to honor them, but from the information you've given, I think that's your best shot.
Another thing to take into consideration when using canonical tags: make sure you place the canonical tag on the page/website that's performing worse of the two. There may be exceptions based on the niche and the semantics, but in general, you don't want to hurt your own performance by pointing your stronger page at the less authoritative one.
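To illustrate with made-up URLs: if the general merchandise site's page for a given health product is the weaker of the two, that page is the one that carries the tag, pointing at the healthcare site's version:

<!-- In the <head> of the weaker, general merchandise product page -->
<link rel="canonical" href="https://healthcare-site.example.com/products/some-health-product" />

The stronger page gets no cross-domain tag (a self-referencing canonical is fine), so signals consolidate toward it rather than away from it.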
Good luck! Hope my advice helps.