What's the best way to manage content that is shared on two sites and keep both sites in search results?
-
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
-
Does a duplicate content penalty impact specific pages or entire sites? If I wanted to test using the cross-domain canonical on a certain section of my site, would the impact be visible? Or would I need to put cross-domain canonicals on everything appearing on both sites in order to see the results?
-
Changing the articles or even page titles is not an option.
That's too bad. What Irving suggested has the potential for HUGE wins.
I'd find a way if that was my site.
-
Sure, that's a solution, but then rankings for the additional dupe site go away, because you've basically told Google "this URL on this site should not rank; it's a copy of the article on that site, so give that site the credit, not me."
I believe Jon has not been hit yet and wants both sites to rank, but is unable to make the content on either site unique. Any additional code you can insert between the articles to create less similarity between the two pages should lessen the chance of getting hit, though it's not a guarantee.
-
Irving, I had a client who had been hit with a manual penalty for Doorway Pages. They weren't Doorway Pages, they were just pages on various domains (that he owned) with a lot of duplicate content on them. We got him reinstated when we implemented cross-domain canonicals and filed a re-inclusion request. Sounds similar to this case?
Just wondering if anyone had heard of sites being hit like that for dupe content?
-
LOL true.
With all due respect, 301s, noindex, or cross-domain canonicals are as much of a solution as saying "delete your second site." My suggestion of breaking up the content or appending additional content may help you avoid triggering a dupe content filter.
Duplicate content is not a penalty, it's a filter, so the worst that happens is that the main site bringing you the majority of your traffic gets filtered and loses rankings to the secondary site.
A good question to ask at this point: can you clarify your first sentence, "I manage two sites that share some content"? What does "some" mean? Are they main conversion pages or secondary blog posts, and what percentage of the site is dupe content?
BTW, hope you're not interlinking your two sites; keep them as separate as possible.
-
Try this post for more info:
http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html
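That post covers the cross-domain rel=canonical mechanism. As a quick sketch (the domains here are placeholders, not the actual sites in question), it's a single link element in the `<head>` of the duplicate page:

```html
<!-- In the <head> of the duplicate at https://www.site-b.example/article -->
<!-- Tells search engines the preferred copy lives on site A -->
<link rel="canonical" href="https://www.site-a.example/article" />
```

Note this means the page carrying the tag is asking to be consolidated into the target URL, which is exactly the trade-off being debated in this thread.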
-
Sounds like you don't need to manage the threat of duplicate content; you are producing the duplicate content yourself. What you want instead is to minimize the effect the duplicate content on one site has on the other. The only way I know of to eliminate the risk of duplicate content penalties is to noindex, 301 redirect, or provide canonical URLs.
Since you want both sites to continue being indexed, you can either keep doing what you're doing (and hope you don't get hit) or use canonical URLs and pick which site is best for each page.
Hope this helps.
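For completeness, the noindex option mentioned above is just a meta robots tag on the copy you'd be willing to drop from the index (a sketch, since no one here is actually recommending it for your situation):

```html
<!-- In the <head> of the page you do NOT want indexed -->
<!-- "follow" still lets crawlers follow the links on the page -->
<meta name="robots" content="noindex, follow" />
```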
-
If I used the cross-domain canonical, would that mean that one site would stop appearing in search results?
-
You can append additional content to the bottom of the page on the more important site, or break up the article by adding content and/or ads between the paragraphs (which will probably result in article fragmentation), but if you're not a news source that's not a big deal.
-
I'm no technical expert but it sounds like you're playing with fire. I've seen more than one site penalised for exactly this. If it looks like you're trying to rank the same piece of content twice, at least one of the URLs is at risk of filtering or a penalty. Isn't this exactly what the cross-domain canonical was created for?
-
Changing the articles or even page titles is not an option.
-
Paraphrase the articles on your secondary site's highest-traffic pages and/or tweak the keyword targets.