What's the best way to manage content that is shared on two sites and keep both sites in search results?
-
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
-
Does a duplicate content penalty impact specific pages or entire sites? If I wanted to test using the cross-domain canonical on a certain section of my site, would the impact be visible? Or would I need to put cross-domain canonicals on everything appearing on both sites in order to see the results?
-
Changing the articles or even page titles is not an option.
That's too bad. What Irving suggested has the potential for HUGE wins.
I'd find a way if that was my site.
-
Sure, that is a solution, but then rankings for the additional dupe site would go away, because you'd basically be telling Google "this URL on this site should not rank, because it is a copy of the article on that other site, so give that site the credit, not me."
I believe that Jon has not been hit yet and wants both sites to rank, but is unable to change the content on either site to make it unique. Any additional content you can insert in between the articles to create less similarity between the two pages should help lessen the chance of getting hit, but it's not a guarantee.
-
Irving, I had a client who had been hit with a manual penalty for Doorway Pages. They weren't Doorway Pages, they were just pages on various domains (that he owned) with a lot of duplicate content on them. We got him reinstated when we implemented cross-domain canonicals and filed a re-inclusion request. Sounds similar to this case?
Just wondering if anyone had heard of sites being hit like that for dupe content?
-
LOL true.
With all due respect, a 301, a noindex, or a cross-domain canonical is as much of a solution as saying "delete your second site." My suggestion of breaking up the content or appending additional content may help you avoid triggering a duplicate content filter in the first place.
Duplicate content is not a penalty, it's a filter; so the worst that happens is that the main site that was bringing you the majority of your traffic gets filtered and loses its rankings to the secondary site.
I think a good question to ask at this point is for you to clarify your first sentence: "I manage two sites that share some content." Can you define what "some" means? Are they main conversion pages or secondary blog posts, and what percentage of the site is duplicate content?
BTW, I hope you're not interlinking your two sites; keep them as separate as possible.
-
Try this post for more info:
http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html
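For reference, the cross-domain canonical discussed in that post is just a `<link>` element in the `<head>` of the duplicate page. A minimal sketch, using made-up domains and a hypothetical article path:

```html
<!-- Hypothetical example: on site-b.example's copy of the article, point the
     canonical at the version on site-a.example that you want to rank.
     Google treats this as a strong hint, not a directive. -->
<link rel="canonical" href="https://site-a.example/articles/widget-guide" />
```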
-
Sounds like you don't need to manage the threat of duplicate content; you are producing the duplicate content yourself. What you actually want is to minimize the effect the duplicate content on one site has on the other. The only ways I know of to eliminate the risk of duplicate content penalties are to noindex, 301 redirect, or provide canonical URLs.
Since you want both sites to continue being indexed, you can either keep doing what you're doing (and hope you don't get hit) or use canonical URLs and pick which site is best for each page.
Hope this helps.
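For completeness, the noindex option mentioned above is also a one-line tag in the duplicate page's `<head>`. A sketch (note the trade-off in the comment):

```html
<!-- Sketch: the page stays accessible to visitors, but search engines are
     asked not to index it. Unlike rel="canonical", this removes the URL
     from search results entirely rather than consolidating signals. -->
<meta name="robots" content="noindex, follow" />
```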
-
If I used the cross-domain canonical, would that mean that one site would stop appearing in search results?
-
You can append additional content to the bottom of the page on the more important site, or break up the article by adding content and/or ads between the paragraphs (which will probably result in article fragmentation, but if you're not a news source that's not a big deal).
-
I'm no technical expert but it sounds like you're playing with fire. I've seen more than one site penalised for exactly this. If it looks like you're trying to rank the same piece of content twice, at least one of the URLs is at risk of filtering or a penalty. Isn't this exactly what the cross-domain canonical was created for?
-
Changing the articles or even page titles is not an option.
-
Paraphrase the articles on the highest-traffic pages of your secondary site, and/or tweak the keyword targets.