What's the best way to manage content that is shared on two sites and keep both sites in search results?
-
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
-
Does a duplicate content penalty impact specific pages or entire sites? If I wanted to test using the cross-domain canonical on a certain section of my site, would the impact be visible? Or would I need to put cross-domain canonicals on everything appearing on both sites in order to see the results?
-
Changing the articles or even page titles is not an option.
That's too bad. What Irving suggested has the potential for HUGE wins.
I'd find a way if that was my site.
-
Sure, that is a solution, but then rankings for the duplicate site go away, because you are essentially telling Google: "this URL on this site should not rank, because it is a copy of the article on that other site, so give that site the credit, not me."
I believe Jon has not been hit yet, wants both sites to rank, and is unable to make the content on either site unique. Any additional code you can insert between the articles to reduce the similarity between the two pages should lessen the chance of getting hit, though it's no guarantee.
-
Irving, I had a client who had been hit with a manual penalty for Doorway Pages. They weren't doorway pages; they were just pages on various domains (which he owned) with a lot of duplicate content on them. We got him reinstated by implementing cross-domain canonicals and filing a re-inclusion request. Does that sound similar to this case?
Just wondering if anyone had heard of sites being hit like that for dupe content?
-
LOL true.
With all due respect, a 301, noindex, or cross-domain canonical is as much of a solution as saying "delete your second site." My suggestion of breaking up the content or appending additional content may help you avoid triggering a duplicate content filter.
Duplicate content is not a penalty; it's a filter. The worst that happens is that the main site, which was bringing you the majority of your traffic, gets filtered and loses rankings to the secondary site.
I think a good question to ask at this point would be for you to clarify your first sentence: "I manage two sites that share some content." Can you define what "some" means? Are they main conversion pages or secondary blog posts, and what percentage of the site is duplicate content?
BTW, I hope you're not interlinking your two sites; keep them as separate as possible.
-
Try this post for more info:
http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html
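For reference, the cross-domain canonical that post describes is a single link element in the head of the duplicate page, pointing at the version you want Google to credit. A minimal sketch (site-a.com and site-b.com are placeholder domains, not the OP's actual sites):

```html
<!-- On the duplicate page at http://site-b.com/widget-article/ -->
<!-- (placeholder URLs; the href points at the version to be credited) -->
<head>
  <link rel="canonical" href="http://site-a.com/widget-article/" />
</head>
```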
-
Sounds like you don't need to manage the threat of duplicate content; you are producing the duplicate content yourself. What you actually want is to minimize the effect the duplicate content on one site has on the other. The only ways I know of to eliminate the risk of duplicate content penalties are to noindex, 301 redirect, or provide canonical URLs.
Since you want both sites to continue being indexed, you can either keep doing what you're doing (and hope you don't get hit) or use canonical URLs and pick which site is best for each page.
Hope this helps.
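To make those three options concrete, here is a hedged sketch of what each looks like for the duplicate page (all URLs are placeholders). Noindex drops the page from the index entirely, the canonical leaves it crawlable but consolidates credit on the URL you pick, and a 301 is configured server-side rather than in the page itself:

```html
<!-- Option 1: noindex — the duplicate page stays live but is dropped from the index -->
<meta name="robots" content="noindex, follow" />

<!-- Option 2: cross-domain canonical — the page can still be crawled,
     but ranking signals are consolidated on the URL you choose -->
<link rel="canonical" href="http://your-primary-site.com/same-article/" />

<!-- Option 3: a 301 redirect is done at the server level, e.g. in Apache config:
     Redirect 301 /same-article/ http://your-primary-site.com/same-article/ -->
```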
-
If I used the cross-domain canonical, would that mean that one site would stop appearing in search results?
-
You can append additional content to the bottom of the page on the more important site, or break up the article by adding content and/or ads between the paragraphs (which will probably result in article fragmentation), but if you're not a news source that's not a big deal.
-
I'm no technical expert but it sounds like you're playing with fire. I've seen more than one site penalised for exactly this. If it looks like you're trying to rank the same piece of content twice, at least one of the URLs is at risk of filtering or a penalty. Isn't this exactly what the cross-domain canonical was created for?
-
Changing the articles or even page titles is not an option.
-
Paraphrase the articles on the highest-traffic pages of your secondary site and/or tweak the keyword targets.