What's the best way to manage content that is shared on two sites and keep both sites in search results?
-
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
-
Does a duplicate content penalty impact specific pages or entire sites? If I wanted to test using the cross-domain canonical on a certain section of my site, would the impact be visible? Or would I need to put cross-domain canonicals on everything appearing on both sites in order to see the results?
-
Changing the articles or even page titles is not an option.
That's too bad. What Irving suggested has the potential for HUGE wins.
I'd find a way if that were my site.
-
Sure, that is a solution, but then rankings for the additional dupe sites go away, because you are basically telling Google: "this URL on this site should not rank, because it is a copy of the article on that site, so give that site the credit, not me."
I believe that Jon has not been hit yet and wants both sites to rank, but is unable to make the content on either site unique. Any additional code you can insert between the articles to create less similarity between the two pages should help lessen the chance of getting hit, though it's no guarantee.
-
Irving, I had a client who had been hit with a manual penalty for Doorway Pages. They weren't Doorway Pages; they were just pages on various domains (that he owned) with a lot of duplicate content on them. We got him reinstated when we implemented cross-domain canonicals and filed a re-inclusion request. Does that sound similar to this case?
Just wondering if anyone had heard of sites being hit like that for dupe content?
-
LOL true.
With all due respect, a 301, noindex, or cross-domain canonical is as much of a solution as saying "delete your second site." My suggestion of breaking up the content or appending additional content may help you avoid triggering a duplicate content filter.
Duplicate content is not a penalty, it's a filter, so the worst that happens is that the main site that was bringing you the majority of traffic gets filtered and loses rankings to the secondary site.
I think a good question to ask at this point would be for you to clarify your first sentence: "I manage two sites that share some content." Can you define what "some" means? Are they main conversion pages or secondary blog posts, and what percentage of the site is dupe content?
BTW, hope you're not interlinking your two sites; keep them as separate as possible.
-
Try this post for more info:
http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html
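For anyone unfamiliar with it, a cross-domain canonical is just a link element in the head of the duplicated page pointing at the copy you want to rank. A minimal sketch, with placeholder domains and paths:

```html
<!-- In the <head> of the duplicate page on site-b.example -->
<!-- Tells search engines the preferred copy lives on site-a.example -->
<link rel="canonical" href="https://site-a.example/articles/widget-guide">
```

Keep in mind Google treats this as a strong hint rather than a directive, but in practice the page carrying the tag gets consolidated into the canonical URL and usually drops out of the results, which is exactly the trade-off being discussed here.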
-
Sounds like you don't need to manage the threat of duplicate content; you are producing the duplicate content yourself. What you want is to minimize the effect the duplicate content on one site has on the other. The only ways I know of to eliminate the risk of duplicate content penalties are noindex, a 301 redirect, or canonical URLs.
Since you want both sites to continue being indexed, you can either keep doing what you're doing (and hope you don't get hit) or use canonical URLs and pick which site is best for each page.
Hope this helps.
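To make those options concrete: a 301 on the secondary site would look something like this in an Apache .htaccess file (the paths and domain are placeholders):

```apache
# .htaccess on the secondary site: permanently redirect the duplicate
# article to the primary site's copy (requires mod_alias)
Redirect 301 /articles/widget-guide https://site-a.example/articles/widget-guide
```

The noindex option is just `<meta name="robots" content="noindex">` in the head of the duplicate page; unlike the 301 or the canonical, it removes that page from the index without consolidating anything to the other copy.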
-
If I used the cross-domain canonical, would that mean that one site would stop appearing in search results?
-
You can append additional content to the bottom of the page on the more important site, or break up the article by adding content and/or ads between the paragraphs (which will probably result in article fragmentation), but if you're not a news source that's not a big deal.
-
I'm no technical expert but it sounds like you're playing with fire. I've seen more than one site penalised for exactly this. If it looks like you're trying to rank the same piece of content twice, at least one of the URLs is at risk of filtering or a penalty. Isn't this exactly what the cross-domain canonical was created for?
-
Changing the articles or even page titles is not an option.
-
Paraphrase the articles on the highest-traffic pages of your secondary site and/or tweak the keyword targets.