Two sites, two domains, two brands, 98% same content
-
There are two affiliated brick-and-mortar retail stores moving into e-commerce. For non-marketing reasons, separate e-commerce websites are desired.
The two brands are based in separate (nearby) cities in the same Canadian province.
Although the store names and branding will be different, the content on the two sites will be near or exact duplicates.
The more I look into this on Google and in SEOmoz Q&A, the more concerned I am about the SEO implications. For example:
SEOmoz QA: Multiple cities/regions websites - duplicate content?
"So, yes, because you are offering the same services at second location, you are thinking correctly about the need to rewrite all content so it's not a duplicate of site #1."
Duplicate content - Webmaster Tools Help
"However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic… In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.
...
Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results."
Unfortunately, I would say there's very little chance the content will be rewritten in the foreseeable future.
With that said, I'd greatly appreciate any concerns and remedies the SEOmoz community has to offer (even if they're only useful down the road). Thanks in advance.
-
If you have a legitimate purpose you could try using encoding, but I'm not sure how that falls within Google's guidelines; you would need to check. From experience with similar issues, I've found that anything up to about 60% duplicate content will still rank.
-
You won't be able to have both sites rank in Google if you've got duplicate content. One of them will be flagged by Panda and will plummet to page 10 or lower in the SERPs.
That said, if you don't necessarily need people to find one of the sites via Google, you can keep the duplicate content. Ideally, make the two home pages unique. Then, for the inner pages, use a rel=canonical tag to tell Google which of the two duplicate pages should be included in the index.
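As a sketch of what that looks like (the domains and path here are hypothetical placeholders, not the asker's actual sites), the duplicated inner page on the second site would carry a cross-domain canonical pointing at the preferred version:

```html
<!-- In the <head> of the duplicate page on site B,
     e.g. https://brand-b.example/products/widgets -->
<link rel="canonical" href="https://brand-a.example/products/widgets" />
```

Each duplicated page pair needs its own tag pointing at its specific counterpart; a site-wide canonical to one URL would be wrong.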
The other option is to apply a noindex meta tag to the duplicated pages on one of the sites. However, the canonical option is better, since it consolidates ranking signals on the preferred page rather than simply dropping the duplicate from the index.
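For completeness, the noindex route is a one-line tag as well (again, which site's pages get it is the owner's choice; this is just the syntax):

```html
<!-- In the <head> of each duplicate page on the site you don't need indexed -->
<meta name="robots" content="noindex, follow" />
```

The `follow` directive keeps crawlers following links on the page even though the page itself is excluded from the index.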