Duplicate Content across 4 domains
-
I am working on a new project where the client has 5 domains each with identical website content. There is no rel=canonical.
The number of pages indexed varies widely across the domains (from 1 to 1,250), and Open Site Explorer shows linking root domains ranging from 1 to 120 per domain.
I will be strongly recommending that the client focus on one website and 301 everything from the other domains. I would recommend focusing on the domain that has the most pages indexed and the most referring domains, but I've noticed the client has started using one of the other domains in their offline promotional activity, and it is now their preferred domain.
What are your thoughts on this situation? Would it be better to 301 to the client's preferred domain (and lose a level of ranking power through the 301 reduction factor, plus wait for the other pages to get indexed), or stick with the highest-ranking/most-linked domain even though it doesn't match the client's preferred domain used for email addresses etc.?
Or would it be better to use cross-domain canonical tags?
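For anyone unfamiliar with the cross-domain canonical option: it's just a standard rel=canonical whose href points at the equivalent page on the chosen domain. A minimal sketch (the domain names here are hypothetical placeholders, not the client's actual domains):

```html
<!-- Placed in the <head> of a page on one of the duplicate domains,
     e.g. http://www.duplicate-domain.com/red-widgets/ -->
<!-- Tells search engines which URL should receive the indexing/ranking signals -->
<link rel="canonical" href="http://www.preferred-domain.com/red-widgets/" />
```

Unlike a 301, visitors stay on the duplicate domain; only the indexing signals are consolidated, and Google treats it as a hint rather than a directive.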
Thanks
-
EGOL - thanks for the advice. Yes, there are lots of interlinks between the domains. Although I also have clients who have done this deliberately for perceived gain, I think in this case the client has made an honest mistake by simply deploying their CMS (with relative links) on each of the domains they thought they should purchase. It's confused further by one section using https with an absolute domain, so users can end up migrating from one domain to another and from http to https!
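The relative-links problem described above is easy to demonstrate: the same relative href resolves against whichever host (and scheme) served the page, so every domain ends up with its own full, crawlable copy of the site. A minimal sketch using Python's standard library (the domain names are hypothetical):

```python
from urllib.parse import urljoin

# The CMS emits the same relative link in its templates...
relative_link = "../products/widgets.html"

# ...but it resolves against whichever domain (and scheme) served the page,
# so each domain accumulates its own duplicate copy of every URL.
hosts = [
    "http://www.domain-one.com/en/index.html",
    "http://www.domain-two.com/en/index.html",
    "https://www.domain-two.com/en/index.html",  # the https section mentioned above
]

for base in hosts:
    print(urljoin(base, relative_link))
```

Each printed URL is a distinct address serving identical content, which is exactly how the duplication (and the http/https mixing) multiplies without anyone intending it.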
As an SEO, my inclination is also to 301 page-by-page to the best-ranking site. However, as I mentioned to Thomas (above), I think the client will probably want to go with their preferred domain, and as such I'll 301 page-by-page to that one. I'll discuss it with the client and post the outcome.
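For the page-by-page 301s, the usual pattern on Apache is a host-conditioned rewrite that preserves the request path, so every old URL maps to its equivalent on the surviving domain. A sketch assuming Apache with mod_rewrite and hypothetical domain names:

```apache
# In the .htaccess (or vhost config) serving a retired domain:
RewriteEngine On
# Match any hostname other than the preferred one...
RewriteCond %{HTTP_HOST} !^www\.preferred-domain\.com$ [NC]
# ...and 301 the identical path across, page by page.
RewriteRule ^(.*)$ http://www.preferred-domain.com/$1 [R=301,L]
```

This only works cleanly because the sites are identical copies; if any paths differ between domains, those URLs need individual RewriteRule (or RedirectPermanent) entries instead.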
-
Thomas - thanks for your reply. All the domains are very similar - variations from the same core concept. I have a feeling that the client will want to standardise on their preferred domain and as you say it is the easiest to remember.
-
Lots of businesses build three sites because they think it is a good way to kick ass on their competitors. It can be really hard, or even impossible, to get them to give up on that idea.
I am going to guess that you have a second issue... they have links on all of these sites pointing to each other because they think it will help their PageRank and SERP position. It's really hard to talk people out of this practice, too.
If I owned these five sites I would shut down four of them and redirect all of them page-by-page to my best ranking site... and it would become my preferred domain. That is something else that you might try to talk this client into.
Good luck!
-
Wow... so many options. It is really hard to give a good answer without knowing what the domains are and the differences in the domain names themselves, but here are a few thoughts.
If the domains serve distinct purposes, whether that be target audience (grandmaroadbikes.com) or branding (ZippzRoadBikes.com), then I would start with cross-domain canonical tags. There are many sites that stand on their own, maintaining their own brand, while using cross-domain canonicals.
How similar are the domain names? If the domains are too similar, and unique branding cannot really be achieved, then I would consider consolidation. My gut reaction as an advertiser is to stick with the domain that's easiest to say and type. I assume the domain being marketed offline wins that prize, so I would focus on promoting that domain. Further, many people will lose a bit of trust if they type in www.AdvertisedDomain.com and end up at www.DomainUnAdvertised.com. Plus, you don't want to cause brand confusion.