Multiple domain names with similar content
-
Hi, we've got multiple domains that point to the same website with the same content. The only difference is the currency and some text; you could say there's only about a 5% difference in each domain's content:
http://www.redwrappings.com.au/
http://www.redwrappings.com/
Will Google penalise us for having 95% similar content on each domain (they sell the same products, but in different currencies)?
We shouldn't really use a canonical link, should we? Since 5% of the content is different, the pages aren't identical. What would be the best solution if this is a problem?
Thanks
-
Hi Alan,
Thanks for that. We do want to boost the rank of one domain compared to the others, so I think this is the way to go.
Cheers
-
Hi Graham,
In my experience, duplicate content issues can arise with content that is only 50% similar. The strange thing is that in other cases I've had content that is 99.9% similar that hasn't been flagged by search engines.
There is a trick for identifying the possibility of duplicate content issues. It's outlined in this video series.
http://www.youtube.com/watch?v=l9AfEfTFgzM&feature=player_embedded
There are three parts to the video, and I must admit it is quite cheesy, but the information is solid.
-
I don't believe it's a penalty, but it seems a lot of people have this same question. Google: "There's no such thing as a 'duplicate content penalty.' At least, not in the way most people mean when they say that."
http://googlewebmastercentral.blogspot.com/2008/09/demystifying-duplicate-content-penalty.html
If I remember correctly, Rand Fishkin says in this webinar that it's going to hurt, but you may want to watch it to be 100% sure! I think it comes up somewhere around the 40-50% mark of the webinar. http://www.seomoz.org/dp/pro-webinar-october-2010-with-rand-fishkin
Good luck!
-
Yes, it will be seen as duplicate content.
Yes, I would use rel=canonical so that the one site gets ranked, but then your other sites won't rank as well.
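For reference, the canonical hint is just a link element in the head of the duplicate page pointing at the version you want ranked. A minimal sketch, assuming the .com.au page is the preferred version and using a hypothetical product URL:

```html
<!-- In the <head> of http://www.redwrappings.com/some-product (the duplicate) -->
<!-- Tells search engines the .com.au version is the one to index and rank -->
<link rel="canonical" href="http://www.redwrappings.com.au/some-product" />
```

Each page on the .com site would carry its own tag pointing at the matching .com.au URL, not at the homepage.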
You could 301 redirect them all to the same site, then do something in the code to present the 5% of changes to the correct users.
But then you have the problem of the .com not ranking as well as the .com.au in Australia.
If you want to keep the .com and the .com.au, then you really need to make the sites different. Unless you can afford to maintain two vastly different sites, and can afford two SEO campaigns, I would 301 redirect to the one URL. If you have links pointing to both domains, a 301 redirect will assign all link juice to one site, and you may see a boost in the search engines.
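If you go the 301 route and the .com site runs on Apache, the redirect can be a one-liner in that site's .htaccess. A sketch, assuming mod_alias is enabled and the .com.au is the domain being kept:

```apache
# .htaccess on www.redwrappings.com
# Permanently (301) redirects every path to the same path on the .com.au domain
Redirect 301 / http://www.redwrappings.com.au/
```

Because `Redirect` does prefix matching, a request for /some-product on the .com is sent to /some-product on the .com.au, so existing deep links keep working.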