Duplicate content issues with Australian and US versions of a website
-
Good afternoon.
I've tried searching for an answer to the following question but I believe my circumstance is a little different than what has been asked in the past.
I currently run an Australian website targeted at a specific demographic (50-75), and we produce a LARGE number of articles across a wide variety of lifestyle segments. All of our focus up until now has been on Australia, and our SEO and language are tailored towards this.
The next logical step in my mind is to launch a mirror website targeted at the US market. This website would be a simple mirror of a large number of articles (1000+) on subjects such as Food, Health, Travel, Money and Technology.
Our current CMS has no problem duplicating the relevant items and sharing everything. The problem is that we currently use a .com.au domain, and the .com domain is unavailable and not for sale, which means we would have to create a new name for the US-targeted domain.
The question is: how will mirroring this information, targeted at the US, affect us on Google, and would we be better off getting a large number of these articles rewritten by a company on freelancer.com etc.?
Thanks,
Drew -
Hi Drew,
In general, we'd say to use one English-language domain to target all English-speaking regions, but since you have a .com.au domain, you won't be able to use it to rank effectively in the US.
It is possible to mirror a domain and geotarget each version so that both rank well in their intended locations. You would set the geotarget for the US site to the US in Webmaster Tools (the Australian site is automatically geotargeted because of its ccTLD). Google reps have stated in the past that this is effective and will eliminate the duplicate content problem.
Of course, Google isn't perfect and sometimes this goes wrong. We had a client with two identical websites--one for the UK and one for the US--whose UK site always showed up in the US and whose US site ranked nowhere in either territory. That's very uncommon, however.
If you do this--mirror the site and properly geotarget in Webmaster Tools under Site Configuration -> Settings--neither version of the site should find itself removed from Google due to duplicate content.
Keep a very close eye on each site's indexation, rankings and traffic, though. As I say, this method isn't perfect, as Google has gotten it wrong in the past.
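For what it's worth, a markup-level signal you can add alongside the Webmaster Tools setting is rel="alternate" hreflang annotations on each mirrored article pair. A minimal sketch, with placeholder domains standing in for your real ones:

```html
<!-- In the <head> of the Australian version of an article (URLs are placeholders): -->
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/travel/some-article/" />
<link rel="alternate" hreflang="en-us" href="http://www.example-us.com/travel/some-article/" />
<!-- The US version carries the same pair of tags, so the annotations are reciprocal. -->
```

This tells Google the two URLs are regional variants of the same content rather than competing duplicates.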
Cheers,
Jane
Related Questions
-
How to deal with 80 websites and duplicated content
Consider the following: a client of ours has a job boards website. They also have 80 domains, all in different job sectors, which pull in jobs based on the sectors they were tagged with on the back end. Everything is identical across these websites apart from the brand name and some content. What's the best way to deal with this?
Technical SEO | jasondexter0 -
Duplicate content. Wordpress and Website
Hi All, Will Google punish me for having duplicate blog posts on my website's blog and on WordPress? Thanks
Technical SEO | Mike.NW0 -
Duplicate Title and Content. How to fix?
So this is the biggest error I have, but I don't know how to fix it. I get that I have to make the duplicates redirect to the source, but I don't know how to do that. For example, our crawl diagnostic flags "On The Block - Page 3" at http://www.maddenstudents.com/forumdisplay.php?57-On-The-Block/page3 and the same page at http://www.maddenstudents.com/forumdisplay.php?57-On-The-Block/page3&s=8d631e0ac09b7a462164132b60433f98 as duplicates. That's just one example, but I have over 1,000 like that. How would I go about fixing that and getting rid of the "&s=8d631e0ac09b7a462164132b60433f98"? GoDaddy is my domain registrar and web host. Would they be able to fix it?
Technical SEO | taychatha0 -
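One common approach for session-ID duplicates like this (a sketch, not advice verified against this forum's setup, and assuming Apache with mod_rewrite enabled) is a 301 rule in .htaccess that strips the s parameter and redirects to the clean URL:

```apache
RewriteEngine On
# If the query string contains a vBulletin-style session id (s=<32 hex chars>),
# 301-redirect to the same URL with that parameter removed.
RewriteCond %{QUERY_STRING} ^(.*?)&?s=[0-9a-f]{32}(.*)$ [NC]
RewriteRule ^ %{REQUEST_URI}?%1%2 [R=301,L]
```

A rel="canonical" tag pointing at the parameter-free URL achieves much the same thing if editing .htaccess isn't an option.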
Duplicate Content Vs No Content
Hello! A question that has been thrown around a lot at our company has been "Is duplicate content better than no content?". We operate a range of online flash game sites, most of which pull their games from a feed, which includes the game description. We have unique content written on the home page of the website, but aside from that, the game descriptions are the only text content on the website. We have been hit by both Panda and Penguin, and are in the process of trying to recover from both. In this effort we are trying to decide whether to remove or keep the game descriptions. I figured the best way to settle the issue would be to ask here. I understand the best solution would be to replace the descriptions with unique content; however, that is a massive task when you've got thousands of games. So if you have to choose between duplicate and no content, which is better for SEO? Thanks!
Technical SEO | Ryan_Phillips0 -
Duplicate Content based on www.www
In trying to knock down the most common errors on our site, we've noticed we do have an issue with duplicate content; however, most of the duplicate content errors are due to our site being indexed with www.www and not just www. I am perplexed as to how this is happening. Searching through IIS, I see nothing that would be causing this, and we have no hostname records set up that are www.www. Does anyone know of any other things that may cause this and how we can go about remedying it?
Technical SEO | CredA0 -
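www.www duplicates are often the result of a wildcard DNS record resolving every subdomain to the site, so the hostname never appears in the server config itself. Since this is IIS, one way to mop it up is a canonical-host rule in the URL Rewrite module. A hypothetical fragment (placeholder domain) that would sit inside `<system.webServer><rewrite><rules>` in web.config:

```xml
<!-- Permanently redirect the accidental www.www host back to www. -->
<rule name="Collapse www.www" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTP_HOST}" pattern="^www\.www\.example\.com$" />
  </conditions>
  <action type="Redirect" url="http://www.example.com/{R:1}" redirectType="Permanent" />
</rule>
```

Fixing the wildcard DNS record (or removing the stray binding) addresses the root cause; the rewrite rule cleans up what's already indexed.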
Does turning website content into PDFs for document sharing sites cause duplicate content?
Website content is 9 tutorials published to unique URLs with a contents page linking to each lesson. If I make a PDF version for distribution on document-sharing websites, will it create a duplicate content issue? The objective is to get a half-decent link and traffic to supplementary opt-in downloads.
Technical SEO | designquotes0 -
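You can't control indexing on the third-party document-sharing sites themselves, but if you also host the PDF copies on your own domain, an X-Robots-Tag response header keeps them out of the index so only the HTML tutorials compete for rankings. A sketch assuming Apache with mod_headers enabled:

```apache
# Keep self-hosted PDF duplicates of the tutorials out of search indexes;
# the HTML versions remain the only indexable copies on this domain.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, noarchive"
</FilesMatch>
```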
How do i deal with duplicate content on the same domain?
I'm trying to find out if there's a way we can combat similar content on different pages of the same site without having to rewrite the whole lot. Any ideas?
Technical SEO | indurain0 -
Duplicate content issues caused by our CMS
Hello fellow mozzers, Our in-house CMS - which is usually good for SEO purposes as it allows all the control over directories, filenames, browser titles etc. that prevents unwieldy/meaningless URLs and generic title tags - seems to have got itself into a bit of a tiz when it comes to one of our clients. We have tried solving the problem to no avail, so I thought I'd throw it open and see if anyone has a solution, or whether it's just a fault in our CMS. Basically, the SEs are indexing two identical pages, one ending with a / and the other ending /index.php, for one of our sites (www.signature-care-homes.co.uk). We have gone through the site and made sure the links all point to just one of these, and have done the same for off-site links, but there is still the duplicate content issue of both versions getting indexed. We also set up an htaccess file to redirect to the chosen version, but to no avail, and we're not sure canonical will work for this issue as / pages should redirect to /index.php anyway - and that's what we can't work out. We have set the access file to point to index.php, which should be what's happening anyway, but it isn't. Is there an alternative way of telling the SEs to only look at one of these two versions? Also, we are currently rewriting the content and changing the structure - will this change the situation we find ourselves in?
Technical SEO | themegroup0
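For what it's worth, a common .htaccess pattern for this pair of duplicates (a sketch, not a fix verified against this particular CMS) is to 301 any externally requested /index.php URL back to its bare directory form. Matching against THE_REQUEST (the raw client request line) means the rule ignores the server's own internal mapping of / onto index.php, which is what usually causes a naive version of this redirect to loop:

```apache
RewriteEngine On
# 301 e.g. /some-dir/index.php to /some-dir/ only when the client actually
# requested index.php; internal subrequests are left alone, avoiding a loop.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ (.*/)index\.php(\?.*)?\ HTTP [NC]
RewriteRule ^ %1 [R=301,L]
```

A rel="canonical" tag on each page pointing at the / version is a reasonable belt-and-braces addition while the redirect propagates.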