Duplicate Content Penalties, International Sites
-
We're in the process of rolling out a new domestic (US) website design. If we copy the same theme/content to our international subsidiaries, would the duplicate content penalty still apply? All international sites would carry country-specific domains (.co.uk, .eu, etc.). This question is about English-only content; I'm assuming translated content would not carry a penalty.
-
The consensus is that even though the content is the same, it will rank locally using country-specific domains. Can anyone provide examples where this is currently working?
-
I use Rackspace Cloud Sites. Is there a way I can request to have a domain pushed to a pool in the UK or CA, for example?
-
This video from Matt Cutts will help too: http://www.youtube.com/watch?v=Ets7nHOV1Yo
-
I asked Greg Grothaus from Google this exact question at a conference back in 2009, and his answer was that duplicated content across different TLDs shouldn't be something to be too concerned about. Realistically, search engines will decide which version of the site is more relevant for a particular geographic audience.
-
When it comes to English... my advice is that there are ways to make the content "different". Just think of how differently Brits and Americans spell many words. Then apply all the classic international SEO tactics (links from the country your site has to rank in, server IP, local address...).
Apart from that, if your international sites use their corresponding TLDs (.co.uk, .au, .in...) and you specify that the .com is for the USA, Google is actually quite good at noticing which site should rank for each country.
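One way to make that country targeting explicit to Google is hreflang annotations in each page's `<head>`. A minimal sketch, using hypothetical example domains (swap in your own URLs):

```html
<!-- On the US page; every country version should carry the same
     full set of annotations, including a self-reference -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
```

This tells Google that the pages are deliberate regional variants of each other rather than accidental duplicates, so it can serve the right version in each local SERP.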
-
Yes, translated content will not trigger a penalty. As long as you launch the site on a domain with the proper local TLD and add locally targeted content, you should be OK. Additionally, you may want to consider hosting the website with a local hosting provider.
This should also apply to English-language content modified for a UK audience, since UK English is technically considered different from US English. We have multiple English-language international websites hosted on local TLDs that rank locally for their respective keywords.
Google has become much smarter at detecting geo-local signals, and it should serve the appropriate site in the SERPs without causing duplicate content issues.
-
I think this sort of duplicate content is something that Google sees often. If you are copying everything exactly between domains, I'd question whether you need multiple sites. Presuming your content has country-specific differences, you'll be OK.
Don't forget to register your target market for each URL in Google Webmaster Tools. Maybe build some new links in each locale at the time of launch (press mentions, Twitter shout-outs, etc.).
Also, you may want to consider the approach taken by Microsoft: one domain with country-specific folders.
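That folder-based approach might look like the following (illustrative paths, not Microsoft's actual structure). Each folder can then be geo-targeted separately in Google Webmaster Tools:

```text
www.example.com/en-us/   → United States
www.example.com/en-gb/   → United Kingdom
www.example.com/fr-fr/   → France
```

The trade-off versus ccTLDs is that all link equity accrues to one domain, but the geographic signal from the TLD itself is lost, so you rely on explicit geo-targeting settings instead.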