International web site - duplicate content?
-
I am looking at a site offering different language options via a JavaScript drop-down chooser. Will Google flag this as duplicate content? Should I recommend the purchase of individual domains for each country?
e.g. a separate .uk domain for the UK
-
To avoid duplicate content you need to use the rel="alternate" hreflang="x" annotation. You don't have to buy a new domain; here are several ways you can organise your website:
ccTLDs - [example.ie]
Pros
- Clear geotargeting
- Server location irrelevant
- Easy separation of sites
Cons
- Expensive (and may have limited availability)
- Requires more infrastructure
- Strict ccTLD requirements (sometimes)
Subdomains with gTLDs [de.example.com]
Pros
- Easy to set up
- Can use Webmaster Tools geotargeting
- Allows different server locations
- Easy separation of sites
Cons
- Users might not recognize geotargeting from the URL alone (is “de” the language or country?)
Subdirectories with gTLDs [example.com/de/]
Pros
- Easy to set up
- Can use Webmaster Tools geotargeting
- Low maintenance (same host)
Cons
- Users might not recognize geotargeting from the URL alone
- Single server location
- Separation of sites harder
URL parameters [site.com?loc=de]
Pros
- None (this structure is not recommended)
Cons
- URL-based segmentation difficult
- Users might not recognize geotargeting from the URL alone
- Geotargeting in Webmaster Tools is not possible
You can even have very similar content; for example, if you have a version for the UK and another for the US, the two pages will be nearly identical, and the hreflang annotations tell Google which version to show to which audience.
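For instance, here is a minimal sketch (using made-up example.com URLs) of the link elements the UK and US versions could each include in their <head>:

<link rel="alternate" hreflang="en-gb" href="http://example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />

Each version lists the full set of alternates, including itself, and the x-default line marks the page to show searchers who don't match any of the listed locales.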
Hope this helps
-
It's been my experience that if you're going to have additional languages on your site, you can do one of a few things with the URL: use a separate country domain, a language subdomain, or a language subdirectory. Then you can place the content in the respective languages on those sites. They won't necessarily be seen as duplicate content (semantics change as you translate to different languages). However, Google won't necessarily penalize you if you keep your URL as site.com and just switch up the languages because, again, semantics change as you translate.
I worked with several sites that translated content from English to Spanish or English to Hebrew and they were never once penalized for duplicated content.
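If the translated versions live on one domain, another option (instead of adding link elements to every page) is to declare the alternates in an XML sitemap. A rough sketch, assuming hypothetical site.com language subdirectories; the real URLs would differ:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://site.com/en/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://site.com/en/" />
    <xhtml:link rel="alternate" hreflang="es" href="http://site.com/es/" />
    <xhtml:link rel="alternate" hreflang="he" href="http://site.com/he/" />
  </url>
  <!-- the /es/ and /he/ pages each get their own <url> entry repeating the same three xhtml:link lines -->
</urlset>

The xhtml namespace and the repeated entries are the format Google documents for sitemap-based hreflang; the language codes (en, es, he) match the translations mentioned above.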
Related Questions
-
Is this considered duplicate content?
Hi Guys, We have a blog for our e-commerce store. We have a full-time in-house writer producing content. As part of our process, we do content briefs, and as part of the brief we analyze competing pieces of content existing on the web. Most of the time, the sources are large publications (e.g. HGTV, elledecor, apartmenttherapy, Housebeautiful, NY Times, etc.). The analysis is basically a summary/breakdown of the article, and is sometimes 2-3 paragraphs long for longer pieces of content. The competing content analysis is used to create an outline of our article, and incorporates the most important details/facts from competing pieces, but not all. Most of our articles run 1500-3000 words. Here are the questions: NOTE: the summaries are written by us, and not copied/pasted from other websites. Would it be considered duplicate content, or bad SEO practice, if we list the sources/links we used at the bottom of our blog post, along with the summary from our content brief? Could this be beneficial as far as SEO? If we do this, should we nofollow the links, or use regular dofollow links? For example: "For your convenience, here are some articles we found helpful, along with brief summaries:" I want to use as much of the content that we have spent time on. TIA
-
What to do with internal spam URLs Google indexed?
I have been in SEO for years but have never met this problem. I have a client whose web page was hacked, and many hundreds of links were posted on it. These links have been indexed by Google. These links are not in comments but are normal external URLs. See picture. What is the best way to remove them: use the Google disavow tool, or just redirect them to some page? The web page is new, but it ranks well on Google and has a domain authority of 24. I think these spam URLs improved rankings too 🙂 What would be the best strategy to solve this? Thanks.
-
Which is more important for SEO purposes: backlinks or internal links?
-
Pleasing the Google Gods & Not DeIndexing my site.
Hey Mozzers, So plenty of you who follow these threads have come across my posts and have read bits and pieces of the strange, dark, dark gray hat webspace that I have found myself in. I'm currently doing some research and I wanted all of your opinions too. Will Google always notify you before they stop indexing your website? Will Google always allow you back if you do get pulled? Does Google give a grace period where they say "fix this in 30 days"? What is everybody's experience with all of this?
-
Tags on WordPress Sites: Good or Bad?
My main concern is about the entire tags strategy. I first saw the whole concept on WordPress, where it seems to be bringing positive results to these sites, and now there are even plugins that auto-generate tags. Can someone detail more about the pros and cons of tags? I was under the impression that Google does not want thousands of pages auto-generated just because of a simple tag keyword, and then showing relevant content for that specific tag. Usually these are just like search results pages... how are tag pages beneficial? Is there something going on behind the scenes with WordPress tags that actually brings benefits to these WP blogs? Setting up a custom-coded tag feature on a custom site just seems to create numerous spammy pages. I understand these pages may be good from a user perspective, but what about from an SEO perspective, and getting indexed and driving traffic... Indexed and driving traffic is my main concern here, so as a recap I'd like to understand the pros and cons of tags on WP vs custom-coded sites, and the correct way to set these up for SEO purposes.
-
Footer Link in International Parent Company Websites Causing Penalty?
Still waiting to look at the analytics for the timeframe, but we do know that the top keyword dropped on or about April 23, 2012 from the #1 ranking in Google - something they had held for years - and traffic dropped over 15% that month, with further slips since. Just looked at Google Webmaster Tools and see over 2.3MM backlinks from "sister" companies' footers. One has over 700,000, the rest about 50,000 on average, all going to the home page, and all using the same anchor text, which is both a branded keyword and a generic keyword, the same one they ranked #1 for. They are all "nofollows", but we are trying to confirm if the nofollow was added before or after they got hit; regardless, Google has found them. To also add, most of the links are from their international sites, so .de, .pl, .es, .nl and other European country extensions. Of course, based on this, I would assume the footer links, and the timing, were a result of the Penguin update and spam. The one issue is that the other US "sister" companies listed in the same footer did not see a drop; in fact, some had increased traffic. And one of them has the same issue with the brand name, where it is both a brand name and a generic keyword. The only note I will make about any of the other domains is that they do not drive the traffic this one used to. There is at least a 100,000+ visitor difference between the main site and the additional sister sites also listed in the footer. I think I'm on the right track with the footer links, even though the other sites that have the same footer links do not seem to be suffering as much, but wanted to see if anyone else had a different opinion or theory. Thanks!
Jen Davis
-
Shadow Pages for Flash Content
Hello. I am curious to better understand what I've been told are "shadow pages" for Flash experiences. So for example, go here:
http://instoresnow.walmart.com/Kraft.aspx#/home
View the page as Googlebot and you'll see an HTML page. It is completely different than the Flash page.
1. Is this ok?
2. If I make my shadow page mirror the Flash page, can I put links in it that lead the user to the same places that the Flash experience does?
3. Can I put "Pinterest" Pin-able images in my shadow page?
4. Can I create a shadow page for a video that has the transcript in it? Is this the same as closed captioning? Thanks so much in advance, -GoogleCrush
-
Internal Link Structure
Hello Everyone, I'd be grateful for a little feedback please; This is my site, the home page of which is targeting the phrase jobs in **** (I'm sure you can fill in the gap :)). I've made a few changes recently, which included having the Contract jobs in **** | Permanent Jobs in **** | Temporary Jobs in **** & Today's jobs in **** links added to the homepage... Perhaps foolishly and impatiently, I did all of these at the same time, whilst also changing the site's internal link structure, specifically for all links to the homepage, which previously were like <a href="/">Home</a> and have now been changed to <a href="/">jobs in ****</a>. Meaning that I have 4500 internal links with the anchor text 'jobs in ****'. But rather than seeing an improvement in my SERPs ranking, I have gone from page 2 of Google to page 6, and falling...... Apart from being impatient, what have I done wrong? Many thanks