How to Target Other Countries Using TLDs?
-
I would like to know whether it is possible (and beneficial) to target other countries using country-code TLDs (ccTLDs).
When visiting a company's website, you often get redirected to your country's version of the site. For instance, when you visit cafepress.com from Canada, you get redirected to cafepress.ca.
Since both websites (cafepress.com and cafepress.ca) have the same content, how do they get away with it without duplicate content issues?
-
Hi Stephane,
For just one or two pages targeting different countries, on-page content might prove sufficient. That is, a page about companies in field X in the UK, listing UK companies with their addresses and telephone numbers, will give Google a range of signals indicating that the content is most relevant to people in the UK.
That said, the page itself should not contain hreflang annotations indicating that it is for the Canadian market. If hreflang annotations are included, they should specify the UK.
If the content sits on a .ca domain, it will be harder to show that the UK review page is for the UK - it would be better to place this sort of content on a generic TLD website.
The question of duplicate content between .com, .ca, .co.uk etc. sites is answered by geo-targeting, using both the ccTLDs and hreflang tags. Google "ignores" duplicate content when the sites' tags tell it that although the content is the same, this version is for Canadians, this version is for Americans, and this one over here is for Brits.
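To make that concrete, the country versions would each carry reciprocal hreflang annotations in their `<head>`. A minimal sketch with hypothetical example domains (not CafePress's actual markup):

```html
<!-- On https://www.example.com/page (US version) - the same block is
     repeated on the .ca and .co.uk versions so the annotations are
     reciprocal, which Google requires for hreflang to be honoured -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/page" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/page" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/page" />
<!-- Fallback for visitors who match none of the listed locales -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page" />
```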
Hope this helps.
Cheers,
Jane
-
For different geos, I suggest the following:
- Use a ccTLD for each country
- Host each website on a geo-specific server (e.g. the US website on a US server)
- Implement hreflang tags on the geo-specific pages to avoid duplicate content issues
- Implement hreflang tags in the XML sitemap as well
- As a safety measure, implement self-referencing canonical tags
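A rough sketch of the sitemap point, with placeholder domains and paths (the actual URLs would be your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <!-- US version lists itself plus every alternate -->
    <loc>https://www.example.com/directory/</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.example.com/directory/" />
    <xhtml:link rel="alternate" hreflang="en-ca"
                href="https://www.example.ca/directory/" />
  </url>
  <url>
    <!-- Canadian version repeats the same reciprocal set -->
    <loc>https://www.example.ca/directory/</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.example.com/directory/" />
    <xhtml:link rel="alternate" hreflang="en-ca"
                href="https://www.example.ca/directory/" />
  </url>
</urlset>
```

The self-referencing canonical then goes in each page's head, pointing at that page's own URL (e.g. `<link rel="canonical" href="https://www.example.ca/directory/" />` on the .ca page), never at the other country's version.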
- Sajeet
-
Hi Stephane
I would take a look at hreflang and learn it.
To help you speed it up a little:
http://www.stateofdigital.com/hreflang-canonical-test/
Look for other posts by Aleyda as well.
-
Every CafePress domain has, for the most part, the exact same content.
-
Part of my website is a blog, and I don't see the point of using a country-based TLD for it, unless I were to host it in another country to improve performance.
That said, there's also a directory of companies, accompanied by user reviews and various data. I would like to target other countries with this directory by listing only companies from those countries.
How would you suggest handling this?
-
Some domains are generic (.com, .net, .org) and others are geo-targeted, so geo-targeting by TLD is only half the battle. Google states: "we'll rely on several signals, including IP address, location information on the page, links to the page, and any relevant information from Google Places". Having an exact replica doesn't make sense, but tweaking the content to suit each country does.
So in the example provided above, I think they have all those "signals" Google is talking about, which makes them two different sites targeting different SERPs. You'll notice that their home page titles are different, just for starters; I'm sure they don't have exactly the same site placed on two different domains.
Read more about this here: