How to Target Other Countries Using TLDs?
-
I would like to know if it is possible (and beneficial) to target other countries using country-based TLDs?
When visiting a company website for instance, you often get redirected to your country's site. For instance, when you visit cafepress.com from Canada, you get redirected to cafepress.ca.
Since both websites (cafepress.com and cafepress.ca) have the same content, how do they avoid duplicate content issues?
-
Hi Stephane,
For just one or two pages targeting different countries, on-page content might prove sufficient. That is, a page about companies in X field in the UK, listing UK companies with their addresses and telephone numbers, will give Google a range of signals indicating that the content is most relevant to people in the UK.
That said, the page itself should not contain hreflang annotations indicating that it is for the Canadian market. If hreflang annotations are included, they should specify the UK.
If the content is sitting on a .ca domain, it will be harder to show that the UK review page is for the UK - it would be better to place this sort of information on a generic TLD website.
The question of duplicate content between the .com, .ca, .co.uk etc. sites is answered by geo-targeting, using both the ccTLDs and hreflang tags. Google "ignores" duplicate content when the websites' tags tell it that although the content is the same, this version is for Canadians, this version is for Americans, and this one over here is for Brits.
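For reference, hreflang annotations of the sort described above go in the `<head>` of each version, with every version listing all alternates including itself (the domains below are illustrative, not CafePress's actual setup):

```html
<!-- On https://www.example.com/ (generic/US version) -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<!-- x-default catches users who match no listed locale -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Note that the annotations must be reciprocal: the .ca and .co.uk versions each need the same set of tags pointing back, or Google may ignore them.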
Hope this helps.
Cheers,
Jane
-
For different geos, I'd suggest the following -
- Use ccTLDs
- Host the websites on geo-specific servers (US website on a US server)
- Implement hreflang tags on the geo-specific pages to avoid duplicate content issues
- Implement hreflang tags in the XML sitemap as well
- As a safety measure, implement self-referencing canonical tags
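The XML sitemap method mentioned above uses `xhtml:link` elements inside each `<url>` entry. A minimal sketch with illustrative domains (each URL again lists all alternates, including itself):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
    <xhtml:link rel="alternate" hreflang="en-ca" href="https://www.example.ca/" />
  </url>
  <url>
    <loc>https://www.example.ca/</loc>
    <xhtml:link rel="alternate" hreflang="en-ca" href="https://www.example.ca/" />
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
  </url>
</urlset>
```

The sitemap approach is handy when you can't easily edit page templates, but pick one method (head tags or sitemap) and use it consistently.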
- Sajeet
-
Hi Stephane
I would take a look at hreflang and learn how it works.
To help you speed it up a little:
http://www.stateofdigital.com/hreflang-canonical-test/
Look for other posts by Aleyda as well.
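The interaction that test explores is, roughly, how hreflang and canonical tags combine: each country version self-canonicalises while hreflang points across versions. A hedged sketch with illustrative domains:

```html
<!-- On https://www.example.ca/ -->
<link rel="canonical" href="https://www.example.ca/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
```

The key point: never canonicalise one country version to another, or you tell Google to drop the version you're trying to rank.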
-
-
Every CafePress domain has the exact same content, for the most part.
-
Part of my website is a blog, and I don't see the point of using a country-based TLD for it, except perhaps to host it in another country to improve performance.
That said, there's also a directory of companies accompanied by user reviews and various data. I would like to target other countries with this directory by listing only companies from those countries.
How would you suggest handling this?
-
Some domains are generic (.com, .net, .org) and others are geo-targeted, so geo-targeting by TLD is only half the battle. Google states: "we'll rely on several signals, including IP address, location information on the page, links to the page, and any relevant information from Google Places". Having an exact replica doesn't make sense, but tweaking it to suit the country does.
So in the example provided above, I think they have all those "signals" Google is talking about, so it's two different sites targeting different SERPs. You'll notice that their home page titles are different, just for starters; I'm sure they don't have exactly the same sites placed on two different domains.