How to Target Other Countries Using TLDs?
-
Is it possible (and beneficial) to target other countries using country-code TLDs?
When visiting a company website, you often get redirected to your country's version of the site. For instance, when you visit cafepress.com from Canada, you get redirected to cafepress.ca.
Since both websites (cafepress.com and cafepress.ca) have largely the same content, how do they avoid duplicate content issues?
-
Hi Stephane,
For just one or two pages targeting different countries, on-page content might prove sufficient. That is, a page about companies in field X in the UK, listing UK companies with their addresses and telephone numbers, will give Google a range of signals indicating that the content is most relevant to people in the UK.
That said, the page itself should not contain hreflang information indicating that it is for a Canadian market. If hreflang information is included, it should specify the UK.
If the content sits on a .ca domain, it will be harder to show that the UK review page is intended for the UK - it would be better to place this sort of content on a generic TLD.
The question of duplicate content between .com, .ca, .co.uk etc. sites is answered by geo-targeting, using both the ccTLDs and hreflang tags. Google "ignores" duplicate content when the websites' tags tell it that although the content is the same, this version is for Canadians, this version is for Americans, and this one over here is for Brits.
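As a sketch of what that looks like in practice (the URLs here are hypothetical, not CafePress's actual setup), each version of the page would carry the same full set of annotations in its `<head>`, including a tag pointing at itself:

```html
<!-- Hypothetical example: the same page served on three TLDs. -->
<!-- Every version must list the complete, reciprocal set of alternates, -->
<!-- including a self-referencing entry, or Google may ignore the tags. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/directory/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/directory/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/directory/" />
<!-- x-default marks the fallback version for users who match no listed locale -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/directory/" />
```

The reciprocity is the key point: because each version declares all the others as alternates, Google treats them as localized variants of one piece of content rather than as competing duplicates.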
Hope this helps.
Cheers,
Jane
-
For different geos, I suggest the following:
- Use a ccTLD
- Host the websites on geo-specific servers (US website on a US server)
- Implement hreflang tags on the geo-specific pages to avoid duplicate content issues
- Implement hreflang tags in the XML sitemap as well
- As a safety measure, implement self-referencing canonical tags
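For the sitemap option above, a hedged sketch of what the entries might look like (hypothetical URLs, and trimmed to two locales for brevity) - each `<url>` entry repeats the full set of alternates via the `xhtml:link` extension:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- One <url> block per version; each block lists every alternate, -->
  <!-- so the annotations stay reciprocal across the whole set. -->
  <url>
    <loc>https://www.example.com/directory/</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.example.com/directory/" />
    <xhtml:link rel="alternate" hreflang="en-ca"
                href="https://www.example.ca/directory/" />
  </url>
  <url>
    <loc>https://www.example.ca/directory/</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.example.com/directory/" />
    <xhtml:link rel="alternate" hreflang="en-ca"
                href="https://www.example.ca/directory/" />
  </url>
</urlset>
```

Putting the annotations in the sitemap instead of the page `<head>` keeps page weight down when you have many locales, at the cost of a sitemap that grows with (pages × locales).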
- Sajeet
-
Hi Stephane
I would take a look at hreflang and learn how it works.
To help speed that up a little:
http://www.stateofdigital.com/hreflang-canonical-test/
Look for other posts by Aleyda as well.
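The test linked above is about how hreflang and canonical tags interact, which is worth understanding before rolling this out. A minimal sketch of the safe combination, assuming a hypothetical .ca page:

```html
<!-- On https://www.example.ca/directory/ (hypothetical URL): -->
<!-- the canonical must be SELF-referencing. If it pointed cross-domain -->
<!-- at the .com version instead, it would tell Google this page is not -->
<!-- the preferred version, undermining the hreflang annotations below. -->
<link rel="canonical" href="https://www.example.ca/directory/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/directory/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/directory/" />
```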
-
-
All of the CafePress domains have, for the most part, exactly the same content.
-
Part of my website is a blog, and I don't see the point of using a country-based TLD for it, except perhaps hosting it in another country to improve performance.
That said, there's also a directory of companies accompanied by user reviews and various data. I would like to target other countries with this directory by listing only companies from those countries.
How would you suggest handling this?
-
Some domains are generic (.com, .net, .org) and others are geo-targeted, so geo-targeting by TLD is only half the battle. Google states: "we'll rely on several signals, including IP address, location information on the page, links to the page, and any relevant information from Google Places". Having an exact replica doesn't make sense, but tweaking it to suit the country does.
So in the example above, I think they have all those "signals" Google is talking about, which makes it two different sites targeting different SERPs. You'll notice that their home page titles are different, just for starters; I'm sure they don't have exactly the same site placed on two different domains.