Top Level Domains
-
Howdy Everyone,
I have a website that will span multiple countries. The content served will be different for each country. As such, I've acquired the top level domains for different countries.
I want to map the top-level domains (e.g. domain.co.uk) to uk.domain.com for development purposes (LinkedIn does this).
I'm curious to know whether this is advisable and whether mapping a country-specific TLD to a subdomain will maintain local SEO value.
Thanks!
-
Thanks guys, great insights!
- I do have multiple ccTLDs for the same site. The content for each, however, will be significantly different.
- By 'domain-mapping' I meant actually getting into the DNS records and mapping the ccTLD URL to a sub-domain
- Rel canonical redirect: I'm assuming that the .co.uk would be the canonical page? If this page is the canonical page, and the .com/uk/ is the 'discounted' page, what happens if the rest of the site uses the .com/uk convention? (In other words, is it advisable to have this inconsistency, both from a usability and indexing point of view?)
@Gary
I think this is a very interesting point. I agree with both of you that if I saw a billboard for domain.com/uk, I might think it to be slightly odd. However, I'm not sure if consistency trumps familiarity or not.
Further down the rabbit-hole:
I will have multiple languages (let's say en, fr, es). I want this to utilise sub-directories (I want to avoid super-fancy AJAX whatnot. I HATE Google's help page URLs, for instance).
domain.com/us/en/
domain.com/us/es/
The idea here is that the site should rank for multiple languages within a country (without creating super-duper long URLs). Any ideas/tips?
Maybe a quick outline might help:
1 - Main (sort of a splash/navigation page)
    1 - USA
        1 - EN
        2 - ES
    2 - UK
        1 - EN
    3 - France
        1 - FR
        2 - EN
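One thing that might help the multi-country, multi-language structure rank the right version in the right place is hreflang annotations. A minimal sketch for the outline above, assuming the /country/language/ URL convention you proposed (all domain names and URLs here are illustrative, not confirmed):

```html
<!-- Hypothetical sketch: hreflang alternates for the country/language outline.
     These <link> elements would go in the <head> of every page in the set,
     each page listing all of its alternates plus itself. -->
<link rel="alternate" hreflang="en-us" href="https://www.domain.com/us/en/" />
<link rel="alternate" hreflang="es-us" href="https://www.domain.com/us/es/" />
<link rel="alternate" hreflang="en-gb" href="https://www.domain.com/uk/en/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.domain.com/fr/fr/" />
<link rel="alternate" hreflang="en-fr" href="https://www.domain.com/fr/en/" />
<!-- Fallback for visitors who don't match any country/language pair -->
<link rel="alternate" hreflang="x-default" href="https://www.domain.com/" />
```

Note that hreflang region codes follow ISO 3166-1, so the United Kingdom is "gb", not "uk".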
-
Gary has a point that considering offline marketing is important in many situations. Seeing .co.uk instead of /uk definitely gives a more local feel.
Great response anyway, to follow on from that:
If you want to use www.domain.co.uk in offline marketing then you may want to consider using rel canonical instead of a redirect. It may be a bit more time-consuming to set up (there are different ways you could go about it), but it may help from a conversion point of view. Probably minimal, I admit, but people generally don't like being redirected.
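To make that concrete: a minimal sketch of the canonical approach, assuming the .com/uk/ version is the one you want indexed (domain names are placeholders):

```html
<!-- Hypothetical sketch: placed in the <head> of each page served on
     www.domain.co.uk, pointing at the matching www.domain.com/uk/ page.
     Visitors stay on the .co.uk site; search engines are asked to
     consolidate signals onto the .com/uk/ version. -->
<link rel="canonical" href="https://www.domain.com/uk/" />
```

Bear in mind rel canonical is treated as a hint rather than a directive, so it's a softer signal than a 301.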
Either way, I wouldn't worry about what domain they use as a linking source if your redirects are set up correctly.
-
I agree that directories would be a better way to organise your content.
I would aim to get people to use www.domain.com/uk etc. as a linking source, but potentially still use www.domain.co.uk in offline marketing and use 301 redirects to www.domain.com/uk. If that makes any sense?
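For the 301 part, a minimal sketch of what that could look like, assuming an Apache server with mod_rewrite enabled (domain names are placeholders):

```apache
# Hypothetical sketch: 301-redirect every path requested on domain.co.uk
# to the matching path under www.domain.com/uk/.
# Assumes Apache with mod_rewrite, e.g. in the .co.uk vhost or .htaccess.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://www.domain.com/uk/$1 [R=301,L]
```

The path-preserving capture ($1) matters: redirecting every .co.uk URL to the /uk/ homepage would throw away deep-link value.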
The TLD will certainly have offline local value even if it doesn't have SEO benefit.
-
If I read your first paragraph right, you have multiple ccTLDs for the same company? That will make your SEO efforts a lot more difficult and is only really appropriate in rare cases (Amazon for instance).
To make your life a lot easier, I would suggest using directories instead, i.e. domain.com/us, domain.com/au, domain.com/uk, as more PageRank passes from the root level to these directories than from the root level to a subdomain. Then it comes down to geo-targeting.
To sum up:
Directories > Subdomains > Unique Domains (purely in terms of SEO).
When you say you want to map the top level domains, do you mean redirect domain.co.uk to uk.domain.com? As far as I know, redirecting .co.uk to uk.domain.com would not retain any local UK SEO value directly, though other UK signals may come with the redirect.