Website in English targeting different countries - is it worth investing in .com?
-
Hi,
I was wondering...
Let's say there is a company in Norway and it sells tours in Norway. The website is only in English, and the content stays exactly the same for each country (as the website is for people looking for tours in Norway). The domain is registered with the .no ccTLD. The main targets are the USA, Canada, the UK and a couple of other countries in Europe. Would the website benefit from having .com instead of .no?
Thanks!
-
The structure is really going to be up to you. The ranking differences between structures are so small today that it is better to choose based on your users' preference and your own.
For consistency, it would be best to get a .co.uk that is on brand, but that route takes a ton of work over time. My favorite route is a subfolder (/uk) off of a gTLD in most instances.
If your .com.au is already ranking well, expect that to keep happening until the UK subsite has had a chance to strengthen. And make sure your content is different!
-
Hi Kate - here are the results. Essentially it's suggesting a number of options for URL structure similar to the ones I am tossing up between, but I'm still unsure which is the best option in the short or long term.
Keep in Mind:
- The site content in each country must be different.
- Don't use IP detection for country targeting; instead, let customers choose their country and store the choice in a cookie.
- Only use people native to the country for outreach in order to minimize cultural differences.
Action Items:
- Pick the URL structure for your international growth and stick with it. Keep in mind that the structure needs to include both translations and geo-targeting. We recommend one of the following options:
- ccTLD and subfolder: www.domain.co.uk/de/product
- ccTLD and parameter: www.domain.co.uk/product?lang=fr
- Subdomain and parameter: ca.domain.com/product?lang=fr
- Subdomain and subfolder: ca.domain.com/fr/product
- Subfolder and parameter: www.domain.com/ca/product?lang=fr
- Translate your content. Don't machine translate; while manual translation is costly, it's the best for your brand and user experience.
- Put your hreflang annotations in XML sitemaps.
- Use the language meta tag for Bing language targeting.
- Set up Google Webmaster Tools Geo-Targeting.
- Set up Bing Webmaster Tools Geo-Targeting.
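To illustrate the hreflang action item above, here is a minimal sketch of annotations in an XML sitemap. The domains and paths are placeholders, not the actual sites discussed in this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/uk/product</loc>
    <!-- Each URL lists every language/country alternate, including itself -->
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="https://www.example.com/uk/product"/>
    <xhtml:link rel="alternate" hreflang="en-ca"
                href="https://www.example.com/ca/product"/>
    <xhtml:link rel="alternate" hreflang="fr-ca"
                href="https://www.example.com/ca/product?lang=fr"/>
  </url>
</urlset>
```

Note that hreflang annotations must be reciprocal: every alternate URL has to list the others (and itself) back, or search engines may ignore the annotations.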
-
Did you visit that tool? Can you let me know the result?
-
Hey Kate
Really appreciate you taking the time to respond and help out!
So for a business that is currently operating in the USA, Canada, Australia and NZ, and opening in the UK and Germany soon with further expansion on the cards, how would you tackle this scenario:
I already have a .com.au that ranks #3 on Page 1 of Google for my highest-traffic keyword.
I have a .co.nz that ranks #5 on Page 1 for my highest-traffic keyword. Now I'm looking to do a .co.uk, however someone has pinched it. Do you suggest going for:
.com/uk/
uk.domainname.com (and keep reusing subdomains)
Or a .co.uk domain? Just as an FYI, and to throw a further spanner in the works... my .com.au domain ranks #2 on Page 1 of Google UK for my target keyword, as it's currently not very competitive.
Any feedback is helpful!
-
Thank you!
Have a lovely day
-
Yes, but that is if none of your competitors ever move to a gTLD. It's one of many factors. The right thing to do is bite the bullet and move it now. But I would not expect much of a short term gain.
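For what it's worth, when you do move, every URL on the old domain should 301-redirect page for page to its equivalent on the new one so existing links and rankings transfer. A minimal .htaccess sketch, assuming Apache with mod_rewrite and placeholder domain names:

```apache
# .htaccess on the old domain (e.g. the .no site): permanent,
# page-for-page redirect to the same path on the new gTLD.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.no$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```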
-
To sum up: preferably go with .com. By itself it wouldn't have a big impact, but in the long run, with a good SEO strategy in place, it should actually help with ranking.
On the other hand, if I go with .no and do the same work, the effect should be generally the same, right?
-
In theory, all else being equal in terms of relevance, page quality, and ranking page strength (which it never is), yes. Your page would be from a site that is not targeting a specific country, so it should be marginally more relevant. However, as stated above, it won't MAKE you rank better for sure. There are a host of other factors.
And yes, it is a lack of knowledge and geo-centricity. You all see .no more often, so that is the go-to in other business owners' heads. Lack of knowledge is the primary problem, though.
-
So let's take the example of another Nordic country: Iceland.
If you type "tours in iceland", 90% of the pages that come up will have a .is ending. Yet they are still ranking internationally. If we were to launch a .com in Iceland, would it have an advantage over the other ones (assuming that everything else is equal)?
Why are they all using ccTLDs? Is it a lack of knowledge?
-
Actually, if the page is the most relevant to the user's query, any TLD can rank well in Google.com, Google.co.uk, etc.
This is not a matter of user preference. It is a matter of indicating to the search engines what your target market is. A ccTLD indicates that you want to target only one country. That is not the case here. They can still rank with a .no for users in the other countries; it is just harder.
-
You can't geo-target a ccTLD to another country outside of the ccTLD's country, so depending on your business needs, you might need a gTLD like .com. Check out this strategy tool and let me know what result you get. I can recommend further from there. http://outspokenmedia.com/international-seo-strategy/
-
Hm, actually no. It's an interesting dilemma, but I would still prefer the .com.
You are too focused on your product - and not enough on your target audience. They are not from Norway - therefore the .no is wasted.
-
Hi Leszek,
Your target markets/audience will not come to google.no or any other search engine with a .no extension to search. The search engines they will be searching on will be .com, .ca, .co.uk, etc., so it certainly makes sense for you to invest in .com or some other generic domain like .tour etc., if available for your industry.
-
Hi LSlversen,
thank you for your response.
But don't you think that people looking to book tours in Norway expect to do it on a website located in Norway (domain-wise)?
-
Thank you Kate for response.
-
I would definitely use the .com domain.
Honestly, in your situation a .no doesn't make that much sense. The site is about Norway, sure, but your language is English and you're targeting basically every other country than Norway. If you're using a ccTLD, it's fair for a consumer to expect the language on the website to match the cc - which is not the case here.
So a .com would absolutely be the way to go, in my opinion.
-
Hey Kate
What would you suggest for a .com.au domain that is now looking to expand into UK?
Would you suggest uk.domainname.com.au or www.domainname.com.au/uk/ ?
Is there a best practice for this?
-
Yes. Simply put, all else being equal, if you are targeting a multi-country international audience in which your offerings do not change, you should have a generic TLD rather than a ccTLD. I can't say that you will start ranking better for sure, but it'll help in the long run.