Targeting a site in 3 countries
-
I have read the SEOmoz post at http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday before asking this question. We received a query from one of our clients about targeting his site at 3 different countries, namely the US, UK, and Australia. Specifically, he has asked us:
1. Should he buy ccTLDs like www.example.co.uk and write unique content for each country, or
2. Go with a subfolder approach instead?
He has also asked whether it will affect SEO if the subfolders are in CAPS.
We'd like the Moz community's advice on which approach would be best.
Thanks
-
I'm going to take the opposite perspective, because I don't think this is a one-size-fits-all situation. Building out unique ccTLDs does have ranking advantages within those countries, but it also has a couple of disadvantages:
(1) Your marketing efforts and link-building are now all split 3 ways, and your authority is split 3 ways. The gain you get from international targeting may not offset what you lose by splitting your SEO efforts. If all 3 markets are mission-critical and you have a large budget, 3 domains have advantages. If one market is much bigger than the other two, though, and you don't have a lot of time and money, I think subfolders are a better choice.
(2) You may have more complex duplicate content issues with similar English content across 3 domains. Google isn't always as good as they should be about isolating international content. Granted, though, this is a problem with subfolders, too. We can say "write unique copy", but you can only say the same thing in the same language so many ways. A few colloquial spellings and phrases aren't going to make for unique content.
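For either approach, hreflang annotations are the usual way to tell Google which English variant targets which country, which helps with exactly this duplicate-content isolation problem. Here's a minimal sketch; the domain, subfolder paths, and page are made up for illustration:

```python
# Generate hreflang <link> tags for one page that exists in three
# country-targeted subfolders. Domain and paths are hypothetical.
VARIANTS = {
    "en-us": "https://www.example.com/us/widgets/",
    "en-gb": "https://www.example.com/uk/widgets/",
    "en-au": "https://www.example.com/au/widgets/",
}

def hreflang_tags(variants, default_url):
    """Return the <link rel="alternate"> tags to put in every variant's <head>."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(variants.items())
    ]
    # x-default covers visitors who match none of the listed locales.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{default_url}" />')
    return "\n".join(tags)

print(hreflang_tags(VARIANTS, "https://www.example.com/"))
```

The key detail is that every variant's page must carry the full set of tags, including a self-reference; a one-way annotation is ignored.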
-
Yes, this will help a bit - it will also give your users a faster-responding website.
-
Agreed. You can also focus your content more. I'm not a native English speaker, but there are probably different slang terms in each country.
I'm Brazilian, and to be honest, sometimes I just can't understand what people from Portugal say...
-
The client says he has enough budget to go with the first approach.
-
Thanks Jan.
My question is: does hosting a site in the respective country have an effect, even a small one, on the rankings for that particular country?
-
At the end of the day it also depends on budget; we worked with one client and set up 20 different ccTLD domains for a global website. It is doable if you have the budget.
You need to remember that in the .com.au market, links from .com.au domains will be far more powerful than links from .co.uk, for example.
-
There are off-web considerations also, particularly if you choose to go for multiple domains.
There's the obvious one of the cost of running multiple link-building campaigns.
Then there's the question of what to do with printed materials. Are brochures, leaflets, etc. to be reprinted for each target market with the localised URL? Or is there to be one main URL with a country-choice page? Which leads on to...
Another issue is what to do if you get one country's customers on another country's site (e.g. a UK visitor on the US site). Do you redirect them? Or just hope they will notice the country links?
My experience is that there are loads of problematic ramifications. There's a lot of worry about whether to go for domains or subdomains or directories. But it's what happens afterwards that I have found really difficult.
That isn't much of an answer. But it is a warning that the best solution for SEO can be quite hard to manage.
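On the wrong-country visitor point: forcibly redirecting by IP is generally discouraged (among other things, it can trap crawlers on one version), so a common compromise is to *suggest* the local site in a banner rather than redirect. A rough sketch of that logic, using the browser's Accept-Language header; the subfolder paths are hypothetical:

```python
# Suggest (don't force) the visitor's local site based on the browser's
# Accept-Language header. Country subfolders here are made up for illustration.
COUNTRY_SITES = {"GB": "/uk/", "AU": "/au/", "US": "/us/"}

def suggest_site(accept_language, current_prefix):
    """Return the subfolder to suggest in a banner, or None if no change is needed."""
    # Take the first language tag, e.g. "en-GB,en;q=0.8" -> "en-GB".
    first = accept_language.split(",")[0].strip()
    parts = first.split("-")
    region = parts[1].upper() if len(parts) > 1 else "US"
    target = COUNTRY_SITES.get(region)
    if target is None or target == current_prefix:
        return None  # unknown region, or visitor is already on the right site
    return target

print(suggest_site("en-GB,en;q=0.8", "/us/"))
```

A banner driven by this kind of check leaves the visitor (and the crawler) free to stay where they landed.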
-
"Foreign entities can register com.au domain names with an ARBN or a registered trade mark."
-
Best option is to have top level domains e.g. .co.uk and .com.au with unique content on each and host the sites in each country.
With this approach, it's easier to get local links, it usually requires fewer links to rank each site, and you get higher click-through rates from the SERPs.
If you use directories like /uk/ and /au/, case does not directly affect rankings, but mixed case can create duplicate URLs, so my preference would be to use lowercase URLs throughout.
Note: AU has restrictions on domain registrations (have to be an AU business) but you can get someone to register on your behalf.
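On the lowercase point: web servers treat the path portion of a URL case-sensitively, so /UK/Page and /uk/page can get indexed as separate, duplicate URLs. A simple safeguard is a 301 from any mixed-case path to its lowercase form. A framework-agnostic sketch of that rule:

```python
# Normalize mixed-case paths with a single 301 so /UK/Page and /uk/page
# don't become duplicate URLs. Illustrative sketch, not tied to any server.
def lowercase_redirect(path):
    """Return (status, location): a 301 to the lowercase path, or 200 if already clean."""
    lower = path.lower()
    if lower != path:
        return (301, lower)   # permanent redirect to the canonical lowercase URL
    return (200, path)        # path is already lowercase; serve it as-is

print(lowercase_redirect("/UK/Widgets/"))
```

In practice you'd express the same rule in your server or framework's rewrite config; the point is one permanent redirect per variant, not per-page canonical juggling.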
-
That's the best way to do it, in my experience.
Then you have specific content for each market.
It will be more powerful than using a subdomain method, especially if you come up against all-.com.au websites in the local SERPs.