How to make a .co.uk work in the US
-
Hi
Anyone out there got any tips on the best way to get a .co.uk working well in Google US? We have a travel site working fantastically well in the UK for some very competitive keywords. It is ranking okay for the same keywords in the US, but nothing particularly great. Any tips on bespoke activity to drive the site's rankings in the US without undermining the UK rankings?
Thanks
Pete
-
Hi Pete,
Both answers, by Greg and Malcom, are correct.
Moreover, take into account that any country-code domain, such as .co.uk, automatically geo-targets its own territory. That is the reason why .co.uk sites tend not to perform well on Google.com.
The best solution is to have separate domain names for the two territories and implement rel="alternate" hreflang annotations to suggest to Google which URL to show in the SERPs for each country, since both sites are in English.
Be aware that, even though Google sometimes suggests using that tag along with the cross-domain canonical, that combination applies only to pages on the two sites which have exactly the same content.
That said, always remember to geo-localize the content for the US and the UK so the two versions differ (language, currency, addresses, phone numbers).
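A minimal sketch of what those hreflang annotations might look like, assuming the UK site stays on its .co.uk and a matching .com serves the US (the example.co.uk / example.com URLs are placeholders):
<!-- On https://www.example.co.uk/tours/ (UK version) -->
<link rel="alternate" hreflang="en-GB" href="https://www.example.co.uk/tours/" />
<link rel="alternate" hreflang="en-US" href="https://www.example.com/tours/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/tours/" />
<!-- The same set of tags also goes on https://www.example.com/tours/ (US version), so the annotations are reciprocal -->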
-
I have never seen an example of a .co.uk ranking well in the US market. The logical approach is to use a .com TLD and use Google Webmaster Tools to set the geographic target to the US. As far as I'm aware, you can't change the target of a .co.uk, because its main purpose is to serve the UK market anyway. I have been in situations where a .com and a .co.uk have both been targeted at the UK, and they basically end up cannibalising each other until one wins and one disappears.
Your best bet is to create a separate US site (with different content!).
In my personal opinion you will never reach #1 with a .co.uk on google.com. I could be mistaken, though, and having a US IP address and US links pointing in may help.
-
Hi Pete,
I would suggest having a separate US version of the site (ideally on a .com domain matching your .co.uk) rather than trying to target both markets from the .co.uk domain. I think (as you've already experienced to an extent) it would be very difficult to target both, and doing so could have long-term negative effects on your UK rankings.
Hope that helps.
Thanks, Greg
-
Related Questions
-
Working out whether a site is serving both http and https
Hi there, I can access the following site with both http and https, making me think that there will be a duplicate content issue. How can I work out if this is the case? http://ionadebarge.com https://ionadebarge.com Thanks.
Technical SEO | | Bee1591 -
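If both versions do resolve, a common fix is a site-wide 301 to a single canonical protocol. A minimal sketch, assuming an Apache server with mod_rewrite enabled and https as the preferred version (adjust the protocol and host to suit):
# Send every plain-http request to its https equivalent with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]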
Is there software that makes it easier to reach out to websites and webmasters to have toxic links removed?
I'm currently trying to disavow toxic links pointing to my site that our previous SEO company created. Google requires that we reach out to the individual websites and try to have them removed. Does anyone know of software that makes this process automated or easier? I'm currently doing it manually, ugh! Also, is there software that can help you find toxic links? I'm currently also doing that manually, ugh! Thanks.
Technical SEO | | milehigh52800 -
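For reference, links that can't be removed through outreach can be listed in a disavow file and uploaded via Google's disavow tool; a minimal sketch of the file format (the domains and URLs below are placeholders):
# Sites contacted for link removal; no response received
domain:spammy-directory.example
http://low-quality-blog.example/page-with-link.html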
Is my robots.txt file working?
Greetings from medieval York UK 🙂 Every time you enter my name & Liz, this page is returned in Google: http://www.davidclick.com/web_page/al_liz.htm But I have the following robots.txt file, which has been in place a few weeks:
User-agent: *
Disallow: /york_wedding_photographer_advice_pre_wedding_photoshoot.htm
Disallow: /york_wedding_photographer_advice.htm
Disallow: /york_wedding_photographer_advice_copyright_free_wedding_photography.htm
Disallow: /web_page/prices.htm
Disallow: /web_page/about_me.htm
Disallow: /web_page/thumbnails4.htm
Disallow: /web_page/thumbnails.html
Disallow: /web_page/al_liz.htm
Disallow: /web_page/york_wedding_photographer_advice.htm
Allow: /
So my question is, please: "Why is this page appearing in the SERPs when it's blocked in the robots.txt file, e.g. Disallow: /web_page/al_liz.htm?" Any insights welcome 🙂
Technical SEO | | Nightwing -
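Worth noting that robots.txt only blocks crawling, not indexing, so a URL that is already indexed or has inbound links can still show in the SERPs. A minimal sketch of the usual alternative, assuming the page itself can be edited: remove that Disallow line so Googlebot can fetch the page, and add a noindex directive instead:
<!-- In the <head> of /web_page/al_liz.htm; the page stays crawlable so the directive can be seen, but is dropped from the index -->
<meta name="robots" content="noindex">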
A site I am working with has multiple duplicate content issues.
A reasonably large ecommerce site I am working with has multiple duplicate content issues. On 4 or 5 keyword domains related to the site's content, the owners simply duplicated the home page, with category links pushing visitors to the category pages of the main site. There was no canonical URL instruction, so I have set the preferred URL via Webmaster Tools, but now need to code this into the website itself. For a reasonably large ecommerce site, how would you approach that particular nest of troubles? That's even before we get to grips with the on-page duplication and wrong keywords!
Technical SEO | | SkiBum0 -
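To code the preferred URL into the site itself, the usual route is a cross-domain canonical tag on each duplicated page; a minimal sketch (both domain names are placeholders):
<!-- In the <head> of the duplicated home page on the keyword domain, pointing at the main site -->
<link rel="canonical" href="http://www.main-store.example/" />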
Can anyone recommend a good hosting company in the UK?
Hi, can anyone recommend some good hosting companies in the UK, as my hosting company is now considering charging for technical advice? I use some standard packages as well as a dedicated server, and would be grateful if you could give me some examples of prices as well as service levels, including technical support.
Technical SEO | | ClaireH-1848860 -
.us domains vs .com - What does Google Think?
Suppose I had 2 domains, carloans.us & carloans.com, with exactly the same link profiles and content (not duplicate, but you know what I mean). Would Google favour the .com domain? In my experience, yes. But I might be wrong? Same with other not-so-standard domains like .biz etc. Am I right to believe that Google can prefer the more common domain extensions?
Technical SEO | | Tom-R -
Solution for duplicate content not working
I'm getting a duplicate content error for: http://www.website.com http://www.website.com/default.htm I searched the Q&A for the solution and found: access the .htaccess file and add this line: redirect 301 /default.htm http://www.website.com I added the redirect to my .htaccess and then got the following error from Google when trying to access the http://www.website.com/default.htm page: "This webpage has a redirect loop. The webpage at http://www.webpage.com/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer." "Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects." How can I correct this? Thanks
Technical SEO | | Joeuspe
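The loop usually happens because default.htm is also the server's DirectoryIndex, so a request for / is internally served by /default.htm and then redirected straight back to /. A minimal sketch of one common workaround, assuming Apache with mod_rewrite and an .htaccess context: only redirect when the visitor's original request explicitly asked for /default.htm:
RewriteEngine On
# Match only the browser's original request line, not the internal DirectoryIndex rewrite
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /default\.htm[\s?] [NC]
RewriteRule ^default\.htm$ http://www.website.com/ [R=301,L]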
When does it make sense to use no-follow on your own domain?
Hey guys, I'm not too sure if I'm over-thinking this, but I've seen no-follow being used on SEOmoz and I'm looking to implement this myself. Most of my links point to my root domain (yes, I'm working on building links to deep pages), so would it make sense to 'limit' or 'no-follow' links on my root domain so that only the most important pages are passed link juice? Thanks
Technical SEO | | reegs0
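For reference, nofollow is applied per link rather than site-wide; a minimal sketch of what it looks like on an internal link (the URL is a placeholder):
<!-- Hint to search engines not to pass link equity through this internal link -->
<a href="/terms-and-conditions/" rel="nofollow">Terms & Conditions</a>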