SEOmoz advice on only buying a domain if the .com version is available
-
RE: "In order to maximize the direct traffic to a domain, it is advised that webmasters should only buy a domain if the .com version is available. "
http://www.seomoz.org/learn-seo/domain
- I am working for a client who's had a domain live for 5 years or so without a .com version (just .co.uk) - the domain is also hyphenated, which doesn't seem like a great idea.
So, I'm wondering what research has been done into the problems caused by the lack of a .com domain and by using a hyphenated domain. I'm trying to figure out whether it would be worth advising the client to switch to a new domain.
Your thoughts would be welcome
-
Thanks guys, much appreciated and very useful. I also just came across Rand's whiteboard on domains, which was helpful - see point 3: http://www.seomoz.org/blog/how-to-choose-the-right-domain-name
and this on hyphenated domains: http://www.highposition.com/blog/hyphenated-domains-google/ - but it's hard to know for sure. I might set up some tests of my own.
-
It really depends on many things! If your client's target market is within the UK, I would recommend sticking with the .co.uk domain, as you will get better visibility in Google UK, and visitors who come directly to the website will tend to trust it more.
In my opinion a single hyphen is fine if it fits the brand name. But if a domain with no hyphen is available, and you can afford a temporary dip in traffic, then you can move to the new domain and 301 redirect the old domain to the new one. If you are not ready for the traffic and ranking dip, it won't be a good idea!
Just my 2 cents!
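If you do go the 301 route, the key detail is that every old URL should redirect one-to-one to its equivalent on the new domain (preserving path and query string), rather than sending everything to the new homepage. Here is a minimal sketch of that mapping logic in Python; the two domain names are hypothetical placeholders, and a real deployment would express the same rule in your server config rather than application code:

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "my-hyphenated-site.co.uk"  # hypothetical old hyphenated domain
NEW_HOST = "mynewsite.com"             # hypothetical new domain

def redirect_target(url: str) -> str:
    """Return the URL a 301 redirect should point to.

    Keeps the original path and query string so each old page maps
    to its exact equivalent on the new domain.
    """
    parts = urlsplit(url)
    # Swap the host, keep path and query; drop any fragment
    # (fragments are never sent to the server anyway).
    return urlunsplit(("https", NEW_HOST, parts.path, parts.query, ""))
```

For example, `redirect_target("http://my-hyphenated-site.co.uk/products?page=2")` yields `https://mynewsite.com/products?page=2`, which is the target a per-URL 301 rule should produce.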
-
I agree with the guys above - it's less to do with SEO (if any) and more about human error.
I used to help with a UK gaming website that had a lot of American visitors, and I noticed over the years that people would link to "sitename".com instead of .co.uk. The .com was held by a domain shark, so we lost those backlinks.
I think this is because an American audience is used to everything being .com.
Note: ultimately we bought the .com off the domain shark. When I contacted him he originally wanted thousands of dollars for the domain; I said $300 was the most I would pay for it and said goodbye. Two months later he came back to me and sold it for $300. So if a domain shark holds the .com, play the long game with them.
-
I don't think it is much of an SEO problem as long as you are targeting business in the UK.
We have lots of high-ranking .co.uk sites that are unaffected by the .com alternative. The American suppliers of our products own the .com addresses, so we are not in direct competition.
The only time it could be a problem is if you are directly competing with the .com version and they sell the same products and target the same keywords as you.
Your potential customers may end up buying from the wrong company.
So in my opinion this is a branding issue rather than a Search Engine ranking issue.
-
It really depends which markets your client is trying to target. If their target market is UK only, then the .co.uk is perfectly fine. If the .com is available, it would do no harm to purchase it to stop a competitor getting hold of it and outranking you for the domain/brand name. You could simply redirect the .com to your .co.uk site.
Alternatively, if the target is wider than the UK, then it becomes increasingly difficult (though not impossible) to rank with a .co.uk in other countries. Hope this helps.