Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
URL with hyphen or .co?
-
Given a choice, for your #1 keyword, would you pick a .com with one or two hyphens (chicago-real-estate.com) or a .co with the full name as the URL (chicagorealestate.co)?
Is there an accepted best practice regarding hyphenated URLs, and/or any decent results regarding the effectiveness of the .co?
Thank you in advance!
-
Hi Joe, this is for sure an awesome question and there are so many different points of view. The problem I see with .co is this one:
"Sites with country-coded top-level domains (such as .ie) are already associated with a geographic region, in this case Ireland. In this case, you won't be able to specify a geographic location."
Source: http://www.google.com/support/webmasters/bin/answer.py?answer=62399
So if I understand this correctly, you want to target real estate clients in the Chicago area (which I love, and where I'll be for the U2 concert on July 4th) as well as across the US and worldwide, so a .co domain is probably not the way to go here.
There has been a lot of talk about .co (the country-code TLD for Colombia), much like .ws, which is marketed as "WebSite" but actually stands for Western Samoa. So I would advise doing the obvious: look at your competitors. Does anyone with a .co domain rank in Chicago? Are any of the top 100 results anything but .com? Try different keywords just to check whether any .co sites are ranking in the real estate market.
Hope that helps!
-
Thanks for the feedback. That's the beauty of SEO: the only way to figure out what is most effective is to try multiple approaches and measure. Then, as soon as you've figured it out and reached a conclusion, the rules change...

-
At the risk of getting a bunch of thumbs down, between the choices you have specifically asked about, I am going to throw in with the .co.
I think the issue is going to be how you promote the site, where you host it and where you get your links from.
If you host it in the USA and build a solid local link building campaign, no one is going to have any trouble figuring out where you should be relevant, least of all the major search engines.
The other concern would be when someone tries to type in your URL directly; there will be a tendency to automatically add an "m" to the end. But will that be any more of a problem than trying to get people to put a hyphen in the right place?
If people really find your site helpful, they'll just bookmark it in my experience.
-
Trust me when I say that I didn't think of the .co because of the Super Bowl ad.
I have heard mixed reports on the .co but really haven't seen it in search results; then again, I don't see too many hyphenated URLs either. Maybe I will just add a word to the .com?
-
They had an ad in the Super Bowl; I've heard from 5 different clients since then asking whether they should buy the .co.
-
This link might help as well...
-
I completely disagree with you, Korgo; the average user doesn't even know that a .co TLD exists.
They have been available for a while. I spend a lot of time online through work and play and have never seen a site using one, so I'm not sure why you think they will take off if they haven't already, despite virtually every domain seller pushing them heavily last year.
-
I agree with James and would aim for one hyphen on the .com TLD. I did some unscientific user testing in this area, and one hyphen was fine while two or more were a turn-off for the user.
The same users expected a site to be .co.uk (I'm in the UK) or .com, and some were confused by the existence of different TLDs, wondering where the .co.uk or .com was and thinking the URL might not work without them.
-
I would pick hyphenated over anything but .com. I wouldn't even use .net; .org is the only other one I would consider, and only for a true non-profit organisation.
I have some hyphenated domains for ecommerce websites and have found no big problem with them personally. Of course, go with a non-hyphenated .com if you can!
-
I don't like hyphens, but I like foreign domain extensions even less (Colombia!), despite what they say about .co meaning "company". No, no. They pulled the same stunt with .me; it's not on.
It depends how competitive the niche is and how much you want it. I have a feeling exact-match domains (EMDs) won't be as strong in the coming months for long-tail searches like this, but for now I guess it will give you the edge. What I'm trying to say is: if you don't like the domain, don't go with it; follow what you feel is most logical, as that is probably best for long-term SEO success. The EMD benefit is nowhere near the same (in my experience) with hyphenated or foreign domains. Don't get me wrong, they are a benefit, but a .com, .org or .net will always outrank them (for now).
So in response to your question: if I were you, I would buy them both (so competitors can't grab them later), make them both blogs, and get a nice brandable domain for your business, using the two blogs as feeders for it.
-
Thanks for your reply.
-
Thanks! I figured two hyphens wouldn't be a good idea but it's sure tempting.
-
According to the book The Art of SEO, my personal SEO bible, if you're not concerned with type-in traffic, branding or name recognition, you don't need to worry about this. However, to build a successful website long term you need to own the .com address, and if you then want to use the .co, the .com should redirect to it. According to the book, with the exception of the geeky, most people who use the web still assume that .com is all that's available, or that .com domains are the most trustworthy. So don't lose traffic by having another address!
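To make that concrete, here's a minimal sketch of the kind of domain-level 301 redirect being described, assuming the .com is served by Apache with mod_rewrite enabled (the file and rules are my own illustration using the example domains from the question, not something from the book):

```apache
# Hypothetical .htaccess on the .com site, assuming Apache + mod_rewrite.
# Sends every request for chicago-real-estate.com to the same path on
# chicagorealestate.co with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?chicago-real-estate\.com$ [NC]
RewriteRule ^(.*)$ http://chicagorealestate.co/$1 [R=301,L]
```

The same effect could be had with a registrar-level redirect or an nginx server block; the point is simply that the domain you aren't using as the primary should answer every request with a 301 to the one you are.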
-
Hi Joe,
I won't go after two hyphens; usually if the .com is not available I go after a .net.
But in your case, I would go with a .co.