Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
URL with hyphen or .co?
-
Given a choice, for your #1 keyword, would you pick a .com with one or two hyphens (chicago-real-estate.com), or a .co with the full name as the URL (chicagorealestate.co)?
Is there an accepted best practice regarding hyphenated URLs, and/or any decent data on the effectiveness of the .co?
Thank you in advance!
-
Hi Joe, this is for sure an awesome question, with so many different points of view. The problem I see with .co is this one:
"Sites with country-coded top-level domains (such as .ie) are already associated with a geographic region, in this case Ireland. In this case, you won't be able to specify a geographic location."
Source: http://www.google.com/support/webmasters/bin/answer.py?answer=62399
So if I understand this correctly, and you want to target real estate clients in the Chicago area (which I love and will be there for the U2 concert on July 4th) and across the US/worldwide, a .co domain is probably not the way to go here.
There has been a lot of talk about .co (the ccTLD for Colombia), same as .ws, supposedly "WebSite" but actually Western Samoa. So I would advise doing the obvious: look at your competitors. Does anyone have a .co domain and rank in Chicago? Are any of the top 100 results anything but .com? Try different keywords just to check whether there are any .co sites ranking in the real estate market.
Hope that helps!
-
Thanks for the feedback. That's the beauty of SEO: the only way to figure out what is most effective is to try multiple approaches and measure. Then, as soon as you get it and have a conclusion, the rules change...
-
At the risk of getting a bunch of thumbs down, between the choices you specifically asked about, I am going to throw in with the .co.
I think the issue is going to be how you promote the site, where you host it, and where you get your links from.
If you host it in the USA and build a solid local link-building campaign, no one is going to have any trouble figuring out where you should be relevant, least of all the major search engines.
The other concern would be when someone tries to type in your URL directly: there will be a tendency to automatically add an "m" to the end. But will that be any more of a problem than trying to get people to put a hyphen in the right place?
In my experience, if people really find your site helpful, they'll just bookmark it.
-
Trust me when I say that I didn't think of the .co because of the Super Bowl ad. I have heard mixed results on the .co, but I really haven't seen it in search results - then again, I don't see too many hyphenated URLs either. Maybe I will just add a word to the .com?
-
They had an ad in the Super Bowl; after that, I heard from five different clients asking whether they should buy the .co.
-
This link might help as well...
-
Completely disagree with you, Korgo - the average user doesn't even know the .co TLD exists.
They have been available for a while. I spend a lot of time online through work and play and have never seen a site using one, so I'm not sure why you think they will take off if they haven't already, despite virtually every domain seller pushing them heavily last year.
-
I agree with James and would aim for one hyphen on the .com TLD. I did some unscientific user testing in this area: one hyphen was fine, but two or more was a turn-off for the user.
The same users expected a site to be .co.uk (I'm in the UK) or .com, and some were confused by the existence of different TLDs, wondering where the .co.uk or .com was and thinking the URL might not work without them.
-
I would pick hyphenated over anything but .com. I wouldn't even use .net; .org is the only other one I would consider, and only for a true non-profit organisation.
I have some hyphenated domains for ecommerce websites and have found no big problem with them personally. Of course, go with a non-hyphenated .com if you can!
-
I don't like hyphens, but I like foreign domain extensions even less (Colombia!), despite what they say about .co meaning "company" - no, no. They pulled the same stunt with .me; it's not on.
It depends how competitive the niche is and how much you want it. I have a feeling exact-match domains (EMDs) won't be as strong in the coming months for long-tail searches like this, but for now I guess an EMD will give you the edge. What I'm trying to say is: if you don't like the domain, don't go with it. Follow what feels most logical, as that is probably best for long-term SEO success. The EMD benefit is nowhere near the same (in my experience) with hyphenated or foreign domains. Don't get me wrong, they are a benefit, but a .com, .org or .net will always outrank them (for now).
So in response to your question: if I were you, I would buy them both (so competitors can't grab them later), make them both blogs, and get a nice brandable domain for your business, using the two blogs as feeders for it.
-
Thanks for your reply.
-
Thanks! I figured two hyphens wouldn't be a good idea but it's sure tempting.
-
According to the book The Art of SEO, my personal SEO bible, if you're not concerned with type-in traffic, branding or name recognition, you don't need to worry about this. However, to build a successful website long term, you need to own the .com address; if you then want to use the .co, the .com should redirect to it. According to the book, with the exception of the geeky, most people who use the web still assume that .com is all that's available, or that .com domains are the most trustworthy. So don't lose traffic by leaving the other address unredirected!
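The redirect the book describes is normally a server-level 301. As a hedged sketch only (the domain names are the examples from this thread, and the direction of the redirect is whichever domain you choose as canonical - the book suggests the opposite direction if you make the .co primary), in Apache's .htaccess it might look like:

```apache
# Hypothetical sketch: 301-redirect every request on the alternate domain
# to the same path on the canonical domain, so only one address accrues
# links, bookmarks, and type-in traffic.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?chicagorealestate\.co$ [NC]
RewriteRule ^(.*)$ http://www.chicago-real-estate.com/$1 [R=301,L]
```

Whichever domain you pick as primary, the point is that visitors and links to the secondary address are not lost.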
-
Hi Joe,
I won't go after two hyphens; usually, if the .com is not available, I go after a .net.
But in your case, I would go with a .co.
Related Questions
-
Help with facet URLs in Magento
Hi Guys, wondering if I can get some technical help here... We have our site britishbraces.co.uk, built in Magento. As per eCommerce sites, we have paginated pages throughout. These have rel=next/prev implemented, but not correctly (it is not in the <head>, it is in the <body>) - this fix is in process.
Our canonicals are currently incorrect as far as I believe, as even when content is filtered, the canonical takes you back to the first page URL. For example:
http://www.britishbraces.co.uk/braces/x-style.html?ajaxcatalog=true&brand=380&max=51.19&min=31.19
Canonical to...
http://www.britishbraces.co.uk/braces/x-style.html
Which I understand to be incorrect, as I want the coloured filtered pages to be indexed (due to search volume for colour-related queries), but I don't want the price-filtered pages to be indexed, and I am unsure how to implement the solution.
As I understand it, because rel=next/prev is implemented (with no View All page), the rel=canonical is not necessary, as Google understands page 1 is the first page in the series. Therefore, once a user has filtered by colour, there should be a canonical pointing to the coloured filter URL (e.g. /product/black). But when a user filters by price, there should be noindex on those URLs? Or can this be blocked in robots.txt prior?
My head is a little confused here, and I know we have an issue because our number of indexed pages is increasing day by day with no solution for the facet URLs. Can anybody help - apologies in advance if I have confused the matter. Thanks
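No answer to this question is preserved, but the split it describes - canonicalise and index the colour filters, keep the price filters out - can be sketched in page markup. A hedged sketch only; the URLs are hypothetical, modelled on the examples in the question:

```html
<!-- On a colour-filtered page worth indexing: a self-referencing
     canonical, so equity consolidates on the clean colour URL -->
<link rel="canonical" href="http://www.britishbraces.co.uk/braces/x-style/black" />

<!-- On a price-filtered page (?max=...&min=...): keep it out of the
     index but let crawlers continue following links through it -->
<meta name="robots" content="noindex, follow" />
```

One caveat worth noting: blocking the price URLs in robots.txt instead would stop Google from ever crawling them and seeing the noindex, so already-indexed variations could linger.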
Link juice through URL parameters
Hi guys, hope you had a fantastic bank holiday weekend. Quick question re URL parameters. I understand that links which pass through an affiliate URL parameter aren't taken into consideration when passing link juice from one site to another. However, when a link contains a tracking URL parameter (let's say gclid=), does link juice get passed through? We have a number of external links pointing to our main site; however, they are linking directly to a unique tracking parameter. I'm just curious to know about this. Thanks, Brett
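One common way sites neutralise tracking parameters like gclid= is to declare a canonical URL with those parameters stripped, so any equity from the tagged links consolidates on the clean URL. A hedged Python sketch of the normalisation step (the parameter list and example URL are hypothetical, not from the question):

```python
# Hypothetical sketch: normalise an incoming URL to its canonical form by
# stripping known tracking parameters while keeping meaningful ones, e.g.
# before emitting a <link rel="canonical"> tag.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"gclid", "fbclid", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url: str) -> str:
    """Return the URL with tracking parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("http://www.example.com/page?gclid=abc123&color=blue"))
# -> http://www.example.com/page?color=blue
```

With the canonical declared, whether the engine passes equity through the tagged link becomes largely moot: the clean URL is the one that accumulates it.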
Does rewriting a URL affect the page authority?
Hi all, I recently optimized an overview page for a car rental website. Because the page didn't rank very well, I rewrote the URL, putting the exact keyword combination in it. Then I asked Google to re-crawl the URL through Search Console. This afternoon, I checked Open Site Explorer and saw that the Page Authority had decreased to 1, while the subpages still have an authority of about 18-20. Hence my question: is rewriting a URL a bad idea for SEO? Thank you,
Lise
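What typically causes the drop described here is that the renamed URL is, to the link graph, a brand-new page with no history. The standard remedy is a 301 from the old path to the new one, so existing links and authority follow the move. A hedged Apache sketch; the paths are invented for illustration, not taken from the site in question:

```apache
# Hypothetical sketch: permanently redirect the old overview-page URL to
# the rewritten, keyword-rich URL so existing links consolidate on it.
RewriteEngine On
RewriteRule ^rental-overview$ /car-rental-amsterdam [R=301,L]

# Alternatively, without mod_rewrite:
# Redirect 301 /rental-overview /car-rental-amsterdam
```

Tool-reported authority may still take time to recover, since metrics like Page Authority are recalculated only as the tool re-crawls.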
URL Rewriting Best Practices
Hey Moz! I'm getting ready to implement URL rewrites on my website to improve site structure/URL readability. More specifically I want to:
Improve our website structure by removing redundant directories.
Replace underscores with dashes and remove file extensions for our URLs.
Please see my example below:
Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm
New structure: https://www.widgets.com/commercial-widgets/small-blue-widget
I've read several URL rewriting guides online, all of which seem to provide similar but overall different methods to do this. I'm looking for what's considered best practice to implement these rewrites. From what I understand, the most common method is to implement rewrites in our .htaccess file using mod_rewrite (which will find the old URLs and rewrite them according to the rules I implement).
One question I can't seem to find a definitive answer to: when I implement the rewrite to remove file extensions and replace underscores with dashes in our URLs, do the webpage file names need to be edited to the new format? From what I understand, the webpage file names must remain the same for the rewrites in the .htaccess to work. However, our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this?
Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using this rewrite, are my bases covered in regards to having the proper 301 redirects in place so as not to affect our rankings negatively? Please offer any advice/reliable guides to handle this properly. Thanks in advance!
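No answer survives here, so the following is only a hedged sketch of the .htaccess approach the question describes - the rule patterns and the .htm mapping are illustrative assumptions, not a tested production config:

```apache
RewriteEngine On

# 1. Externally 301 any URL containing an underscore to its dashed form.
#    Note: this replaces one underscore per redirect hop, so a URL with
#    two underscores passes through two chained 301s.
RewriteRule ^([^_]*)_(.*)$ /$1-$2 [R=301,L]

# 2. Externally 301 requests that arrive with the old .htm extension.
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.htm[\s?]
RewriteRule ^ /%1 [R=301,L]

# 3. Internally map the clean, extensionless URL back onto an .htm file
#    on disk. For the extension change alone, file names need not change.
#    For the underscore-to-dash change they do: this condition looks for
#    small-blue-widget.htm, so either rename the files to the dashed form
#    or add an internal rule mapping dashes back to underscores.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME}.htm -f
RewriteRule ^(.*)$ $1.htm [L]
```

On the questioner's direct query: the extensionless URLs can be served from unrenamed .htm files via the internal rewrite, but internal links and canonicals do need updating to the new format, and a 301 per changed URL (as in rules 1-2) is what protects rankings during the move.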
Duplicate Titles caused by multiple variations of same URL
Hi. Can you please advise how I can overcome this issue? The Moz.com crawl is indicating I have hundreds of duplicate title tag errors. However, this is caused because many URLs have been indexed multiple times in Google. For example:
www.abc.com
www.abc.com/?b=123
www.abc.com/
www.abc.com/?b=654
www.abc.com/?b=875
www.abc.com/index.html
What can I do to stop this being reported as duplicate titles, as well as duplicate content? I was thinking maybe I can use robots.txt to block various query string parameters. I'm open to ideas and examples.
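A hedged sketch of the usual fix for this pattern: rather than blocking the parameters in robots.txt, emit the same canonical tag on every variation, pointing at the one preferred URL (www.abc.com is the placeholder domain from the question):

```html
<!-- Emitted on www.abc.com, /?b=123, /index.html, and all other
     variations: each one declares the same preferred URL, so the
     duplicate titles consolidate onto a single indexed page -->
<link rel="canonical" href="http://www.abc.com/" />
```

Blocking ?b= in robots.txt would stop future crawling of those variations, but already-indexed URLs can linger; the canonical consolidates them instead.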
Capitals in URL creates duplicate content?
Hey Guys, I had a quick look around; however, I couldn't find a specific answer to this. Currently, the SEOmoz tools come back and show a heap of duplicate content on my site - and there's a fair bit of it. However, a heap of those errors relate to random capitals in the URLs. For example, "www.website.com.au/Home/information/Stuff" is being treated as duplicate content of "www.website.com.au/home/information/stuff" (note the difference in capitals). Anyone have any recommendations as to how to fix this server-side (keeping in mind it's not practical or possible to fix all of these links), or to tell Google to ignore the capitalisation? Any help is greatly appreciated. LM.
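A hedged sketch of the server-side fix on Apache: a RewriteMap (which must be declared in the server or virtual-host config, not in .htaccess) that 301s any URL containing an uppercase letter to its lowercase form, so both spellings resolve to one canonical URL:

```apache
# In httpd.conf / the <VirtualHost> block - RewriteMap is not permitted
# inside .htaccess:
RewriteMap lc int:tolower

# Then, in the vhost (or .htaccess, referencing the map above):
RewriteEngine On
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```

This assumes no legitimately case-sensitive paths exist on the site; if some do, they would need excluding before the lowercase rule.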
Brackets in a URL String
Was talking with a friend about this the other day. Do brackets and/or braces in a URL string impact SEO? (I know: short, human-readable, etc., but for the sake of conversation, has anyone realised any impact from these particular characters in a URL?)
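No answer is preserved here either, but one concrete, hedged observation: square brackets are reserved characters in URL syntax, so user agents and crawlers may percent-encode them, meaning the same page can circulate under two spellings - a practical reason to avoid them regardless of any ranking effect. A quick Python illustration:

```python
# Illustration: square brackets in a path get percent-encoded by standard
# URL-quoting rules, so /widgets[blue] and /widgets%5Bblue%5D are two
# spellings of the same resource.
from urllib.parse import quote, unquote

raw = "/widgets[blue]"
encoded = quote(raw)   # quote() leaves "/" alone but encodes [ and ]
print(encoded)         # /widgets%5Bblue%5D
print(unquote(encoded) == raw)  # True - decoding round-trips
```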
Blocking Dynamic URLs with Robots.txt
Background: my e-commerce site uses a lot of layered navigation and sorting links. While this is great for users, it ends up in a lot of URL variations of the same page being crawled by Google. For example, a standard category page:
www.mysite.com/widgets.html
...which uses a "Price" layered navigation sidebar to filter products based on price, also produces the following URLs, which link to the same page:
http://www.mysite.com/widgets.html?price=1%2C250
http://www.mysite.com/widgets.html?price=2%2C250
http://www.mysite.com/widgets.html?price=3%2C250
As there are literally thousands of these URL variations being indexed, I'd like to use robots.txt to disallow them.
Question: Is this a wise thing to do? Or does Google take layered navigation links into account by default, so I don't need to worry?
To implement this, I was going to do the following in robots.txt:
User-agent: *
Disallow: /*?
Disallow: /*=
...which would prevent any dynamic URL with a "?" or "=" from being crawled. Is there a better way to do this, or is this a good solution? Thank you!
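No answer is preserved, but a hedged refinement of the snippet in the question: a blanket /*? rule also blocks every legitimate query-string URL on the site, so scoping the rule to the specific filter parameter is usually safer. A sketch, using the price parameter from the example URLs:

```
User-agent: *
# Block only the price-filter variations, leaving other query strings alone
Disallow: /*?price=
Disallow: /*&price=
```

Note also that robots.txt only stops crawling; the thousands of URLs already indexed can linger, so a noindex meta tag or a canonical on the filtered pages is the more complete fix for the existing index bloat.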