.co what is it? Do I need it? Does Google hate it?
-
Do .com domains rank better than .co? I don't know much about .co, so I'm just looking for some insight! Thanks in advance.
-
Thanks to EVERYONE who gave me their 2 cents. I first heard of ".co" when I saw a commercial for Overstock.co. It's interesting that they would start using this; I think that says something. Now I'm not saying it will replace the .com, but it's something to consider when a company like Overstock makes a .co purchase and builds a campaign around it. Thanks to everyone! I'll be sure to post any more info I read about it!
-
Correct. Getting into the public consciousness will be a much harder task than simply getting people to target the right territories in WMT.
I imagine this'll be more like .asia which is just a bit too generic to have really stuck with anyone.
.co.uk, for example, is well ingrained in the UK public's mind, but a bare .co ... I just don't see it happening. .xxx, on the other hand, might.
-
Bottom line: Google is treating it as a valid global TLD, but it will be a while (if ever) before people accept it as authoritative as a .com.
-
haha don't worry, if you get proven wrong as much as I do, you get used to it
-
Then I shall eat my words
and go back to finishing that article
-
Found it... http://www.seomoz.org/q/co-domains-any-thoughts
-
Oh, I've seen it being marketed as such, I just don't think it will ever get the traction required to be treated as a gTLD by search engines.
It's all just marketing spiel - http://www.cointernet.co/frequently-asked-questions/general-co-faqs#q5
I'll do a search but I don't think I'm wrong (though have been proven otherwise before :D).
-
-
It's on here somewhere that I read it... no idea who was commenting but perhaps findable by search.
-
Yeah, I refuse to believe .co is going to become a gTLD rather than a ccTLD. Do people still try to sell .ws as "dot website"?
I bought one as well to use as a URL shortener
-
Depends if you're targeting Colombia or not (.co is the Colombian ccTLD, which I think is a bit tricky considering all the .co.CC domains out there, though I guess you get .com.CC addresses too).
If you're not targeting Colombia specifically, then I'd definitely go with a .com address over a .co TLD.
-
.co is the top-level domain for the country of Colombia.
I would get the .com if you are trying to target the USA. Otherwise you will lose valuable type-in traffic from careless visitors who don't understand .co and have .com fixed in their minds.
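If you do end up owning both, a common safeguard against that type-in loss is to 301-redirect the .co onto the canonical .com, so stray visits and any accumulated links consolidate in one place. A minimal nginx sketch, with example.co / example.com as placeholder domains:

```nginx
# Hypothetical example: consolidate the .co variant onto the canonical .com.
# Replace example.co / example.com with your own domains.
server {
    listen 80;
    server_name example.co www.example.co;
    # Permanent (301) redirect; $request_uri preserves the path and query string.
    return 301 https://www.example.com$request_uri;
}
```

The same idea works in Apache with a `Redirect permanent` directive or at the registrar level with domain forwarding.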
-
I've bought one (just the one ;p)
I think it was on here a few weeks ago, actually, that I read some comments about it. The general consensus seemed to be that it's not going to be seen as another spammy TLD like .biz/.info and should be fine... which I was glad to read, as I only saw that discussion after I'd bought it.
Apparently it's also going to be treated like a .com in that, even though it's technically a country-code TLD, search engines will ignore the geographic factor, so anyone anywhere can have one without losing any value to geography.
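That matches how Google later described "gccTLDs": ccTLDs it treats as generic rather than country-targeted. A quick sketch of the idea, where the set below is a partial, illustrative subset rather than an authoritative or complete list:

```python
# Partial, illustrative subset of ccTLDs that Google has said it treats
# as generic (gccTLDs) -- NOT an authoritative or complete list.
GENERIC_CCTLDS = {"co", "io", "me", "tv", "fm", "ws", "cc"}

def treated_as_generic(domain: str) -> bool:
    """Return True if the domain's final label is a ccTLD that search
    engines treat as generic rather than country-targeted."""
    tld = domain.lower().rstrip(".").rsplit(".", 1)[-1]
    return tld in GENERIC_CCTLDS

print(treated_as_generic("overstock.co"))   # True
print(treated_as_generic("example.co.uk"))  # False: .uk targets the UK
```

So an overstock.co could in principle rank globally, whereas a .co.uk remains geotargeted to the UK regardless of any Webmaster Tools setting.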