.co vs .com
-
Hello Mozzers.
Question: does it make a big difference whether you have a .co vs a .com?
I am trying to get a URL with the actual keywords in it, for example blackboots.com/.
I see that the .com is taken but the .co is available. Is it a good idea to buy it?
Also, what about hyphens in URLs: do they hurt or help if you actually have the keywords in the URL?
Thanks much, you rock,
V
-
Also, in regards to your hyphen question: for the same reasons, you should try to get the actual non-hyphenated name. For example, my company owns rubberstore.com, but there is also a rubber-store.com, which was out before us and sells radically different products. Our first traffic started arriving quickly from people searching for their product types.
Now if you type rubber-store in Google you'll find our website first and not theirs.
Hope this helps
-
So in essence I should not buy the .co domain, and should instead try some other closely related keyword and get a .com, correct?
Thank you so much by the way.
I appreciate it.
-
If you register blackboots.co and make it a huge success and I own blackboots.com... then I am going to enjoy a nice amount of your traffic.
If I felt that I could make a significant site for "black boots"... I would be willing to pay a good price for blackboots.com. If I could not afford to buy it I would create a site on another .com domain.
Related Questions
-
CNAME record for WWW to non-WWW vs. 301?
I was just chatting with the person who set up our site on our domain hosting, and they said they added a CNAME record to point the www version of my site to the non-www version. Shouldn't this be set up as a 301 redirect? I have hundreds of links built to the www version and only a few to the non-www version. Or could I just add a 301 in addition to the CNAME record? This is not my wheelhouse and I need a little advice. Thanks in advance.
Intermediate & Advanced SEO | photoseo10 -
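For context on the question above: a CNAME only aliases the hostname at the DNS level, so by itself it tends to serve the same site at both www and non-www rather than redirecting one to the other; any 301 has to come from the web server. A quick, purely illustrative way to see what the www hostname currently returns (example.com below is a placeholder for your own domain):

```python
# Check what the www hostname returns, without following redirects.
# A DNS-level CNAME alone typically yields a 200 on both hostnames;
# a proper 301 shows up as the status code plus a Location header.
import requests

resp = requests.get("http://www.example.com/", allow_redirects=False, timeout=10)
print(resp.status_code)               # expect 301 if a permanent redirect is in place
print(resp.headers.get("Location"))   # the non-www target URL, if any
```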
P.O. Box vs. Actual Address
We have a website (http://www.delivertech.ca) that uses a P.O. Box number rather than an actual address as its "location". Does this affect SEO? Is it better to use an actual address? Thanks.
Intermediate & Advanced SEO | Web3Marketing870 -
Domain.com/jobs?location=10 is indexed, so is domain.com/jobs/sheffield
What's the best way you'd tackle that problem? I'm inheriting a website, and the old devs had multiple internal links pointing to domain.com/jobs?location=10 (plus a ton of other numbers assigned to locations), so those URLs have been indexed. I usually use WMT's parameter tool, but I'm not sure what the best approach would be other than that. Any help would be appreciated!
Intermediate & Advanced SEO | jasondexter0 -
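One common way to consolidate indexed parameter URLs onto their clean equivalents is a permanent redirect from each ?location=ID URL to the matching location page, alongside updating the internal links. A minimal sketch of that idea; the Flask routes and the ID-to-slug mapping below are hypothetical, not taken from the site in question:

```python
# Illustrative only: 301-redirect legacy /jobs?location=<id> URLs to clean
# /jobs/<slug> URLs so indexed parameter URLs consolidate onto one version.
from flask import Flask, request, redirect

app = Flask(__name__)

# Hypothetical mapping of legacy location IDs to location slugs
LOCATION_SLUGS = {"10": "sheffield", "11": "leeds", "12": "manchester"}

@app.route("/jobs")
def jobs():
    slug = LOCATION_SLUGS.get(request.args.get("location", ""))
    if slug:
        return redirect(f"/jobs/{slug}", code=301)  # permanent redirect
    return "All jobs"

@app.route("/jobs/<slug>")
def jobs_in_location(slug):
    return f"Jobs in {slug}"
```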
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys, we have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:

1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings to find the vehicle they want.
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo

The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.

We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.

We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.

Robots.txt Advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.

Robots.txt Disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?)

Noindex Advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)

Noindex Disadvantages:
- Difficult to implement (vehicle details pages are served up via Ajax, so they have no <head> tag). The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex based on querystring variables, similar to this Stack Overflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindex pages. I say "force" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it is blocked by robots.txt.

Hash (#) URL Advantages:
- By using hash (#) URLs for links on Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff

Hash (#) URL Disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since they can't crawl/follow them?

Initially, we implemented robots.txt (the "sledgehammer solution"). We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.

If we implement noindex on these pages (and doing so is a difficult task itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.

My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these ().

Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive 0 -
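For readers unfamiliar with the X-Robots-Tag option mentioned above: because the vehicle detail responses are Ajax fragments with no place for a meta robots tag, the noindex has to travel as an HTTP response header. The poster describes doing this with Apache rewrites; purely as an illustration of the same idea in application code, here is a minimal Flask sketch with hypothetical route and parameter names:

```python
# Illustrative only: attach X-Robots-Tag: noindex to Ajax-served vehicle
# detail responses so they can be crawled but are kept out of the index.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/vehicle-listings")
def vehicle_listings():
    vehicle_id = request.args.get("vehicle_id")
    if vehicle_id:
        # Vehicle details fragment: crawlable, but flagged noindex via header
        resp = make_response(f"Details for vehicle {vehicle_id}")
        resp.headers["X-Robots-Tag"] = "noindex"
        return resp
    # Vehicle listings page: indexable as normal
    return "Vehicle listings page"
```

As the question itself notes, this only works if the detail URLs are not also disallowed in robots.txt, since a blocked URL never has its response headers read.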
Short URLs vs Medium URLs?
Hello Moooooooooooz! I got into an SEO fight today and thought the best thing would be to involve more people in the fight! 😛 Do you think it's better to get A) company.com/services/service1.html or B) company.com/service1.html? I was for A, as "services" is also googled to find service1. I also think it's better to help Google understand where the service is on the website. My friend was for B, as the URL has to stay as short as possible. What do you think? P.S. I can create the URL I want using Joomla and sh404. The website has 4 different categories: /about, /services, /products, /projects. Thanks! 🙂
Intermediate & Advanced SEO | AymanH0
For a mobile website, is it better to use a 301 vs. a 302 redirect?
We are vetting a vendor for our mobile website, and they are recommending a 302 redirect with rel=canonical rather than a 301 redirect, due to 301 caching issues. All the research I've done shows that a 301 is by far the better way to go due to proper indexing, which in turn will enhance our page authority. Thoughts on why a 302 would be a better fit than a 301 on our mobile site?
Intermediate & Advanced SEO | seohdsupply1
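For context on the setup being debated: with separate mobile URLs, the user-agent-based redirect is commonly paired with rel=alternate on the desktop page and rel=canonical on the mobile page, and the disagreement above concerns only the redirect's status code. A minimal, purely illustrative sketch; the m.example.com host, route, and user-agent hints are placeholders:

```python
# Illustrative only: user-agent-based redirect from a desktop URL to its
# mobile equivalent. The 301-vs-302 choice is exactly what is being debated,
# so the code leaves the status code explicit rather than prescribing one.
from flask import Flask, request, redirect

app = Flask(__name__)

REDIRECT_CODE = 302  # or 301; the point under discussion in the question
MOBILE_HINTS = ("iphone", "android", "mobile")

@app.route("/products")
def products():
    ua = request.headers.get("User-Agent", "").lower()
    if any(hint in ua for hint in MOBILE_HINTS):
        resp = redirect("https://m.example.com/products", code=REDIRECT_CODE)
        resp.headers["Vary"] = "User-Agent"  # keep caches from mixing device variants
        return resp
    # The desktop HTML would carry a rel="alternate" link to the mobile URL,
    # and the mobile page a rel="canonical" back to this desktop URL.
    return "Desktop products page"
```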
Canonical tag vs 301
What is the reason that a 301 is preferred over a rel=canonical tag when it comes to implementing a redirect? PageRank will be lost in both cases, so why prefer one over the other?
Intermediate & Advanced SEO | seoug_20050
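A sketch of the mechanical difference the question is getting at, purely for illustration (the URLs are placeholders): a 301 takes the old URL out of circulation for both users and crawlers, whereas rel=canonical leaves the duplicate page live and only hints at the preferred URL.

```python
# Illustrative only: a 301 sends everyone to the target URL, while a
# rel=canonical page keeps serving content and merely declares its preferred URL.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page")
def old_page():
    # 301: the old URL stops serving content entirely
    return redirect("/new-page", code=301)

@app.route("/duplicate-page")
def duplicate_page():
    # rel=canonical: the duplicate stays reachable but points at the preferred URL
    return (
        "<html><head>"
        '<link rel="canonical" href="https://www.example.com/new-page">'
        "</head><body>This duplicate remains accessible to visitors.</body></html>"
    )

@app.route("/new-page")
def new_page():
    return "Preferred page"
```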
Image Links vs. Text Links: Questions About PR & Anchor Text Value
I am searching for testing results to find out the value of text links versus image links with alt text. Do any of you have testing results that can answer or discuss these questions?

If 2 separate pages on the same domain were to have the same Page Authority, the same amount of internal and external links, and virtually carry the same strength, and the location of the image or text link is in the same spot on both pages (in the middle of the body within paragraphs):

- Would an image link with alt text pass the same amount of Page Authority and PR as a text link?
- Would an image link with alt text pass the same amount of textual value as a text link? For example, if the alt text on the image on one page said "nike shoes" and the text link on the other page said "nike shoes", would both pass the same value to drive up the rankings of the page for "nike shoes"?
- Would a link wrapped around an image and text phrase be better than creating 2 links, one around the image and one around the text, pointing to the same page?

The following questions have to do with when you have an image and text link on a page right next to each other, like when you link a compelling graphic image to a category page and then list a text link underneath it to pass text link value to the linked-to page:

- If the image link displays before the text link pointing to a page, would first link priority use the alt text and not even apply the anchor text phrase to the linked page?
- Would it be best to link the image and text phrase together pointing to the product page to decrease the link count on the page, thus allowing more page rank and page authority to pass to other pages that are being linked to on the page? And would this also pass anchor text value to the linked-to page, since the link would include an image and text?

I know that the questions sound a bit repetitive, so please let me know if you need any further clarification. I'd like to solve these to further look into ways to improve some user experience aspects while optimizing the link strength on each page at the same time. Thanks!
Intermediate & Advanced SEO | abernhardt
Andrew0
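To make the three link structures in the question concrete, here is a small, purely illustrative sketch (URLs and filenames are hypothetical, and it uses the third-party beautifulsoup4 package) showing an image-only link, a text link, and a combined image-plus-text link, along with the text signal each exposes to a parser:

```python
# Illustrative only: the three link structures discussed above, and what a
# parser can extract from each (alt attribute vs. anchor text).
from bs4 import BeautifulSoup

html = """
<a href="/nike-shoes"><img src="shoes.jpg" alt="nike shoes"></a>
<a href="/nike-shoes">nike shoes</a>
<a href="/nike-shoes"><img src="shoes.jpg" alt="nike shoes"> nike shoes</a>
"""

soup = BeautifulSoup(html, "html.parser")
for link in soup.find_all("a"):
    img = link.find("img")
    alt_text = img["alt"] if img else None
    anchor_text = link.get_text(strip=True) or None
    print(link["href"], "| alt:", alt_text, "| anchor text:", anchor_text)
```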