Will hyphens in my domain name have a negative impact on my SEO?
-
No problem, glad I could help. I am in these Q&A forums to make sure people get their questions answered, so please let me know if you have any more.
-
Alex,
I watched that webinar and it was REALLY informative. I have a full page of notes and to-dos now that I think will help us really build our brand. Thanks for the suggestion!
-Alex
-
Not a problem, let me know if there is anything else I can help you with, or if you have any other questions.
-
A few thoughts on that, Alex.
The first is: don't set up that redirect! In short, it won't help you. It can create duplicate content, confuse visitors, and force you to choose which version to build links to...
If you own the original .com, then this is where your main efforts should lie. Directories, happy customers, and industry blogs will be more inclined to link to your company's site if it feels more legitimate.
Depending on the keyword difficulty, focus on making some linkbait content, be it a free online calculator or tool, a free PDF download, or just a standard infographic, and try distributing all of this content to relevant niche sites and social media.
In answer to your questions, then:
1. Hyphens in this case will not have a negative effect on your URL. I believe Google will start clamping down on hyphenated, keyword-rich domains, as suggested by the recent Google branding SERPs update, but the timescale and effectiveness of this is unknown.
2. Don't go the extra mile and 301 the domain. Unless you're keen on building links to both sites, it's simply not worth it; focus on your main domain and optimise the landing pages for your keywords.
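On point 2, for anyone unsure what a 301 actually is: it's just a permanent-redirect response carrying a Location header that points at the new URL. A minimal sketch using only Python's standard library, with a made-up destination domain standing in for your real one:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Made-up placeholder for the domain you want to keep:
NEW_HOST = "solarmonster-example.com"

class RedirectHandler(BaseHTTPRequestHandler):
    """Answer every request on the old domain with a 301 to the new one."""

    def do_GET(self):
        self.send_response(301)  # 301 = Moved Permanently
        # Preserve the requested path on the destination domain:
        self.send_header("Location", f"https://{NEW_HOST}{self.path}")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the example quiet

def serve(port: int = 0) -> HTTPServer:
    """Start the redirecting server on a background thread."""
    server = HTTPServer(("127.0.0.1", port), RedirectHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

In real life you'd do the same thing with a line of Apache or nginx config rather than a script; the point is only what the response looks like to a crawler: status 301 plus a Location header.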
Solar Monster
-
No, hyphens will not hurt your SEO; that is, you won't lose rankings because you have hyphens. But they do look a little spammy, and people may be less likely to click on domains with hyphens.
Yes, hyphens will help if they are in the domain, mainly because when someone links to a domain like rich-keywords.com, the keywords are already in the anchor text. But I also advise against going after keyword-rich URLs; they look spammy, and webmasters will be less likely to get back to you on link requests and partnerships.
I recommend you stay with your company domain and try to optimize that. It also makes sense to have your company name in the URL.
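The anchor-text point above is mechanical: a hyphenated domain already splits into its keywords, so a bare-URL link carries them for free. A small illustrative sketch (the helper function and the second domain are made up for the example):

```python
def domain_keywords(domain: str) -> list[str]:
    """Split a domain's name label on hyphens to see which keywords a
    bare-URL link would carry as anchor text. (Illustrative helper,
    not a real Moz/Google tool.)"""
    label = domain.split("//")[-1].split("/")[0]      # drop scheme and path
    label = label.removeprefix("www.").split(".")[0]  # keep the name label
    return [part for part in label.split("-") if part]

# A hyphenated domain linked as a bare URL already contains its keywords:
domain_keywords("https://www.rich-keywords.com/")   # ['rich', 'keywords']
# A brand domain carries only the brand name:
domain_keywords("https://solarmonster.com")         # ['solarmonster']
```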
-
Hey Alex,
First off, nice name!
Anyway, hyphens have been shown not to offer any benefit in the search engines. It has really been exact-match domains (with no hyphens) that have done the best. Hyphens won't negatively impact you, but they won't help either.
What I would do is focus on building a brand. In a recent blog post by Rand here, he had a presentation going over ranking factors, and it concluded that exact-match domains don't have as much importance as they used to.
If you look at the SERPs today, you will see that brands continue to show up, as they have trust with users, which is what Google is looking for.
How to become a brand in Google's eyes was covered in a PRO webinar, which you can find here. This will ultimately be the best solution for you.
That's my opinion, let me know what you think by replying to me.
I look forward to hearing from you.
-Alex
Related Questions
-
Will Google penalize 2 sites for targeting "like" keyword phrases?
I own two different websites: an HTML site that has been live for 20 years and an ecommerce site that has been live for 7 years. We sell custom printed (branded) tents for use at trade shows and other indoor and outdoor events. While our ecommerce site targets "trade show" tents, our HTML site targets "event" tents. I believe the keyword phrases are dissimilar enough that targeting "trade show tents" on one site and "event tents" on the other should not cause Google to penalize either site, or both, for having similar content. The content is different on both sites. I'm wondering if anyone has experience with, or opinions on, my thoughts... either way. Thanks,
Terry Hepola
Algorithm Updates | terry_tradeshowstuff
-
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat, and the geek, when you truly need them? So SEL (Search Engine Land) said recently that there's no such thing as "duplicate content" penalties: http://searchengineland.com/myth-duplicate-content-penalty-259657 (by the way, I'd love to get Rand or Eric or other Mozzers aka TAGFEE'ers to weigh in here on this if possible). The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation:
1. Assume a domain has a 30 Domain Authority (per OSE).
2. The site on the current domain has about 100 pages, all hand coded. Things do very well in SEO because we designed it to do so... The site is about 6 years in the current incarnation, with a very simple e-commerce cart (again, basically hand coded). I will not name the site for obvious reasons.
3. Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming we have 500 products and 100 categories, that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many pages.
4. In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of URLs of live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site and out in the wild (in Google's supplemental index).
5. Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6. Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might incur a link-juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal page link juice, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
Algorithm Updates | seo_plus
-
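The faceted-search explosion described in that question is plain combinatorics: each independent facet multiplies the crawlable URL count. A rough upper-bound sketch, with the facet names and counts invented purely for illustration:

```python
from math import prod

def faceted_url_count(categories: int, facets: dict[str, int]) -> int:
    """Upper bound on crawlable URLs when facet values combine freely.

    Each facet contributes (values + 1) states: one per value plus
    'facet not selected'. Every combination is a distinct URL."""
    return categories * prod(n + 1 for n in facets.values())

# Invented numbers: 100 categories and three independent facets.
# 100 * (9+1) * (4+1) * (5+1) = 30,000 URLs from just three facets,
# which is why rel=canonical / noindex planning matters up front.
faceted_url_count(100, {"colour": 9, "size": 4, "price_band": 5})
```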
Is parallax scrolling, when used with the "hash bang" technique, good for SEO or not?
Hello friends, one of my client's websites, http://chakracentral.com/, uses parallax scrolling, with most of the URLs containing a hash "#" tag. Please see a few sample URLs below:
http://chakracentral.com/#panelBlock4 (service page)
http://chakracentral.com/#panelBlock3 (about-us page)
I am planning to use the "hash bang" technique on this website so that Google can read all the internal pages (containing the hash "#" tag) with the current site architecture, as the client is not comfortable changing it. Reference: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started#2-set-up-your-server-to-handle-requests-for-urls-that-contain-escaped_fragment
But the problem I am facing is that lots of industry experts do not consider parallax websites (even with the hash bang technique) good for SEO, especially for mobile devices. See some references below:
http://searchengineland.com/the-perils-of-parallax-design-for-seo-164919
https://moz.com/blog/parallax-scrolling-websites-and-seo-a-collection-of-solutions-and-examples
So please find my queries below, for which I need help:
1. Will it be good to use the "hash bang" technique on this website and perform SEO to improve the rankings on desktop as well as mobile devices?
2. Is the "hash bang" technique for a parallax scrolling website good only for desktop and not recommended for mobile devices, so that we should have a separate mobile version (without parallax scrolling) of the website for mobile SEO?
3. Is the parallax scrolling technique (even with "hash bang") not at all good for SEO on both desktop and mobile devices, and should it be avoided if we want a good, SEO-friendly website?
4. Any issue with Google Analytics tracking for the same website?
Regards,
Sarmad Javed
Algorithm Updates | chakraseo
-
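For background on the question above: under Google's AJAX crawling scheme (since retired), a crawler rewrote each "#!" URL into an `_escaped_fragment_` request that the server had to answer with a rendered snapshot. A sketch of just the URL mapping (the helper name and its handling of non-hash-bang URLs are my own assumptions):

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Rewrite a '#!' URL into the '?_escaped_fragment_=' form that a
    crawler requested under the old AJAX crawling scheme."""
    base, sep, fragment = url.partition("#!")
    if not sep:
        return url  # plain '#' fragments are never sent to the server
    joiner = "&" if "?" in base else "?"
    return f"{base}{joiner}_escaped_fragment_={quote(fragment)}"

# A page like /#!panelBlock4 would be fetched by the crawler as:
escaped_fragment_url("http://chakracentral.com/#!panelBlock4")
# -> http://chakracentral.com/?_escaped_fragment_=panelBlock4
```

This also illustrates why plain "#" URLs (as on the site today) are invisible to crawlers: the fragment never reaches the server at all.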
With regard to SEO, is it good or bad to remove all the old events from our website?
Our website sells tickets for various events across the UK, and we have a LOT of old event pages which simply say SOLD OUT. What is best practice? Should these event pages be removed, with a 301 redirect added pointing to the home page? Or should these pages remain intact, with simply SOLD OUT on the page?
Algorithm Updates | Alexogilvie
-
Will we no longer need Location + Keyword? Do we even need it at all?
Prepare yourselves. This is a long question. With the rise of schema and Google Local+, do you think Google will now have enough data about where a business is located that, when someone searches for a keyword such as "Atlanta Hyundai dealers", a business in Atlanta whose website:
1. has been properly marked up with schema (or microdata for business location),
2. has claimed its Google Local+ page, and
3. has done enough downstream work in local search listings for its NAP (name, address, phone number),
will no longer have to incorporate variations of "Atlanta Hyundai dealers" in the text on the website? Could they just write enough great content about how they're a Hyundai dealership, without the abuse of the Atlanta portion? Or if they're in Boston and they're a dentist or lawyer, could the content be just about the services they provide, without so much emphasis tied to location? I'm talking about removing the location of the business from the text in all places other than the schema markup or the contact page on the website. Maybe still keep a main location in the title tags or meta description if it would benefit the customer.
I work in an industry where location + keywords has reached such a point of saturation that it makes the text on the website read very poorly, and I'd like to learn more about alternate methods to keep the text more pure, read better, and still achieve the same success when it comes to local search. Also, I haven't seen other sites penalized for all the location stuffing on their websites, which is bizarre, because it reads so spammy you can't recognize where the geo-targeted keywords end and where the regular text begins. I've been working gradually in this general direction (more emphasis on NAP, researching schema, and vastly improving the content on clients' websites so it's not so heavy with geo-targeted keywords).
I also ask because, though the niche I work in is still pretty hell-bent on using geo-targeted keywords, whenever I check Analytics the majority of traffic is branded, and geo-targeted keywords make up only a small fraction of traffic. Any thoughts? What are other people doing in this regard?
Algorithm Updates | EEE3
-
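The schema markup that question leans on is usually emitted as JSON-LD. Here it's built as a Python dict so the structure is easy to inspect; every business detail below is a placeholder, not a real listing:

```python
import json

# Hypothetical LocalBusiness-style markup. The structured data carries
# the location, so the visible page copy doesn't have to repeat
# "Atlanta" in every sentence.
local_business = {
    "@context": "https://schema.org",
    "@type": "AutoDealer",
    "name": "Example Hyundai of Atlanta",   # placeholder
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Atlanta",
        "addressRegion": "GA",
        "addressCountry": "US",
    },
    "telephone": "+1-404-555-0100",         # placeholder
}

# This string is what you'd embed in the page inside a
# <script type="application/ld+json"> element:
markup = json.dumps(local_business, indent=2)
```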
Non-.com or .co versus .ca or .fm sites, in terms of SEO value
We are launching a new site with a non-traditional top-level domain. We were looking at either .ca or .in, as we are not able to get the traditional .com, .co, .net, etc. I was wondering if this has any SEO effect. Do Google/Bing treat this domain differently? Will it be penalized? Note: my site is a US-based site targeting a US audience.
Algorithm Updates | Chaits
-
Redirected old domain to new, how long before seeing the external links under the new domain?
Before contracting SEO services, my client decided to change his established root domain to one more customer-friendly. Since he had no expertise on board, no redirects were set up until 6 months later. I ran stats right before the old domain was redirected, and I have a report showing that it had roughly 750 external links from 300 root domains. We redirected the old domain to the new domain in mid-January 2012. Those external links are still not showing in Open Site Explorer for the new domain. I've tested it a dozen times, and the old domain definitely points to the new domain. How long should it take before the new domain picks up those external links? Should I do anything else to help the process along?
Algorithm Updates | smsinc
-
New registered domains
Always looking for easier ways to identify new clients needing SEO help, I wondered: is it possible to find newly registered domains? Could an API be made to pull out domains listed by date registered?
Algorithm Updates | onlinemediadirect
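There's no single official feed for that question, but if you can obtain a newly-registered-domains list (several registries and commercial providers publish daily ones), filtering by registration date is trivial. A sketch over an invented feed:

```python
from datetime import date

# Invented rows standing in for a newly-registered-domains feed:
# (domain, registration date)
feed = [
    ("fresh-bakery-leeds.co.uk", date(2012, 5, 14)),
    ("old-established-site.com", date(2009, 1, 3)),
    ("new-plumber-hull.co.uk", date(2012, 5, 15)),
]

def registered_on(feed, day):
    """Return the domains whose registration date matches `day`."""
    return [domain for domain, registered in feed if registered == day]

registered_on(feed, date(2012, 5, 15))   # ['new-plumber-hull.co.uk']
```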