Are you still seeing success with EMDs?
-
I am curious whether any other SEOs are still seeing success with exact match domains.
I am not seeing ANY changes to any of my clients' rankings since the "Exact Match Domain" filter came about in September.
Also, in the SERP audits I have conducted in my neck of the woods, I am noticing EMDs are still doing very well.
What are you seeing?
-
One of my EMDs is outranking my main site for a low-competition keyword without much of a link profile at all. It's a very simple site, only one page, but it has very unique content. It was more of a site to just play with, but for not much work it ranks first for the phrase, ahead of about 10 other sites that appear to be optimizing for the keyword.
-
Same - EMDs that have real content and a real site are doing fine. The EMD "penalty," if you want to call it that, mostly affected people who were just putting up simple microsites or filler/fluff sites on EMDs.
The biggest thing I've seen affect EMDs is domain age. Domains over a year or two old seem unaffected. New EMDs with under a year left on their registration have been slammed hard. EMDs with thin content are definitely under the gun. But anything "real" seems unaffected, or takes only minor hits here and there.
-
From what I have seen, Google turned down the EMD benefit in February 2011.
Spammy EMDs were tweaked again a couple of months ago (let's say those on the border of a Panda problem or a Penguin problem).
But if you have an EMD with good content, the domain still gives you some advantage.
When I say "EMD" I am referring to domains like DigitalCameras.com, not domains like SamsDigitalCameras.com, for the "digital cameras" query.
Related Questions
-
Is it possible (or advisable) to try to rank for a keyword that is 'split' across subfolders in your URL?
For example, say your keyword was 'funny hats' - ideally you'd make your URL 'website.com/funny-hats/'. But what if 'hats' is already a larger category on your site that you want to rank for as its own keyword? Could you then try to rank for 'funny hats' using the URL 'website.com/hats/funny/'? Basically what I'm asking is: would it harm your chances of ranking for your primary keyword if it's split across the URL like this, and not necessarily in the correct order?
Algorithm Updates | rwat0
-
It's the 21st of April, and my non-responsive page is still ranking the same?
Hi, as you know, the new algorithm update is due today. Can anybody confirm why my site doesn't appear to be affected yet? Cheers
Algorithm Updates | CFCU0
-
Crosslinking & Managing Multiple Domains in the Same Webmaster Tools Account
I am wondering if there are any consequences if you manage multiple websites in the same Webmaster Tools account and cross-link between them? My guess is that this would be a very easy thing for Google to detect and build into its algorithms, and hence discount the link juice from those domains that are owned by the same person. I am looking for verification on this. Thanks, Joe
Algorithm Updates | csamsojo0
-
EMD and partial match
Hi guys, let's say we are looking to become the number one authority in widgets. We therefore want to acquire the domain widgets.com for rebranding and SEO purposes. I know the upside of this in terms of SEO is getting smaller, but it's still there. So we get a small boost when people search for "widgets", but would a domain like this also contribute to ranking for terms like "blue widgets" and "brandname widgets"? If so, would something like widgets.com/blue-widget.html be optimal? Thanks! 🙂
Algorithm Updates | HDPHNS0
-
Google's not indexing my blog posts anymore! Why?
Google just recently stopped indexing my blog posts immediately after they are published; why could this be? I would usually publish a blog post and it would be in Google results within 45 seconds; now they don't show up until 6 hours later, if at all (a few never even showed up). Also, my home page doesn't even refresh when I make a change to the site. My site is CantStopHipHop [dot] com. I have All in One SEO, an XML sitemap generator, and Webmaster Tools, and nothing seemed irregular in the settings. I appreciate any thoughts/help/suggestions.
Algorithm Updates | bb2550
-
How will SEO be impacted by Google's new Knowledge Graph?
With the recent announcement of Google's new Knowledge Graph, the SERP will be different. Will this present a new set of SEO best practices?
Algorithm Updates | PerriCline0
-
Why do some results in the SERPs have a www. and some don't?
Hello all, if this is posted twice, I didn't mean for it to be - it looks like the last time I tried to post this question, it didn't go through. This is my question: how come some results on Google's SERP are shown with a "www" and some are not? Does this affect SEO at all? I am including a screenshot so you can see what I mean. The Geary Interactive result has a "www" in front of it, while the ingenexdigital one doesn't. R6GLL.png
Algorithm Updates | digitalops0
-
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention that I did post this on one other forum, so I hope that is not completely against the rules here. Just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question... "Googlebot found an extremely high number of URLs on your site:" Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations, I would love to hear them. First off, the site is very large and utilizes faceted navigation to help visitors sift through results. I have implemented rel=canonical for many months now, so that each page URL created by the faceted nav filters points back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern, obviously, is wasting crawler time on all these pages, when I am doing what they ask and telling them to ignore the variants and find the content on the canonical page. So at this point I am thinking about handling these with the robots.txt file, but wanted to see what others around here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
Algorithm Updates | PeteGregory0
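For what it's worth, the rel=canonical setup described in that last question can be sketched in a few lines. This is a minimal illustration, not the poster's actual implementation: it assumes the facet filters arrive as query parameters on a category URL (e.g. example.com/widgets/?color=blue), and it simply strips them to produce the canonical target.

```python
from urllib.parse import urlparse, urlunparse

def canonical_url(url: str) -> str:
    """Strip facet parameters so every filtered variant of a category
    page maps back to the bare category URL."""
    parts = urlparse(url)
    # Drop the query string and fragment entirely: the plain category
    # path is the version we want search engines to index.
    return urlunparse((parts.scheme, parts.netloc, parts.path, "", "", ""))

def canonical_tag(url: str) -> str:
    """Render the <link> element that would go in the page's <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}" />'

print(canonical_tag("https://example.com/widgets/?color=blue&size=large&page=3"))
# → <link rel="canonical" href="https://example.com/widgets/" />
```

One trade-off worth noting before reaching for robots.txt: blocking the faceted URLs there stops Googlebot from crawling them at all, which also prevents it from ever seeing the rel=canonical hint on those pages.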