Do EMDs give the boost everyone says they do?
-
Hi,
I have used a few myself, and when I was targeting UK search with an [emd].co.uk, the domain has hit page 1 with little effort every time.
I have done this maybe 4-5 times. My Moz stats show 0, yet I rank above page 1 results with Moz stats of DA 45+.
Can I now say that basically any EMD I buy will rocket through the SERPs?
-
Yes, I agree they work great, but this topic has been openly discussed many times, including directly with Matt Cutts at Pubcon. In short, the effect of EMDs has changed a lot from two years ago, and it will continue to decrease. It's not that they will get penalized (although in a way they will); rather, the "magical" effect will go away. In my opinion they will still have an advantage, though. Here's how:
Let's say your website is abc.com and your keyword is Blue Widgets. If somebody links to you, the link code will be:
<a href="http://abc.com">Blue Widgets</a>
vs. if your domain was BlueWidgets.com, the link would be:
<a href="http://bluewidgets.com">Blue Widgets</a> or <a href="http://bluewidgets.com">Click Here</a>
You see what I am talking about: that advantage will always be there. Your keyword is in every incoming link, even when it's not an anchor text link.
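To make that point concrete, here is a rough sketch (my own illustration, not a Moz tool or a description of Google's actual algorithm) of why an EMD carries the keyword in every inbound link, even when the anchor text is generic:

```python
import re

def keyword_in_link(href: str, anchor_text: str, keyword: str) -> dict:
    """Report where a target keyword surfaces in a single inbound link."""
    # Normalize the keyword and URL so "Blue Widgets" matches "bluewidgets.com".
    norm_keyword = keyword.lower().replace(" ", "")
    norm_href = re.sub(r"[^a-z0-9]", "", href.lower())
    return {
        "in_anchor": keyword.lower() in anchor_text.lower(),
        "in_url": norm_keyword in norm_href,
    }

# Generic domain: the keyword only shows up if the anchor text carries it.
print(keyword_in_link("http://abc.com", "Click Here", "Blue Widgets"))
# → {'in_anchor': False, 'in_url': False}

# EMD: the keyword is present even with a generic "Click Here" anchor.
print(keyword_in_link("http://bluewidgets.com", "Click Here", "Blue Widgets"))
# → {'in_anchor': False, 'in_url': True}
```

The takeaway: with an EMD, every naked URL mention or "Click Here" link still contains the keyword in the href itself, which a non-EMD site only gets when linkers choose keyword-rich anchor text.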
-
When I say 0/1, that's because within a month the EMD is usually already on page 1 with no link profile, so the Moz stats are accurate.
Yeah, I see the same thing happening soon as well, so you're right. It's already in my mind to build up the domain's profile while it's sitting in positions that allow me to do so: social shares etc.
-
If your Moz stats show 0/1, then it's very likely the page/domain has not been crawled by Moz yet. It usually takes 1-2 months to see accurate data for a new domain in terms of PA/DA.
EMDs are VERY powerful. In the industry I work in, there is almost always an EMD in positions 4-6 on page 1 with a link profile that's a 100th of the power of the sites it is beating, and usually with similar optimisation otherwise.
Because of this, I expect it to be one of the things Google stops taking into account sooner rather than later. So I'd use your inflated position to build a stronger natural link profile in preparation for such a time.