Beating a Keyword Domain
-
Has anyone here managed to beat a keyword/exact match domain to top spot?
I am currently second and wondering if it is worth the time and effort to knock it off the top spot. How hard is it to get these very annoyingly favoured domains off 1st?
Any help and advice much appreciated.
-
Keyword domains used to be a lot more powerful than they are right now. Based upon watching a lot of their rankings, I believe that Google turned down their value in early 2011 - shortly after Matt Cutts said that their value would likely be turned down. Maybe they will do that again, I don't know.
I agree that some of these sites rank well with very little content and a poor user experience. That generally occurs where competition is rather low in Google, or where it is low to moderate in Bing. Where competition is high, the domain name itself makes only a fractional contribution to the rankings. So when you see a keyword domain with poor onsite assets ranking well for highly competitive terms, it is doing the same type of SEO as any other site.
-
Thank you for the reply.
Forgive me if I am a little unsympathetic, but time after time I see keyword domains ranking for huge terms while the site itself offers very little in terms of quality content and user experience.
Yes, it is annoying for online search marketers who try very hard and provide natural, quality content to be outranked by a keyword domain because a particular search engine has a bias towards exact match.
No offence, but in my opinion, the sooner keyword domains are treated the same as any other domain, the sooner we will see better quality results.
Forgive me if you are one of the people who produce quality sites on your keyword domains; I respect you if you do.
-
I own several keyword domains. Some of them have top rankings for their exact match query and some of them don't. There is no special formula for beating them. Just compete against them as you would any other domain.
If you ask me, the bigger problem is Google giving easy top rankings to weak content on eHow, About.com, Wikipedia and other powerful sites.
How hard is it to get these very annoyingly favoured domains off 1st?
This really seems to bother you. But if you turn that around you would consider them to be a huge asset. So, maybe you should just go out and buy one. Find the guy who owns one and ask "what will it take for you to sell it to me?"... or hire a pro to do that for you. I've done it a few times and am happy with most of the results. They seem to produce a higher conversion rate too.
-
It's one of those domains set up for that keyword only, by the looks of it.
The keyword is pretty much their brand term.
-
It really depends on how competitive the keyword is and how strong your competitor is aside from just the exact match aspect. Of course it's possible to beat an exact match domain, and these will probably be given less and less weight by Google in the future.
Overall, I would say the fact that it's an exact match domain doesn't change much of anything for me in terms of whether it can be outranked. Looking at all the factors holistically is much more important.
-
Social media presence has been actively influencing the SERPs. I have seen the impact on both local and global search. Even if the links from those websites are nofollowed, or are not links at all but just URL mentions (citations), they help. So I would analyze their backlink profile and get better links. Also, are you outranking them for other keywords, keyword variants and long-tail keywords that are essentially beyond their exact match domain? How do your Domain Strength and Page Strength compare with theirs?
-
Thank you for the reply.
One of our big competitors has a huge YouTube presence. I have always wondered how much of an effect having an active YouTube channel can have. What do you think about that?
Thank you for your advice.
-
Just because the top site is an exact match doesn't mean you can't dislodge them from their #1 spot.
Without knowing all that you have done for your site, the best advice I can give is to look at what they are doing and then think about other ways to beat them. For the sake of this discussion, let's say they have a great YouTube channel and are very active on Twitter. First, you will have to make sure you have a presence on those two sites as well.
Then you need to take it to the next level: make sure you have Pinterest, LinkedIn, Facebook, StumbleUpon, Reddit and Delicious accounts. Make sure you have the Google +1 button on your site and a Google+ page that links back to your site.
Simply put, you will have to do more than them in order to take the top spot.
As for whether it's worth it or not, only you can determine that.
Good luck!
Related Questions
-
Old domain to new domain
Hi, a website on server A is no longer required. The owner has redirected some URLs of this website (via a plugin) to his new website on server B, but not all URLs. So when I use the site: command on website A, I see a mixture of redirected and non-redirected URLs. Therefore two websites are still being indexed in some form and causing duplication. However, weirdly, when I crawl with Screaming Frog I only see one URL, which is 301 redirected to the new website. I would have thought I'd see lots of URLs which hadn't been redirected. How come it is different to using the site: command? Anyway, how do I move to the new website completely without the old one being indexed anymore? I thought I knew this but have read so many blogs I've confused myself! Should I:
1. Redirect all URLs via the .htaccess file on the old website on server A? There are lots of pages indexed, so a lot of URLs. What if I miss some?
2. Point the old domain via DNS to server B and do the redirects in website B's .htaccess file? This seems more sensible, but does this method still retain the website rankings?
Thanks for any help.
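For illustration only, a catch-all 301 along the lines of option 1 might look something like this in the old site's .htaccess (placeholder domain names, assuming Apache with mod_rewrite enabled):
# Hypothetical sketch: send every request on the old domain to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-site-example.com/$1 [R=301,L]
A wildcard rule like this avoids having to list every URL individually, so nothing gets missed.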
Technical SEO | AL123al
-
Do Meta Keywords matter?
I am a firm believer in the fundamentals of SEO, but is there any data to support the meta keywords tag having an impact, positive or negative, on a site's rankings?
Technical SEO | Brandonp
-
Redirect typo domains
Hi, What's the "correct" way of redirecting typo domains? DNS A record goes to the same ip address as the correct domain name Then 301 redirects for each typo domain in the .htaccess Subdomains on typo urls still redirect to www or should they redirect to the subdomain on the correct url in case the subdomain exists?
Technical SEO | kuchenchef
-
Stop Words and keyword optimization
OK, so I understand Google doesn't use stop words (like "a" or "the"). Therefore, if I am optimizing for a keyword phrase and find an opportunity for, say, "how to create stuff something", but it actually reads better as "how to create stuff in something" (although it doesn't sound completely out of place as above), which is better for SEO? (Ignore usability/readability in your replies please, and assume it reads reasonably either way, as that was just an example.)
Technical SEO | TheWebMastercom
-
Umbrella company and multiple domains
I'm really sorry for asking this question yet again. I have searched through previous answers but couldn't see something exactly like this, I think. There is a website called example.com. It is a sort of umbrella company for 4 other separate domains within it, 4 separate companies. The home page of the "umbrella" company website is example.com. It is just an image with no content except navigation on it to direct to the 4 company websites. The other pages of the website example.com are the 4 separate companies' domains. So on the navigation bar there is: Home page = example.com, company1page = company1domain.com, company2page = company2domain.com, etc. Clicking "Home" will take you back to example.com (which is just an image). How bad or good is this structure for SEO? Would you recommend any changes to help them rank better? The "home" page has no authority or links, and neither do 3 out of the 4 other domains. The 4 companies' websites are independent in content (although the theme is the same). What's bringing them all together is being under this umbrella website, example.com. Thank you
Technical SEO | AL123al
-
Block Domain in robots.txt
Hi. We had some URLs that were indexed in Google from a www1 subdomain. We have now disabled the URLs (returning a 404; for other reasons we cannot do a redirect from www1 to www) and blocked them via robots.txt. But the number of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot install Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be and whether it's normal? I can send you more domain info by personal message if you want to have a look at it.
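For context, blocking an entire subdomain via robots.txt generally means serving a separate file at the root of that subdomain, since robots.txt applies per hostname; a minimal sketch (hypothetical hostname) would be:
# Hypothetical sketch: served at http://www1.example.com/robots.txt
User-agent: *
Disallow: /
One caveat: a Disallow like this stops crawling, so Googlebot may never re-fetch the blocked URLs and see the 404s that would get them dropped from the index.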
Technical SEO | zeepartner
-
Keyword not showing
Hi, we have been trying to rank the keyword "Human Resource Books" for Silvercreek.ca for a long time. But somehow, the keyword is not ranked by Google at all. Is there a reason why Google is denying our site? What did we do wrong? Can anyone help to see what's wrong with this site, www.silvercreekpress.ca? Thanks
Technical SEO | solution.advisor
-
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt with something like:

staging.domain.com
User-agent: *
Disallow: /

in fear it might get the www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9
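For reference, the meta robots tag route mentioned above is just a per-page element along these lines (a minimal sketch; "noindex, nofollow" is one common variant):
<!-- placed in the <head> of every page on staging.domain.com -->
<meta name="robots" content="noindex, nofollow">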