Ranking for competitive keywords
-
Hi Folks,
I am relatively new to SEO, and I was hoping folks here could give me some guidance and tips on ranking in a competitive keyword space. My client is a health care provider, and they wish to rank for terms like 'heart attack', which I believe will be quite difficult since it is a short-tail keyword in a very competitive space. Any and all advice and input is greatly appreciated.
Regards,
Dave
-
Ciaran,
Starting from scratch to rank for a term like "heart attack" would be quite an endeavor for a seasoned SEO with a client who had a fairly substantial budget, let alone for someone new to the discipline. Not that it can't be done, but knowing enough to tell the client what it will take, and what the more productive options for their practice might be, is part of being an SEO. It sounds like you're working on coming to grips with that knowledge right now.
Starting out, I'd recommend a couple of things: run an advanced Open Site Explorer report on your client's site, then run a comparative OSE report against some of the other sites that rank on page one for that term. Then run another report comparing your client with sites further down in the results to get a sense of where the client is starting from and what they would need to do to get to the top. Installing the MozBar can be helpful too.
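To make the comparison concrete, here is a minimal sketch of the gap analysis those reports give you. The domain names and metric values below are invented placeholders for illustration, not real OSE data:

```python
# Link metrics for the client vs. two hypothetical page-one competitors.
# All numbers here are made up for illustration.
sites = {
    "client.example.com":   {"domain_authority": 28, "linking_root_domains": 150},
    "competitor-a.example": {"domain_authority": 72, "linking_root_domains": 4800},
    "competitor-b.example": {"domain_authority": 65, "linking_root_domains": 3100},
}

client = sites["client.example.com"]
for name, metrics in sites.items():
    if name == "client.example.com":
        continue  # skip the client itself
    # How far behind the client is on each metric.
    da_gap = metrics["domain_authority"] - client["domain_authority"]
    lrd_gap = metrics["linking_root_domains"] - client["linking_root_domains"]
    print(f"{name}: DA gap {da_gap:+d}, linking root domains gap {lrd_gap:+d}")
```

Seeing the gaps as plain numbers makes it much easier to explain to the client why the short-tail term is a long-term project rather than a quick win.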
Moz has another tool that you might find handy: the Keyword Difficulty tool. Be sure you've also read through their How to Do Keyword Research guide.
-
So your goal is to rank for highly competitive short-tail keywords, in a stable position, over a long period, right?
There are 3 possibilities (which you won't like):
1. your client has a fortune to spend on SEO and SEA
2. you can use black hat techniques
3. your client has a lot of patience
The last one is the most realistic and MAYBE (there is no guarantee!) the most effective way. You have to do a lot of on-page and off-page work to establish the authority of a new domain for a handful of keywords. You have to prove that your site has the relevance to be an authority for, e.g., "heart attack" - and this process takes time.
It took us several years to establish our website as the authority for the keyword "guitar" (the German word for it) in Google Germany - the result is a stable no. 1 position that has held for years as well... but the list of things you need to do to (maybe) succeed at that is long...
I would always recommend optimizing for a long-tail keyword that is not so competitive. Successfully establishing authority for such keywords will help you with your next step: becoming an authority for highly competitive keywords. Make clear to your client that while shortcuts could bring them to the top very fast, they would fall back down much faster as well!
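The long-tail prioritization described above can be sketched roughly like this. The keywords, search volumes, and difficulty scores are all invented, and the "+ 20" headroom is an arbitrary illustrative threshold, not an industry rule:

```python
site_authority = 28  # hypothetical Domain Authority of the new site

# Candidate keywords with invented volume and difficulty scores.
keywords = [
    {"term": "heart attack",                           "volume": 450000, "difficulty": 92},
    {"term": "heart attack warning signs",             "volume": 22000,  "difficulty": 55},
    {"term": "heart attack recovery timeline",         "volume": 4400,   "difficulty": 45},
    {"term": "heart attack symptoms in women over 50", "volume": 1800,   "difficulty": 28},
]

# Keep only the terms the site can realistically compete for,
# then work the highest-volume survivors first.
achievable = [kw for kw in keywords if kw["difficulty"] <= site_authority + 20]
achievable.sort(key=lambda kw: kw["volume"], reverse=True)

for kw in achievable:
    print(f'{kw["term"]}: volume {kw["volume"]}, difficulty {kw["difficulty"]}')
```

The short-tail term drops out of the list entirely, which is exactly the conversation to have with the client: win the achievable terms first, and the authority you build makes the competitive term reachable later.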
It's just like building up a good reputation or establishing a good credit history - but I believe you already know that!