Domain authority and keyword difficulty
-
I know there are too many variables for a definitive answer, but do people take their domain authority into account when using the Keyword Difficulty tool?
I have a new domain which only has a score of seven at the moment. When using the keyword research tool, what is the maximum difficulty level people would target initially? Obviously I would seek to increase the difficulty of the keywords I target over time, but to start off it's a hard choice between keywords that can be ranked for in a reasonable period of time and keywords that get enough traffic to make the effort worthwhile.
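One way to make that trade-off concrete is a simple shortlist rule. Here is a minimal sketch in Python - the +10 margin and the volume floor are illustrative assumptions for a new low-DA domain, not Moz guidance:

    # Minimal sketch: shortlist keywords whose difficulty is within reach of a
    # new domain's authority while still carrying worthwhile search volume.
    # The margin and volume floor are illustrative assumptions.
    def shortlist(keywords, domain_authority, margin=10, min_volume=100):
        # keywords: list of (phrase, difficulty_pct, monthly_volume) tuples
        picks = [k for k in keywords
                 if k[1] <= domain_authority + margin and k[2] >= min_volume]
        # Easiest first; ties broken by higher volume.
        return sorted(picks, key=lambda k: (k[1], -k[2]))

    candidates = [
        ("blue widget reviews", 15, 320),
        ("widgets", 62, 9000),
        ("buy blue widgets online", 22, 150),
    ]
    print(shortlist(candidates, domain_authority=7))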
-
I have a new domain and I am beating pages with much higher PageRank / DA, etc.
Typically, in the niche I am targeting, the Keyword Difficulty tool gives me an average of about 50% - I think this tool mainly uses DA/PA to work out the difficulty percentage.
After that, I will do a Google search and look through the ranking pages for mentions of the keywords I am going to target with my pages/posts.
Often I find there are few mentions, or one exact-match mention in the content, with the page title being something different / not an exact match.
I will then build a page targeted specifically at the keyword and optimise for it. I don't overdo the optimisation - if the other pages only have 2 mentions of the keyword in the content, I would normally build a post with, say, 3 - 5 mentions. I have noticed that when going over 5 mentions (approx.), pages tend to rank poorly, or rank badly and then crawl back up the SERPs more slowly - this could be due to the domain on the site being 1 month old.
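A rough sketch of that mention-counting step in Python - assuming the requests and beautifulsoup4 packages, neither of which the original post names:

    # Rough sketch: count exact mentions of a keyword in a competitor's page
    # and check whether the <title> is an exact match. Assumes the requests
    # and beautifulsoup4 packages; the URL below is a hypothetical example.
    import requests
    from bs4 import BeautifulSoup

    def count_mentions(url, keyword):
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        text = soup.get_text(" ").lower()
        title = (soup.title.string or "").strip().lower() if soup.title else ""
        return {
            "mentions": text.count(keyword.lower()),
            "title_exact_match": title == keyword.lower(),
        }

    print(count_mentions("http://example.com/competitor-page", "blue widgets"))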
I also only build quality content that is relevant to the search term; this should prevent the pages from dropping out of the SERPs (I hope!).
Obviously, if your niche has highly optimised pages with a bunch of links pointing at each page, then this method is not going to work.
Hope that helps.
-
I do this a lot (on a daily basis), so first off, you're not alone.
And you're right that it does need to be weighed up.
It's very hard to give a definite answer, but in general: if I see PA/DA of 40+, a lot of work could be involved; if I see PA/DA of 20 or below, it should be easy - I'd expect first-page rankings in a few weeks to a month, if that.
I don't just go off this alone, but it's my starting point. I will check out all of page 1 and page 2, and sometimes page 3. You might find page 1 is 40+ while page 2 is 25+.
But I do look at lots of other elements, for example how much their content is shared socially. This gives me an idea of what kind of sharing potential is available from this audience if I produced the same sort of content and pushed PPC to it (but this is just a method I use - I haven't seen anyone else doing it).
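To make those thresholds concrete, here is a minimal sketch of the triage rule of thumb above in Python. It assumes you have already collected PA or DA scores for the current page-1 results (however you obtain them); the function and thresholds simply restate the answer's heuristic:

    # Minimal sketch of the PA/DA triage heuristic described above.
    # Input: authority scores for the current page-1 results.
    def triage(page_one_scores):
        avg = sum(page_one_scores) / len(page_one_scores)
        if avg >= 40:
            return "hard: a lot of work could be involved"
        if avg <= 20:
            return "easy: first-page rankings likely within weeks to a month"
        return "moderate: check pages 2-3 and other signals before deciding"

    print(triage([44, 51, 38, 47]))  # hypothetical page-1 scores
    print(triage([12, 18, 9, 16]))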
Related Questions
-
How much domain authority is passed on through a link from a page with low authority?
Hello! Let's say that there is a link to site A from site B. The domain authority of site B is 85, but the link is on a page that has a page authority of only 1. Does much authority get passed along from site B to site A? (Let's assume site A has a domain authority of 35, if that's relevant.) Thank you!
Technical SEO | nyc-seo
-
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
    RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

    RewriteCond %{THE_REQUEST} ^.*/index.html
    RewriteRule ^(.*)index.html$ http://inlinear.com/$1 [R=301,L]

1. I redirect all URL requests with www. to the non-www version...
2. All requests for "index.html" are redirected to "domain.com/".

My questions are:
A) When linking from a page to my front page (home), best practice is "http://domain.com/" and NOT "http://domain.com/index.php"?
B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I also define it as just "http://domain.com/products/", or should I in this case link to the definite file "http://domain.com/products/index.php"?
Are A) and B) best practice? And C)? Thanks for all replies! 🙂
Technical SEO | inlinear | Holger
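For what it's worth on question C), the folder-style canonical described there would be declared in the page's head like this (a sketch using the question's own placeholder URLs):

    <link rel="canonical" href="http://domain.com/products/">

-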
Localized domains and duplicate content
Hey guys, In my company we are launching a new website and there's an issue that's been bothering me for a while. I'm sure you guys can help me out. I already have a website, let's say ABC.com. I'm preparing a localized version of that website for the UK, so we'll launch ABC.co.uk. Basically the websites are going to be exactly the same, with the difference of the homepage; they have a slightly different proposition. Using GeoIP, I will redirect the UK traffic to ABC.co.uk and the rest of the traffic will still visit the .com website. Might Google penalize this? The site itself will be almost the same except for the homepage. This may count as duplicate content, even if I'm geo-targeting different regions so they never overlap. Thanks in advance for your advice.
Technical SEO | fabrizzio
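As a sketch of the GeoIP redirect described above, assuming an Apache front end with mod_geoip (the question doesn't specify the stack, so the module and directives here are an assumption):

    # Requires mod_geoip to populate GEOIP_COUNTRY_CODE.
    GeoIPEnable On
    RewriteEngine On
    # Send UK visitors requesting the .com homepage to the .co.uk site.
    RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^GB$
    RewriteRule ^$ http://ABC.co.uk/ [R=302,L]

-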
What is the best method to block a sub-domain, e.g. staging.domain.com/, from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:

    User-agent: *
    Disallow: /

for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9
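One mechanical note, offered as a sketch rather than from the question itself: robots.txt is fetched per host, so a file served only at staging.domain.com/robots.txt cannot block www.domain.com. A lower-effort alternative to per-page meta robots tags is a single X-Robots-Tag response header in the staging host's Apache config (assuming Apache with mod_headers; the host names are the question's own placeholders):

    # Inside the staging.domain.com virtual host only; www.domain.com is untouched.
    # Requires mod_headers.
    Header set X-Robots-Tag "noindex, nofollow"

-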
How much authority does a 301 pass to a different domain?
Hi, A client of mine is selling his business to a brand new company. The brand new company will be using a brand new domain (no way to avoid that, unfortunately) and the current domain (which has tons of authority, links, shares, tweets, etc.) will not be used. Added to that, the new company will be taking over all the current content with just a few minor changes. (I know, I wish we could use the old domain, but we can't.) Obviously, I am redirecting all pages on the current domain to the new domain via 301 redirects on a page-by-page basis. So, current.com/product-page-x.html redirects to new.com/product-page-x.html. My client and the new company are both asking me how much link juice (and other factors) is passed along to the new domain from the old domain. All I can find is "not the full value" or variants thereof. My experience with 301 redirects in the past has been within a single domain, and I've seen some of those pages gain decent authority and decent rankings as a result of the 301 (no other optimization work was done and no links were added). Are there any studies out there that I'm missing that show how much authority/juice gets passed and/or lost via a 301 redirect? Anybody with a similar issue see any trends in page/domain authority and/or rankings? Thanks for any insights and opinions you have.
Technical SEO | Matthew_Edgar
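As a sketch of the page-by-page redirect mapping the question describes (Apache mod_alias syntax; current.com, new.com, and the product page are the question's own example names):

    # In current.com's .htaccess or vhost config; requires mod_alias.
    # One permanent redirect per page on the old site:
    Redirect 301 /product-page-x.html http://new.com/product-page-x.html
    # ...and so on, one line for each page.

-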
Delete old site but redirect domain to a new domain and site
I just have a quick query, and I have a feeling about what the answer is, so I just wanted to see what you guys thought... Basically, I am working on a client site. This client has a few other websites that are divisions of their company; however, these divisions/websites are no longer used. They want to delete the websites but redirect the domains to their main website. They believe this will pass on SEO benefits, as these old division sites are old and have good PR and history. I'm not sure for DEFINITE which way is correct.
Technical SEO | Weerdboil
-
Outranking a competitor when their domain name is the keyword
Hi, I'd just like to ask the opinion of my fellow members here. We are currently ranking second for a very important keyword and would obviously like the top spot on the SERP. The site that is ranking first has the domain name as the keyword phrase (along with a good amount of quality links from a variety of domains). Now, I know it is possible to outrank them, since I remember reading about this in one of Rand's posts (I think it was the whole white hat / black hat one he posted recently). Basically, we have more domain authority, slightly fewer links but from double the number of root domains, and a higher page authority too! Does having the keyword as your domain make THAT much of a difference when we are (IMO) quite close in terms of great content and link profiles (and all the on-page factors)? Thanks!
Technical SEO | DanHill