Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Domain authority and keyword difficulty
-
I know there are too many variables for a definitive answer; however, do people take their domain authority into account when using the keyword difficulty tool?
I have a new domain which only has a score of seven at the moment. When using the keyword research tool, what is the maximum difficulty level people would target initially? Obviously I would seek to increase the difficulty of the keywords I target over time, but to start off it's a hard choice between keywords that can be ranked for in a reasonable period of time and keywords that get enough traffic to make the effort worthwhile.
-
I have a new domain and I am beating pages with much higher PageRank / DA, etc.
Typically, in the niche I am targeting, if I use the Keyword Difficulty tool I get an average of about 50% - I think this tool mainly uses DA / PA to work out the difficulty percentage.
After that I do a Google search and look at the ranking pages for mentions of the keywords I am going to target with my pages / posts.
Often I find there are only a few mentions, or one exact-match mention in the content, with the page title being something different / not an exact match.
I will then build a page targeted specifically at the keyword and optimise for it. I don't overdo the optimisation - if the other pages only have 2 mentions of the keyword in the content, I would normally build a post with, say, 3 - 5 mentions. I have noticed that when going over about 5 mentions, pages tend to rank poorly, or rank badly and then crawl back up the SERPs more slowly - this could be due to the domain being only 1 month old.
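As a rough sketch of that mention-counting step (Python, assuming requests and BeautifulSoup are installed; the URLs and keyword below are placeholders, not real competitors):

```python
import requests
from bs4 import BeautifulSoup

def count_mentions(url, keyword):
    """Count how often a keyword phrase appears in a page's visible text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    text = soup.get_text(separator=" ").lower()
    return text.count(keyword.lower())

# Placeholder URLs taken from a manual Google search for the target keyword
competitors = [
    "https://example.com/competing-page-1",
    "https://example.com/competing-page-2",
]
keyword = "example target phrase"

counts = [count_mentions(url, keyword) for url in competitors]
# Aim slightly above the competition without over-optimising,
# capping at roughly 5 mentions as described above.
target = min(max(counts) + 1, 5)
print(f"Competitor mention counts: {counts}; aim for about {target} mentions")
```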
I also only build quality content that is relevant to the search term, which should prevent the pages dropping out of the SERPs (I hope!).
Obviously, if your niche has highly optimised pages with a bunch of links pointing at each page, then this method is not going to work.
Hope that helps.
-
I do this a lot (on a daily basis), so first off, you're not alone.
And you're right that it does need to be weighed up.
It's very hard to give a definite answer, but in general: if I see PA/DA 40+, a lot of work could be involved; if I see PA/DA 20 or below, it should be easy - I'd expect first-page rankings in a few weeks to a month, if that.
I don't go off this alone, but it's my starting point. I will check out all of page 1 and page 2, and sometimes page 3. You might find page 1 is 40+ and then page 2 is 25+.
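A minimal sketch of that rule of thumb in Python - the 40/20 thresholds mirror the figures above, and the PA values are made-up placeholders rather than real SERP data:

```python
def difficulty_estimate(page_authorities):
    """Rough difficulty bucket for a SERP, using the PA/DA rule of thumb above."""
    avg = sum(page_authorities) / len(page_authorities)
    if avg >= 40:
        return "hard - expect a lot of work"
    if avg <= 20:
        return "easy - first-page rankings plausible within weeks"
    return "moderate - dig deeper into pages 1-3 before deciding"

# Hypothetical PA values for page 1 and page 2 of a results page
page_one = [44, 41, 39, 47, 52, 38, 45, 40, 43, 46]
page_two = [27, 25, 31, 22, 29, 24, 26, 28, 23, 30]

print("Page 1:", difficulty_estimate(page_one))
print("Page 2:", difficulty_estimate(page_two))
```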
But I do look at lots of other elements - for example, how much their content is shared socially. This gives me an idea of what kind of sharing potential is available from this audience if I produced the same sort of content and pushed PPC to it (but this is just a method I use; I haven't seen anyone else doing it).
Related Questions
-
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain. The paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can:
- Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file)
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
Technical SEO | timfrick0 -
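Not a tool recommendation, but as an illustration of what such a crawl involves, here is a minimal Python sketch (requests and BeautifulSoup assumed installed; the start URL is a placeholder, and a real run would also need rate limiting and robots.txt handling):

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

def crawl_domain(start_url, max_pages=10000):
    """Breadth-first crawl of a single domain, collecting every URL found,
    including links to .pdf and .doc files."""
    domain = urlparse(start_url).netloc
    to_visit, seen = [start_url], set()
    while to_visit and len(seen) < max_pages:
        url = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)
        # Record document links but don't try to parse them as HTML
        if url.lower().endswith((".pdf", ".doc", ".docx")):
            continue
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for link in soup.find_all("a", href=True):
            absolute = urljoin(url, link["href"]).split("#")[0]
            if urlparse(absolute).netloc == domain and absolute not in seen:
                to_visit.append(absolute)
    return sorted(seen)

# Placeholder domain; repeat for each of the five sites
for found_url in crawl_domain("https://www.example.com/"):
    print(found_url)
```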
Client's domain expired - rankings lost - repurchased domain - what next?
It's only been 10 days and I have repurchased / renewed the domain name. The WHOIS info, website and contact information are all still the same. However, we have lost all rankings and I am hoping that our top rankings come back. Does anyone have experience with such a crappy situation?
Technical SEO | waqid0 -
Localized domains and duplicate content
Hey guys, In my company we are launching a new website and there's an issue that's been bothering me for a while. I'm sure you guys can help me out. I already have a website, let's say ABC.com. I'm preparing a localized version of that website for the UK, so we'll launch ABC.co.uk. Basically the websites are going to be exactly the same, with the exception of the homepage; they have a slightly different proposition. Using GeoIP I will redirect the UK traffic to ABC.co.uk and the rest of the traffic will still visit the .com website. Might Google penalize this? The site itself will be almost the same except for the homepage. This may count as duplicate content, even though I'm geo-targeting different regions so they will never overlap. Thanks in advance for your advice
Technical SEO | fabrizzio0 -
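For illustration only, a minimal sketch of the GeoIP routing decision described above, using the geoip2 Python library against a MaxMind country database (the database path and IP are placeholders; this shows the redirect logic itself, not whether it resolves the duplicate-content concern):

```python
import geoip2.database
import geoip2.errors

# Placeholder path to a MaxMind GeoLite2 country database
reader = geoip2.database.Reader("/path/to/GeoLite2-Country.mmdb")

def choose_site(visitor_ip):
    """Send UK visitors to the .co.uk site, everyone else to the .com site."""
    try:
        country = reader.country(visitor_ip).country.iso_code
    except geoip2.errors.AddressNotFoundError:
        country = None
    return "https://ABC.co.uk/" if country == "GB" else "https://ABC.com/"

print(choose_site("203.0.113.10"))  # placeholder IP address
```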
Domains
My question is what to do with old domains we own from a past business. Is it advantageous to redirect them to the new domain/company, or is that going to cause a problem for the new company? They are not in the same industry.
Technical SEO | KeylimeSocial0 -
How much authority does a 301 pass to a different domain?
Hi, A client of mine is selling his business to a brand new company. The brand new company will be using a brand new domain (no way to avoid that unfortunately) and the current domain (which has tons of authority, links, shares, tweets, etc.) will not be used. Added to that, the new company will be taking over all the current content with just a few minor changes. (I know, I wish we could use the old domain but we can't.) Obviously, I am redirecting all pages on the current domain to the new domain via 301 redirects on a page-by-page basis. So, current.com/product-page-x.html redirects to new.com/product-page-x.html. My client and the new company both are asking me how much link juice (and other factors) are passed along to the new domain from the old domain. All I can find is "not the full value" or variants thereof. My experience with 301 redirects in the past has been within a single domain and I've seen some of those pages have decent authority and decent rankings as a result of the 301 (no other optimization work was done or links were added). Are there any studies out there that I'm missing that show how much authority/juice gets passed and/or lost via a 301 redirect? Anybody with a similar issue see any trends in page/domain authority and/or rankings? Thanks for any insights and opinions you have.
Technical SEO | Matthew_Edgar0 -
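As a side note on the mechanics, here is a small Python sketch for sanity-checking that each page-by-page redirect actually returns a 301 and points at the matching path on the new domain (the domains and paths are the placeholder names from the question; a real check would iterate over the old site's full URL list):

```python
import requests

OLD_DOMAIN = "https://current.com"  # placeholder names from the question
NEW_DOMAIN = "https://new.com"

# Paths to spot-check on the old domain
paths = ["/product-page-x.html", "/about.html", "/contact.html"]

for path in paths:
    resp = requests.get(OLD_DOMAIN + path, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and target.rstrip("/") == (NEW_DOMAIN + path).rstrip("/")
    print(f"{path}: status={resp.status_code}, location={target}, {'OK' if ok else 'CHECK'}")
```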
Block a sub-domain from being indexed
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a subdomain from getting indexed by all search engines? One item I cannot use is the meta "nofollow" tag. Thanks! - Kyle
Technical SEO | kchandler0 -
Multiple Domains, Same IP address, redirecting to preferred domain (301) - site is still indexed under wrong domains
Due to acquisitions over time and the merging of many microsites into one major site, we currently have 20+ TLDs pointing to the same IP address as our "preferred domain" for our consolidated website http://goo.gl/gH33w. They are all set up as 301 redirects on Apache, including both the www and non-www versions.

When we launched this consolidated website (April 2010), we accidentally left the settings of our site open to accept any of our domains on the same IP. This was later fixed, but unfortunately Google indexed our site under several of these URLs (ignoring the redirects), using the same content from our main website but swapping out the domain. We added some additional redirects on Apache to redirect these individual pages indexed under the wrong domain to the same page under our main domain http://goo.gl/gH33w. This seemed to help resolve the issue and moved hundreds of pages off the index.

However, in December of 2010 we made significant changes in our external DNS for our IP addresses, and since December we have seen pages indexed under these redirecting domains on the rise again. If you do a search query of site:laboratoryid.com you will see a few hundred examples of pages indexed under the wrong domain. When you click on the link, it does redirect to the same page but under the preferred domain. So the redirect is working and has been confirmed as a 301. But for some reason Google continues to crawl our site and index it under these incorrect domains.

Why is this? Is there a setting we are missing? These domain-level and page-level redirects should be decreasing the pages being indexed under the wrong domain, but it appears to be doing the reverse. All of these old domains currently point to our production IP address, where our preferred domain is also pointing. Could this be the issue? None of the pages indexed today are from the old version of these sites; they only seem to be the new content from the new site, but not under the preferred domain. Any insight would be much appreciated because we have tried many things without success to get this resolved.
Technical SEO | sboelter0