DA vs. Relevancy - Trade-Off Question
-
Hey Guys
We all know that relevancy largely trumps DA nowadays.
What I'm wondering is whether there's a DA 'level' at which relevancy doesn't really matter - a point where you'd probably still want a backlink from that site regardless...
For example, we'd probably all want a backlink from a DA 100 site, relevant or not.
So where do you draw the line? In other words, for a high-DA 'non-relevant' site, what DA is 'acceptable' enough that you'd start to disregard relevancy? I'm thinking something like 70 and above, but I'd like some other thoughts...
Obviously you would still be building relevant links too, developing content to do so and all that good stuff. I'm just wondering what DA I should focus on for building non-relevant links ALONGSIDE relevant links.
Thanks
-
I submitted a request through The Guild to find out why my original answer above was not delivered to you. Sorry about that.
As to your question about the average Moz Domain Authority: DA is made up of "MozRank," "MozTrust," and your link profile. Of those three, the one we can most readily quantify is "MozRank." Moz actually says the average MozRank (on its 1-10 scale) for a page is a "3."
Taking just that information, it's easier to see why the average Domain Authority sits at the lower end (30-40) rather than 50 or higher. It also explains why it's easier to move from 30 to 40 than from 70 to 80: DA is scored on a logarithmic scale, so each additional point takes more link equity to earn than the last.
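To make that concrete, here's a minimal sketch of how a logarithmic scale produces exactly that behavior. To be clear, this is a toy model - Moz doesn't publish the actual DA formula, so the base value and the "link equity" units below are invented assumptions purely for illustration:

```python
def link_equity_needed(da: int, base: float = 1.12) -> float:
    """Hypothetical link equity required to reach a given DA.

    Toy model only: assumes DA grows logarithmically with accumulated
    link equity (equity = base ** DA). Moz documents the logarithmic
    shape of the scale, but not the real formula or units.
    """
    return base ** da

# Cost of a 10-point jump at the low end vs. the high end of the scale
low_jump = link_equity_needed(40) - link_equity_needed(30)
high_jump = link_equity_needed(80) - link_equity_needed(70)

print(f"30 -> 40 costs ~{low_jump:,.0f} units of equity")
print(f"70 -> 80 costs ~{high_jump:,.0f} units of equity")
print(f"The 70 -> 80 jump is ~{high_jump / low_jump:.0f}x more expensive")
```

Whatever base you pick, the same shape holds: each 10-point jump costs exponentially more than the one before it.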
Good luck with the above, let me know if I can provide any other help.
-
Weird - I didn't get anything over there. Just checked, and it's still not on the thread or anywhere else I can see...
I posted it on a few forums because I wanted to see what the responses were like - a sort of 'test' of which forum I should spend my time on.
Anyway, thanks for the response. I wasn't aware so many sites sat below DA 40 - I really had no reference point for what counted as an above-average DA (I would naturally have assumed above 50!).
I will check out your articles - thanks again!
-
Hi Michael,
You submitted this exact same question over at The Guild (SEO Chat). Did you receive that answer?
Here's that answer again, including links to the subscriber-only materials you have access to. Let me know if you can't access anything or if you have any clarifying questions:
First, if you haven't seen this month's article on Domain Authority, I'd like you to review it here:
https://guild.seochat.com/se-news/content/looking-to-boost-your-domain-authority-here-are-some-tips
As for your question below, honestly, there is no right answer. Speaking as a link builder myself, I'm ALWAYS going to choose a lower-DA link target that is niche-relevant over a higher-DA site that is not - that's a qualified linking target. Just as I would always choose a link on a page that is actually going to get clicked over a link on a higher-DA page that sends me NO traffic.
So there really isn't a quantifiable answer to your question. If you want to use DA 70 as your make-or-break number, awesome. But considering that the VAST majority of sites have a DA of 40 or less, 70 personally seems awfully high to use as a preconceived standard for evaluation.
Make sure to read my detailed article on link evaluation here:
When I'm evaluating a link, I honestly don't review DA until the very end. And even then, if a link isn't going to get clicked or send traffic, that is a HUGE deal to me and I devalue the link immediately.
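As a rough sketch of that order of operations (this is just my personal checklist, not anything from Moz - the function name, thresholds, and labels below are made-up assumptions for illustration):

```python
def evaluate_link(is_relevant: bool, est_monthly_clicks: int, da: int) -> str:
    """Hypothetical link triage that checks DA last.

    Illustrative only: the thresholds are invented assumptions, not
    published Moz guidance or an actual Moz tool.
    """
    # Relevance first: an off-topic link rarely qualifies, whatever its DA.
    if not is_relevant:
        return "reject: not niche-relevant"
    # Traffic second: a link nobody ever clicks gets devalued immediately.
    if est_monthly_clicks == 0:
        return "devalue: sends no traffic"
    # Only now does DA enter the picture, and only as a tie-breaker.
    return "pursue (strong target)" if da >= 40 else "pursue (still qualified)"

# A relevant, clicked DA 25 link still beats an irrelevant DA 90 link:
print(evaluate_link(is_relevant=True, est_monthly_clicks=50, da=25))
print(evaluate_link(is_relevant=False, est_monthly_clicks=0, da=90))
```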
Hope that helps. Thanks for the question, and definitely respond (either here or at The Guild) if you'd like more information.