DA vs Relevancy - Trade Off Question
-
Hey Guys
We all know that relevancy largely trumps DA nowadays.
What I am wondering is whether there is a DA 'level' at which relevancy doesn't really matter - a level where you'd probably still want a backlink from that site regardless...
For example, sites with DA of 100 we probably want backlinks from.
So where do you draw the line? In other words, for a high-DA 'non-relevant' site, what DA is 'acceptable' enough that you start to disregard relevancy? I'm thinking something like 70 and above, but I'd like some other thoughts...
Obviously you would still be building relevant links too, developing content to do so and all that good stuff. I am just wondering what DA I should focus on for building non-relevant links ALONGSIDE relevant links
Thanks
-
I submitted a request through The Guild to find out why my original answer above was not delivered to you. Sorry about that.
As to your question about the average Moz Domain Authority: DA is made up of "MozRank", "MozTrust", and your link profile. Of those three, the one we can most readily quantify is MozRank. Moz actually says the average MozRank (on a 1-10 scale) for a site page is a 3.
Taking just that information, it's easier to see why the average Domain Authority would be on the lower end (30-40) rather than 50 or higher. It also explains why it is easier to move from 30 to 40 than it is to move from 70 to 80.
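To make the "30 to 40 is easier than 70 to 80" point concrete, here is a purely hypothetical sketch. Moz's actual DA formula is proprietary, so the function below is NOT their computation - it just shows how any logarithmically scaled 0-100 score behaves: each ten-point jump at the top demands far more raw "link equity" than the same jump at the bottom.

```python
import math

def da_like_score(link_equity: float) -> float:
    """Map a raw 'link equity' value onto a 0-100 logarithmic scale.

    Hypothetical illustration only -- Moz's real DA formula is
    proprietary. The point is that any log-scaled metric gets
    harder to move at the top end.
    """
    return min(100.0, 20.0 * math.log10(1 + link_equity))

def equity_needed(score: float) -> float:
    """Invert the sketch: raw equity required to reach a given score."""
    return 10 ** (score / 20.0) - 1

# Moving 30 -> 40 takes far less additional equity than 70 -> 80:
jump_low = equity_needed(40) - equity_needed(30)    # ~68 units
jump_high = equity_needed(80) - equity_needed(70)   # ~6,838 units
```

Under this toy model the 70-to-80 jump costs roughly a hundred times the 30-to-40 jump, which is the shape of the diminishing returns described above.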
Good luck with the above, let me know if I can provide any other help.
-
Weird, I didn't get anything over there - I just checked and it's still not on the thread or anywhere else I can see...
I posted the question on a few forums because I wanted to compare the responses - a sort of 'test' to decide which forum I should spend my time on.
Anyway, thanks for the response. I was not aware so many sites sat below 40 - I really had no reference point for what counted as an above-average DA (I would naturally have assumed above 50!).
I will check out your articles - thanks again!
-
Hi Michael,
You submitted this same exact question over at The Guild (SEO Chat). Did you receive that answer?
Here's that answer again, including links to the subscriber-only materials you have access to. Let me know if you can't access anything or if you have any clarifying questions:
First, if you haven't seen our article this month on Domain Authority I'd like you to review that first here:
https://guild.seochat.com/se-news/content/looking-to-boost-your-domain-authority-here-are-some-tips
As for your question below, honestly, there is no right answer. Speaking as a link builder myself, I'm ALWAYS going to choose a lower-DA link target that is niche-related over a higher-DA site that is not. That's a qualified linking target. Likewise, I would always choose a link on a page that is actually going to get clicked over a link on a higher-DA page that sends me NO traffic.
So there is really no quantifiable answer to your question. If you want to use DA 70 as your make-or-break number, awesome. But considering that the VAST majority of sites have a DA score of 40 or less, personally, 70 seems awfully high to use as a preconceived standard for evaluation.
Make sure to read my detailed article on link evaluation here:
When I'm evaluating a link, I honestly don't review DA until the end. And even then, if a link isn't going to get clicked or send traffic, that is a HUGE deal to me and I devalue the link immediately.
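The prioritization described above - traffic first, relevance second, DA only as a tiebreaker - can be sketched as a toy scoring function. This is an illustration of the reasoning in this thread, not any real tool's algorithm; the field names and weights are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class LinkProspect:
    domain: str
    da: int                  # 0-100 Domain Authority score
    relevant: bool           # topically related to your niche?
    est_monthly_clicks: int  # traffic the link itself might send

def evaluate(link: LinkProspect) -> float:
    """Toy scoring that mirrors the ordering argued above:
    a link that sends no traffic is devalued immediately,
    relevance outweighs raw DA, and DA only breaks ties."""
    if link.est_monthly_clicks == 0:
        return 0.0                    # no clicks -> devalued immediately
    score = float(link.est_monthly_clicks)
    if link.relevant:
        score *= 3                    # hypothetical relevance weight
    return score + link.da / 100      # DA contributes less than one point
```

With these (invented) weights, a relevant DA-30 page that sends 50 clicks a month outscores a non-relevant DA-90 page sending the same traffic, and a page sending no traffic scores zero regardless of DA - which is exactly the ordering argued in this answer.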
Hope that helps. Thanks for the question, and definitely respond (either here or at The Guild) if you'd like more information.