What are the lowest acceptable metrics for a link?
-
I understand there is a subjective, human factor when deciding to link to/from a site.
Nonetheless, what are the lowest Moz or Majestic metrics that are acceptable when building links? At what point do you say a site doesn't have the profile I would want?
I am looking to clean up the backlink profile of a site. I would also like to set criteria for building links in the future.
I appreciate your thoughts on metrics when it comes to link building.
-
As others have said, DA is just a number; in Google's eyes today it's all about the relevancy of the page rather than the page's authority.
Also, a site might have a high DA, but if the link juice doesn't pass down to the page where the link is, the DA is irrelevant.
How about looking at links in a different way? Imagine Google doesn't exist and the only way to get relevant traffic to your site is from other websites talking about you and sending referral traffic through a link. You would then only place links where you know you would get decent traffic and sales. Think like this and you will get the relevant links and rank nicely. DA and the numbers are irrelevant then; it's all about relevancy.
-
While I understand you are looking for the lowest metrics that are acceptable, I think that is the wrong way to go about it. The focus should not be on the metrics but on the users and the relationship. In fact, if you go after only authoritative sites, your link footprint will appear very unnatural. Basically, it is unnatural to have ALL authoritative links, and that does not look good to the search engines.
There was a great post by James Finlayson on Moz last year. It will hopefully do a better job of explaining this than my brief answer:
http://moz.com/blog/how-guest-bloggers-are-sleepwalking-their-way-into-penalties
Although the post was about guest blogging, it definitely pertains to your situation as well. Imagine you arbitrarily decide that the lowest DA you want is 30: your link graph will look a lot like the unnatural one under the Link Quality section of that post. Instead, just focus on building relationships and links where they benefit both users and your brand, and the outcome is guaranteed to be good.
-
All of the numbers are borderline arbitrary. I would simply say: "Don't be stupid." That will get you further than anything.
-
Chasing domain authority for links is not really as straightforward as that.
- A domain has a DA of 50 but isn't really related and is starting to crack, with poor content, etc.
- A domain has a DA of 15 but is spot on in terms of directly related content and quality. It's a new site and looks like it will develop.
- A good link profile will have a natural mix of low to high DA, with an upturn around the most common 30-50 DA range. This will include a good sprinkling of nofollow. When researching I tend to filter at 25+ but keep an open mind on everything; a rough sketch of that kind of first pass is below.
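For illustration only — this is a minimal Python sketch, not anything from the thread — a first pass over a link-prospect export might apply that threshold while keeping, rather than discarding, the low-DA sites (the CSV layout, the column names, and the 25 cut-off are all assumptions):

```python
import csv

DA_THRESHOLD = 25  # rough first-pass cut-off; not a hard rule

def triage_prospects(path):
    """Split link prospects from a CSV export into 'review first' and 'revisit manually' buckets by DA."""
    review_first, revisit = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            da = int(row["da"])  # assumes a 'da' column in the export
            if da >= DA_THRESHOLD:
                review_first.append(row["domain"])
            else:
                # keep these: a low-DA but highly relevant site can still be worth a link
                revisit.append(row["domain"])
    return review_first, revisit

if __name__ == "__main__":
    review_first, revisit = triage_prospects("prospects.csv")
    print(f"{len(review_first)} prospects to review first, {len(revisit)} to revisit manually")
```

Keeping the low-DA bucket rather than dropping it is the "open mind" part: a DA 15 site that is spot on for relevance can still be a better link than a fading DA 50.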
For cleaning up a profile you can't rely on those measurements; you need to go into each site, manually check its content and history and its own link profile, and make a judgement on whether you are in the right neighbourhood. Of course, directories, 'comment' links, badly placed 'articles', and thinly veiled paid-for (dofollow) links are much easier to weed out.
Related Questions
-
Multiple Common Page Links
Hi everyone - I've taken over SEO for a site recently. In many cases, the reasons why something was done were not well documented. One of these is that on some pages, there are lists of selections. Each selection takes the user to a particular page. On the list page, there is often a link from an image, a name, and a couple of others. Each page often has 30 items with 4 links each. For some reason, the 4th of these internal links was nofollowed. When I run this site through several different site evaluation tools, they are all troubled by the number of nofollow links on the site. (These instances from above add up to a 5-figure number.) From a user perspective, I totally get why each of these links exists: if I wanted to click on the image or the name or some other attribute, that totally makes sense. It's my understanding that Google / Bing are only going to consider the 1st instance. If this creates excessive links, wouldn't you want 3 of the 4 links in each set nofollowed? If it's only excessive unique links that really matter, then why would any be nofollowed?
Technical SEO | APFM -
Optimizing internal links or over-optimizing?
For a while I hated the look of the internal links page of the Google Webmaster Tools account for a certain site. With a total of 120K+ pages, the top internal link was the one pointing to "FAQ", with around 1M links. That was due to the fact that, on every single page, both the header and the footer were presenting 5 links to the most popular questions. The traffic of those FAQ pages is non-existent, the anchor text is not interesting for SEO, and theoretically 1M useless internal links are detrimental to the flow of page juice. So I removed them, replacing the anchors with javascript to keep the functionality. I actually left only 1 "pure" link to the FAQ page in the footer (site wide). And overnight, the internal links page of that GWT account disappeared. Blank, no links. Now... Mhhh... I feel like... Oops! Yes, I am getting paranoid at the idea that the sudden disappearance of 1M internal links was not appreciated by googlebot. Anyone had a similar experience? Could this be seen by googlebot as over-optimizing and be penalized? Did I possibly trigger a manual review of the website by removing 1M internal links? I remember Matt Cutts saying adding or removing 1M pages would trigger a flag at the Google spam team and lead to a manual review, but 1M internal links? Any idea?
Technical SEO | max.favilli -
How do you perform your link audits?
What methods and tools do you guys use to perform link audits? Do you also use a traffic light system for links?
Technical SEO | PurpleGriffon -
Find broken links in Excel?
Hello, I have a large list of URLs in an Excel sheet and I am looking for a way to check them for 404 errors. Please help! Adam
Technical SEO | digitalops -
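This isn't from the original thread, but as a hedged sketch of one way to do it: export the URL column from Excel to a CSV and run a short Python script over it (assumes the third-party requests library is installed; the file name and column name are placeholders):

```python
import csv
import requests  # third-party: pip install requests

def find_broken_urls(csv_path, url_column="url"):
    """Return URLs from a CSV export that respond with 404 or fail to respond at all."""
    broken = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            url = row[url_column].strip()
            try:
                # HEAD is lighter than GET; fall back to GET if the server rejects it
                resp = requests.head(url, allow_redirects=True, timeout=10)
                if resp.status_code == 405:
                    resp = requests.get(url, allow_redirects=True, timeout=10)
                if resp.status_code == 404:
                    broken.append(url)
            except requests.RequestException:
                broken.append(url)  # unreachable counts as broken for review purposes
    return broken

if __name__ == "__main__":
    for url in find_broken_urls("urls.csv"):
        print(url)
```

The results can then be pasted back into the spreadsheet for follow-up.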
Google Links
I am assuming that the list presented by Google Webmaster Tools (TRAFFIC | Links To Your Site) is the one that will actually be used by Google for indexing? There seem to be quite a few links there that should not be there, i.e. assumed NOFOLLOW links. Am I working under an incorrect assumption that all links in Webmaster Tools are actually followed?
Technical SEO | blinkybill -
How not to lose link juice when linking to thousands of PDF guides?
Hi All, I run an e-commerce website with thousands of products. In each product page I have a link to a PDF guide of that product. Currently we link to it with a "nofollow" tag. Should we change it to window.open in order not to lose link juice? Thanks
Technical SEO | BeytzNet -
How to measure number of links out from a page
Following on from an earlier question, what do you all use to count the links out from a page? I believe there is a Bing tool which does this, though rather than a list of sites a simple number would be ideal.
Technical SEO | seanmccauley -
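If a plain number is all that's needed, one rough way to get it yourself rather than through a tool is sketched below (assumes Python with the requests and beautifulsoup4 packages installed; the example URL is a placeholder, not from the thread):

```python
import requests  # pip install requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4
from urllib.parse import urlparse, urljoin

def count_links(page_url):
    """Count total, internal, and external <a href> links on a single page."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    page_host = urlparse(page_url).netloc
    total = internal = external = 0
    for a in soup.find_all("a", href=True):
        href = urljoin(page_url, a["href"])  # resolve relative links against the page URL
        if not href.startswith(("http://", "https://")):
            continue  # skip mailto:, javascript:, and other non-http links
        total += 1
        if urlparse(href).netloc == page_host:
            internal += 1
        else:
            external += 1
    return total, internal, external

if __name__ == "__main__":
    total, internal, external = count_links("https://www.example.com/")
    print(f"total: {total}, internal: {internal}, external: {external}")
```

Note that this counts every anchor in the raw HTML, followed or not; it doesn't distinguish nofollow.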
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). We noticed when we drilled down that these come from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains. In December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. Example: http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/. The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za and all other requests generated 404 errors. We added all the directories to the robots.txt and we also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:
11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378
I am now thinking of cleaning the robots.txt and re-including all the excluded directories in GWMT to see if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages?
Technical SEO | JacoRoux