Individual Link Value
-
We understand PA, DA, trust, and all of that. My question is: is there a process or formula anyone uses that shows an individual link's value, meaning the amount of link juice it passes? The old Domain Juice metric seemed to be that, but after further investigation (and Rand setting me straight) I now understand it's not a good metric.
Today, we use PA divided by the number of external links on that page to get some sense of an individual link's actual value to the site or page it links to. I understand this is a very sloppy system, but it seems to be the only choice we have.
It's based on this simple thought: if you get a backlink on two different pages, and both are equal in every way except that one has 3 outbound links and the other has 30, the link from the page with 3 will be significantly stronger as far as passing juice.
So... is anyone using something to determine an individual link's value? I did ask the SEO staff, and they do not currently have it.
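The rough PA-divided-by-external-links heuristic described above can be sketched in a few lines. This is a naive approximation (the function name and the PA value of 40 are just for illustration), not how Google actually distributes PageRank:

```python
def per_link_value(page_authority, external_links):
    """Naive per-link value estimate: split a page's authority
    evenly across its external links. A rough heuristic only."""
    if external_links < 1:
        raise ValueError("page must have at least one external link")
    return page_authority / external_links

# Two otherwise-identical pages with PA 40: the page with 3
# outbound links passes far more per link than the one with 30.
print(per_link_value(40, 3))   # ~13.33
print(per_link_value(40, 30))  # ~1.33
```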
-
I believe that the problem is that people are spending too much time worrying about the value of a link and not enough time producing something worth linking to.
-
Nobody knows but Google... though I think that advertising links and reference links (usually) have very different formats.
-
You mention reference citations. My question to you is: does the Google algorithm actually look for and read headings such as "References", in a similar way to how it spots advertising links?
-
Rather than use numbers, I would use qualitative measures:
- How relevant is the site?
- Where is the link on the page? (In the footer? In a contextual paragraph? In a sidebar? Above the fold? In a reference citation?)
- Is the link on a kickass domain or a dog?
I think that these are much more important than numbers.
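One way to make those qualitative checks comparable across link prospects is to fold them into a rough score. The placement weights and multipliers below are purely hypothetical, invented to show the idea, not derived from any known algorithm:

```python
# Hypothetical placement weights -- invented for illustration.
PLACEMENT_WEIGHTS = {
    "contextual_paragraph": 1.0,
    "reference_citation": 0.8,
    "sidebar": 0.4,
    "footer": 0.2,
}

def qualitative_link_score(placement, site_is_relevant, domain_is_strong):
    """Fold the checklist into a rough 0-1 score for comparing prospects."""
    score = PLACEMENT_WEIGHTS.get(placement, 0.3)
    if not site_is_relevant:
        score *= 0.5  # halve the score for an off-topic site
    if not domain_is_strong:
        score *= 0.5  # halve it again for a weak domain
    return score

# A contextual link on a relevant, strong domain beats a footer
# link on an irrelevant, weak one by a wide margin.
print(qualitative_link_score("contextual_paragraph", True, True))  # 1.0
print(qualitative_link_score("footer", False, False))              # 0.05
```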
-
Ryan, yes, there are a lot of factors. But going back to the two-page example: two pages of the same value, one with 3 outbound links, the other with 30. Clearly, in most cases the page with 3 is better. In other words, if you ran into this situation 100 times, you might see it prove correct 70 or 80 of those times.
I doubt that figuring this out is much (if at all) more difficult than calculating several of the SEOmoz metrics, and a link value metric would be of huge value.
Every metric we see on reports is in fact a guess. PA, DA, Trust, C-blocks, backlinks... none can be completely trusted, and we all use these tools with that understanding, and the hope that they are at least generally correct. What would be any different about a link value report that took as much into consideration as possible?
Can you imagine the time savings and efficiency acceleration for link building if such a tool existed and was even somewhat accurate?
I think SEOmoz attempted this with the old Domain Juice passed metric. But seeing the formula for it, I can understand why they felt it was not very helpful.
-
I am not aware of any solid tool that provides this information. You may find a tool which estimates or otherwise provides a link value, but the challenge is that guesses are being stacked upon other guesses.
If someone responds "yes, try the Link Valuation Tool from Company X" my questions would be:
-
What metric is being used to value the link? PA? DA? PR? If PA/DA are being used, then those metrics are limited by the Linkscape crawler and the various factors concerning its use (e.g., the index being 1-2 months behind, the issues mentioned by Carin, etc.). If PR is being used, then the tool's PR is a guess and may be quite different from Google's PR.
-
How is decay being handled? Is the PA/DA/PR being fully distributed? Or is the natural decay being calculated, and if so how? It's another guess factor.
-
How is the weighting of links being handled? The SEO consensus is that links in content are given more weight than links in footers and other site-wide links.
There are other factors such as multiple links to the same domain, multiple links to the same page, etc. I feel there are too many unknowns for a tool to provide a meaningful link valuation. I would love to be proven wrong. Such a tool would clearly offer great value to SEOs.
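To make the "guesses stacked upon other guesses" point concrete: a hypothetical valuation tool might combine a base authority metric with a decay factor and a placement weight, and the error in each factor compounds multiplicatively. Every parameter value below is an assumption for illustration, not a known constant:

```python
def estimated_link_value(page_authority, external_links,
                         decay=0.85, placement_weight=1.0):
    """Stacked-guess estimator: base authority split across links,
    scaled by an assumed decay factor and an assumed placement
    weight. Every input here is itself an estimate."""
    base = page_authority / max(external_links, 1)
    return base * decay * placement_weight

# If each of the three factors (authority, decay, placement) is
# individually off by 20%, the combined estimate can be off by
# about 1.2 ** 3 - 1, i.e. roughly 73%, in the worst case.
worst_case_error = 1.2 ** 3 - 1
print(round(worst_case_error, 2))  # 0.73

print(estimated_link_value(40, 3))  # ~11.33
```

This is why a tool's output should be read as a relative ranking of prospects rather than an absolute measure of juice passed.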
-