Any way to tell if a link has been devalued?
-
I have listings in some lawyer directories, some of which have very high PR, links, traffic, etc. (for example, www.nolo.com). I know that Google has fairly recently devalued a lot of directory links. I would assume that a monster site like Nolo would not be one of those, but does anyone know any way to tell?
Paul
-
There are two quick methods I use to check whether a site has been devalued or penalised:
-
Compare the PageRank of the site in question with its MozRank. If MozRank is 1.5+ points different, then the site may have been penalised by Google and lost some of its PR.
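A minimal sketch of that first check, assuming you've already looked the two scores up by hand (the function name, sample values, and the 1.5-point threshold are just illustrations of the rule above, not anything official):

```python
def may_be_penalised(pagerank: float, mozrank: float, threshold: float = 1.5) -> bool:
    """Flag a site whose toolbar PageRank lags its MozRank by 1.5+ points.

    A large gap can suggest the site has lost PR to a penalty or devaluation.
    It's a rough heuristic, not proof.
    """
    return (mozrank - pagerank) >= threshold


# Example: a directory showing MozRank 6.2 but toolbar PR 4
if may_be_penalised(pagerank=4.0, mozrank=6.2):
    print("Gap of 1.5+ points: worth a closer look at this domain")
```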
-
Pick a sentence of 5-6 words from the site and search for it in Google enclosed in quotes. If you find the site ranking on the first page, then it probably hasn't been penalised or devalued heavily.
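And a rough sketch of the quoted-sentence check. There's no official Google API for this, so the example just builds the exact-match query URL and checks a hand-collected list of first-page result URLs (the sample sentence and URLs are made up):

```python
from urllib.parse import quote_plus, urlparse


def quoted_search_url(sentence: str) -> str:
    """Build a Google query URL for an exact-match (quoted) sentence search."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{sentence}"')


def site_on_first_page(domain: str, first_page_urls: list[str]) -> bool:
    """Return True if any first-page result URL belongs to the given domain."""
    return any(urlparse(u).netloc.endswith(domain) for u in first_page_urls)


sentence = "free legal information for consumers and small businesses"
print(quoted_search_url(sentence))

# Paste in the first-page result URLs you see for that query, then check:
results = ["https://www.nolo.com/about.html", "https://example.com/other-page"]
print(site_on_first_page("nolo.com", results))
```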
I also use gut instinct for a lot of links, and usually that's a good measure: if a site looks spammy to you, then it probably will to Google. As for nolo.com, it looks like a pretty decent site to me, so I would be surprised if its links have been devalued.
-
One way you could test whether a link is passing juice is to set up a page that targets a long-tail phrase (maybe 3-4 words) and check its ranking. Then get a link on the site pointing to that page and see if the ranking jumps. A significant jump (from #30 to #3, for example) would suggest that the link still passes juice.
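If you want to keep the numbers honest, a tiny sketch like this makes the before/after comparison explicit (the ranks and the minimum-jump threshold are placeholders you'd pick yourself):

```python
def link_seems_to_pass_juice(rank_before: int, rank_after: int, min_jump: int = 10) -> bool:
    """Treat a large improvement in ranking position (lower number = better)
    after the link goes live as a sign the link passes juice."""
    return (rank_before - rank_after) >= min_jump


# e.g. the test page moved from #30 to #3 for its long-tail phrase
print(link_seems_to_pass_juice(rank_before=30, rank_after=3))  # True
```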