Why are the bots still picking up so many links on our page despite us adding nofollow?
-
We have been working to reduce our on-page links issue. On a particular type of page the problem arose because we automatically link out to relevant content. When we added nofollow to these links it resolved the issue for some pages but not all, and we can't figure out why it was not successful for every one. Can you see any issues?
Example of a page where nofollow did not work:
http://www.andor.com/learning-academy/4-5d-microscopy-an-overview-of-andor's-solutions-for-4-5d-microscopy
-
Ahhh, duh! Dr. Pete shed light on what we should be thinking about here. You're not getting warnings for passing out too much PageRank but for having too many links. He's right; nofollow will not stop them from being counted. Nofollow only stops PageRank from being passed.
Link equity is a broader concept than PageRank. Link equity considers relevance, authority and trust, link placement, accessibility, the value of relevant outbound links, and so on. It sounds as if you need to focus more on how you implement the links on your site.
If you need to reduce links, as mentioned earlier, load them via AJAX from an external file if those links are needed on the page. If they don't offer any value, then remove them. I viewed your page earlier but cannot access it now; the links didn't appear to help the user experience anyway. Often what's good for the user is good for Google.
-
The main issue with too many on-page links is just dilution - there's not a hard limit, but the more links you have, the less value each one has. It's an unavoidable reality of internal site architecture and SEO.
Nofollow has no impact on this problem - link equity is still used up, even if the links aren't followed. Google changed this a couple of years back due to abuse of nofollow for PageRank sculpting.
Unfortunately, I'm having a lot of issues loading your site, even from Google's cache, so I'm not able to see the source code first-hand.
-
I don't see 197 links on that page; I only see 42 external followed links. See the screenshot below:
-
This suggestion for AJAXing the tabs would put the content in a separate file, which would be a great way to guarantee a reduction in on-page links!
Also, the suggestions to clean up those meta tags and the massive ViewState are spot on. A little optimization will go a long way toward ensuring the bots crawl all your pages. If you do have speed issues and crawl errors, it could be that the bots are not getting to subsequent pages to read your nofollows. Just something to consider as part of the whole picture.
-
Yes, that would nofollow all the links on the page.
To address the mystery, are you sure your other pages have since been crawled? Or is it that you are still getting warnings after subsequent crawls?
-
Whoa! Your view state is HUGE (That's what she said).
I couldn't decode it, but somewhere along the line the developer didn't turn off ViewState where it isn't needed and, most likely, an entire copy of the page state is encoded in it. This is causing load speed issues.
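If the site is standard ASP.NET WebForms (which DNN is), a minimal sketch of the fix, assuming your modules don't actually depend on ViewState, is to switch it off at the page or site level and re-test:
<%@ Page Language="C#" EnableViewState="false" %>   <%-- per-page directive --%>
<!-- or in web.config, to default it off site-wide (test carefully; some modules rely on it) -->
<system.web>
  <pages enableViewState="false" />
</system.web>
Either way, view the rendered source afterwards; the __VIEWSTATE hidden field should shrink from many kilobytes to a small stub.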
-
Your meta tags are in more trouble than your link count:
id="MetaDescription" name="DESCRIPTION" content="Page Details" />
AND
name="Description" />
I see you are using DNN: what version and which modules are you using? There are a ton of things one can do with DNN to make it more SEO friendly.
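For comparison, a single, properly populated description tag (with hypothetical copy here - the real text should describe the actual page) looks something like this:
<meta name="description" content="An overview of Andor's solutions for 4-5D microscopy, including cameras and acquisition software." />
A placeholder like content="Page Details", or a second empty Description tag, gives the engines nothing to work with and can hurt how your snippets display.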
-
My suggestion is to try AJAXing the tabs. If the outbound links are more of a concern than the keywords in the link text, AJAX loading of the tab content would remove them from consideration. Google won't index content pulled in from an external source.
However, be careful to put a rel="nofollow" on the link that loads the content, as you don't want search engines indexing the source file.
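A minimal sketch of that pattern, using hypothetical file names and IDs (jQuery's .load() does the same job if you prefer it):
<a href="/tabs/related-papers.html" rel="nofollow" class="tab-link">Related papers</a>
<div id="tab-content"></div>
<script>
  // Pull the tab's link-heavy content from an external file only when the visitor asks for it,
  // so those links never appear in the page's own HTML source.
  document.querySelector('.tab-link').addEventListener('click', function (e) {
    e.preventDefault();
    fetch(this.href)
      .then(function (response) { return response.text(); })
      .then(function (html) {
        document.getElementById('tab-content').innerHTML = html;
      });
  });
</script>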
Do not put a meta nofollow in the head; it will kill all links on the page and seriously mess up your link flow. Your use of rel="nofollow" is correct in the context of the specific link tags.
I wouldn't sweat the sheer number of links - the 100-link guideline is a leftover from the days when spiders only downloaded the first 100 KB of a page. That limit has since risen, so the practical limitation of having over 100 links is more pressing (i.e., do your visitors actually value and use that many links?).
If each link is valuable and usable, no need to worry. If not, perhaps there is a structural way to reduce the count.
Also, load the footer via AJAX on scroll or on demand. Assuming all of the pages can be found in the top navigation, the bottom links are just exacerbating your issues. Primarily, this section is giving far too much weight to secondary or auxiliary pages.
For instance, your Privacy Policy only needs to be linked to where privacy is a concern (i.e., the contact form). It's also good to put it on the home or about pages if you have a cookie policy.
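A quick sketch of the on-scroll idea, again with hypothetical file names, using an IntersectionObserver to pull the footer links in only once the visitor nears the bottom of the page:
<div id="footer-links"></div>
<script>
  // Fetch the footer link block from an external file the first time it scrolls into view
  var placeholder = document.getElementById('footer-links');
  new IntersectionObserver(function (entries, observer) {
    if (entries[0].isIntersecting) {
      observer.disconnect();   // only load once
      fetch('/includes/footer-links.html')
        .then(function (response) { return response.text(); })
        .then(function (html) { placeholder.innerHTML = html; });
    }
  }).observe(placeholder);
</script>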
-
Hi Karl,
Would this suggestion not stop crawling of all links on the page?
Also, the issue is that we have seen rel='nofollow' work on other pages and reduce our warnings, but for some pages it has not. This is where the mystery lies.
-
It may be how the nofollow attribute is formatted? It should be rel="nofollow" (double quotes), and yours is rel='nofollow' (single quotes).
-
Hi James,
Thanks for responding. The issue is that we are still getting a count of 197 on-page links when there are not that many links on the page.
-
What do you mean by the nofollow not working? I noticed on the example page that some of your external links in the papers section are nofollowed while the links to the videos are not.