Too Many On-Page Links
-
Hello. My SEO team has worked very hard to finally resolve the RogerBot/GoogleBot-specific crawl errors, both manually and programmatically, for our Budget Blinds USA Pro campaign. We've done a good job, even if a lot of the fix came from Robots.txt entries, as that was the most efficient route our client chose. The good news is that most of it is CMS configuration and not bad site architecture.
That being said, our next big volume of crawl errors is "Too Many On-Page Links". Our Moz Domain Authority is 61. On this new version of the website, our client added a large nav-based footer that duplicates links from the header's main navigation.
I believe our solution is to add rel="nofollow" attributes at the footer-link level, so we don't zap page authority by over-dividing it, as you recommend.
Is this the best way to resolve this? Is there any risk in it? Or is a Domain Authority of 61 high enough for RogerBot and GoogleBot to crawl these anyway?
Please advise,
-
First of all, do not, I repeat, DO NOT nofollow links to your own internal pages. Doing so will not result in your other links getting more link value; it will result in link juice evaporation:
http://www.seomoz.org/blog/google-maybe-changes-how-the-pagerank-algorithm-handles-nofollow
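To see why, here's a back-of-the-envelope sketch of the before-and-after behavior described in that post. This is a toy model with made-up numbers, not Google's actual PageRank computation:

```python
# Illustrative only: a toy model of how nofollow's effect on internal
# link equity changed, per the post linked above. Numbers are invented.

def passed_per_link_old(page_value, total_links, nofollowed):
    # Old behavior: value was divided among *followed* links only,
    # so nofollowing some links concentrated value in the rest.
    followed = total_links - nofollowed
    return page_value / followed

def passed_per_link_new(page_value, total_links, nofollowed):
    # Current behavior: value is divided among *all* links; the shares
    # assigned to nofollowed links simply evaporate.
    return page_value / total_links

page_value, total, nofollowed = 10.0, 10, 5

old = passed_per_link_old(page_value, total, nofollowed)  # 2.0 per followed link
new = passed_per_link_new(page_value, total, nofollowed)  # 1.0 per followed link
evaporated = new * nofollowed                             # 5.0 of value lost outright
```

In other words, nofollowing half your footer links doesn't double what the remaining links pass; it just throws half the page's link equity away.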
If you don't want to divide your link equity, the best solution is to simply remove the link. I would encourage you to install a tool such as CrazyEgg or ClickTale that will show you exactly how your users are using your site. Chances are they are not clicking many of those footer links. Based on that click data, remove the links that aren't being clicked.
Also, having a few more links than SEOmoz recommends is not the end of the world, especially for a high Domain Authority site like yours.
-
Make sure you check out this post from Matt Cutts on the question:
http://www.mattcutts.com/blog/how-many-links-per-page/
I guess the larger question has to do with the point of having so many links in the footer. I think Google has overcome most of its issues with lots of links on a page, but the main question I would ask revolves around what is good for users. If it were my site, I would probably pare down the links some to make it as user-friendly as possible.
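If you want to audit the count yourself before paring anything down, a quick sketch using Python's standard library can tally on-page links against the rough 100-link guideline. The HTML string here is a stand-in for your real page source:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collects href values from <a> tags so we can count on-page links."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

# Stand-in for a fetched page: a header nav plus a footer that
# duplicates some of the same links, as described in the question.
html = """
<nav><a href="/">Home</a><a href="/blinds">Blinds</a></nav>
<footer><a href="/">Home</a><a href="/blinds">Blinds</a><a href="/contact">Contact</a></footer>
"""

parser = LinkCounter()
parser.feed(html)

total_links = len(parser.hrefs)          # every anchor, duplicates included
distinct_links = len(set(parser.hrefs))  # unique destinations

print(total_links, distinct_links)
if total_links > 100:
    print("Over the rough 100-link guideline")
```

Running this on the sample gives 5 total anchors but only 3 distinct destinations, which is exactly the duplicate header/footer pattern worth trimming.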
Your Domain Authority of 61 probably has more to do with your incoming link profile than your on-page factors; in my experience, I would not correlate those two items so closely.
If Moz is telling you to reduce the links, then I would probably consider reducing them. At the same time, if having more links is better for your users, I would probably stick with it.
Hope this helps.