What is meant by too many on-page links?
-
I have just run the report for my site http://www.in2town.co.uk and it says I have 246 on-page links, but I am not sure how I ended up with that many. I know I have a large number of links, and in the old days the advice was to keep links under 100, but now, with improvements in site speed and the web in general, people are saying that rule is no longer followed. A report I read said that links should not confuse readers or put them off, so I am wondering what your thoughts are on a site with over 100 links on the home page, and also, if my site does have too many links, what should I do about it?
I cannot understand why it is showing 246 when I do not see that many on the page. Any advice would be great.
-
Hi Tim, Keri and Dr. Pete
http://www.distilled.net/blog/seo/site-navigation-for-seo/
From Seer Interactive: "How many links are on the page?"
Matt Cutts recently came out with a video that provides some great insight into how many links on a page Google will digest, but typically if we see over 150-200, we'll look for a better opportunity.
3. Does PR flow through to the page you’re evaluating?
PageRank is not everything, but it's interesting to see if Google places that measure of value on a page. If a page does have PR, fantastic. If it doesn't, check other scoring systems like mR.
http://forums.searchenginewatch.com/showthread.php?t=10494
http://www.screamingfrog.co.uk/link-building/
Use Seer Interactive's guide when using this tool; it is fantastic:
http://www.seerinteractive.com/blog/screaming-frog-guide
http://www.bad-neighborhood.com/text-link-tool.htm
Seer has a great Screaming Frog guide; I think it's probably the best one published.
Here are the ways to find out about your internal and external links using Screaming Frog:
Internal Links
I want information about all of the internal and external links on my site (anchor text, directives, links per page etc.)
I want to find broken internal links on a page or site
I want to find broken outbound links on a page or site (or all outbound links in general)
I want to find links that are being redirected
I am looking for internal linking opportunities
Good tip:
http://www.searchenginejournal.com/visualizing-link-data-with-screaming-frog-and-excel-part-1/43641/
http://www.distilled.net/excel-for-seo/
Use Distilled U or University
http://www.distilled.net/u/
This is not an affiliate link. I have been a member since the beta came out, and I cannot speak more highly of it when it comes to learning about technical search engine optimization.
http://www.portent.com/library/seo-1/seo-4.htm
http://fatfreeguide.com/link-recover/
Great free tools
http://www.internetmarketingninjas.com/tools/
Links are critically important to webpages, not only for connecting to other, related pages to help end users find the information they want, but also for optimizing pages for SEO. The Find Broken Links, Redirects & Google Sitemap Generator Free Tool allows webmasters and search engine optimizers to check the status of both external and internal links on an entire website. The resulting report generated by the Google sitemap generator tool gives webmasters and SEOs insight into the link structure of a website and identifies link redirects and errors, all of which help in planning a link optimization strategy. The downloadable results and the sitemap generator are always free for everyone.
http://www.internetmarketingninjas.com/seo-tools/google-sitemap-generator/
http://www.internetmarketingninjas.com/seo-tools/free-optimization/
http://www.internetmarketingninjas.com/broken-links-tool/
The scan is very fast. Once complete, the free SEO analysis tool presents a Summary table containing the numbers for the following:
HTTP redirects
Broken links
Images with no alt attribute text
Links to PNG files
Links to GIF files
Links to JPEG files
Links to CSS files
Links to HTML files
Links to PDF files
Links to plain files
An Image Details table follows the summary produced by the free SEO tool, presenting the following information on images used within the page:
The image file’s path on the site
The native size of the image file in pixels (not the display size specified in the tag)
The file size in kilobytes
The alt attribute text string used in the tag
The image viewer control
To use the view image functionality, pause the mouse pointer over the magnifying glass icon at the end of the table row to display the image.
Next up is the Link Details table, offering the following details on anchor tags used within the page:
The URL of the link (includes both internal and external links, linked to the page)
The file type (CSS, HTML, PDF or Plain)
The HTTP status code returned for the URL
The link check result (OK, Redirect or Broken)
Notes:
If a link is redirected or is broken, the line for that link in the Link Details table is shown in a red font.
All HTTP codes other than 200 and all check results other than OK are linked to the HTTP Response Code Checker Tool for more information.
Check another URL
If you want to check the status of images and links on another URL using the free SEO analysis tool, type or paste the URL you want to scan in the text box at the bottom of the report tables, and then click Ninja Check.
Internet Marketing Ninjas is pleased to offer this free SEO analysis tool to SEOs and webmasters. Did you find our free SEO tool useful? Be sure to check out the other valuable SEO tools available online.
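The OK / Redirect / Broken results this kind of checker reports come down to bucketing each link's HTTP status code. Here is a minimal sketch of that classification (the sample URLs and status codes are made up for illustration; this is not the tool's actual code):

```python
# Bucket an HTTP status code the way a link checker reports it:
# 2xx -> OK, 3xx -> Redirect, anything else -> Broken.
def classify(status_code: int) -> str:
    if 200 <= status_code < 300:
        return "OK"
    if 300 <= status_code < 400:
        return "Redirect"
    return "Broken"

# Made-up sample links with the status codes a crawl might have returned.
sample = {
    "/about": 200,      # fine
    "/old-page": 301,   # permanent redirect
    "/missing": 404,    # broken
}
for url, code in sample.items():
    print(f"{url}: {code} -> {classify(code)}")
```

In a real check, a crawler would fetch each URL (for example with a HEAD request) and feed the returned status code into a function like this.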
Links to image graphics and HTML files enhance the value of a webpage to both human users and to search engines. But if there are broken links (resulting in 404 errors), incorrect redirects are used (for SEO purposes, we mean temporary 302s), or image files are excessively large (thus weighing down page load speed), they can be detrimental to the search optimization of a page. Use the Check Image Sizes, Alt Text, Header Checks, and More Free SEO Tool to get a fast SEO report on the status of images and links used within a webpage.
To scan a webpage, type or paste the URL in the free SEO tool’s text box, and then click Ninja Check.
This web page optimization tool analyzes existing on-page SEO and lets you see your website’s data as a spider sees it, allowing for better web page optimization. This on-page optimization tool is helpful for analyzing your internal links, your meta information, and your page content in order to develop better on-page SEO. In the guide below, we’ll explain how to maximize the potential of this free SEO tool to improve your website’s on-page SEO.
I hope this was what was needed,
Thomas -
Thank you for this, I will try this now. What a great idea!
-
One thing I'd suggest is potentially trying a click-mapping tool like CrazyEgg, to see where your home-page is getting traction. We all tend to over-estimate the user value of our own content, and what you'll often find is that a small percentage of your links are getting most of the activity. If you find that 2-3 sections are getting no love at all, it's a lot easier to start cutting based on that actual usage data.
-
I have counted 104 links, not including the moving screen, but the tool I have used says there are still 155 links, so I am missing 51 links somewhere.
-
Hi, I found the problem. I am using a news content module to display my news, and for some reason it has stupid settings that bring in extra links. The Joomla module is called News Show Pro GK4.
With this module you have to go through each instance and make sure that you are not asking it to display links to other pages.
Now I have reduced my links to 174, and I now have to decide: do I reduce my links further by removing content to try to get under the 100-link rule, or do I listen to the new articles stating that the 100-link rule does not really matter any more? However, as has been mentioned here, if I have over 100 links then Google is not going to crawl them all, which could cause me SEO damage. On the other hand, thinking about the user experience, the page would look silly if I removed a large number of them.
-
I'm not seeing that source code on your home-page, but the problem is usually links that are hidden in some way, such as drop-down menus or scrolling images. For example, each picture that scrolls by at the top has its own link. Only one is visible at a time, but Google sees them all in the source code simultaneously. There's nothing necessarily wrong with that, but they do all count. Once you start adding these up, the numbers can go up fast.
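You can see why hidden links still count by counting every anchor tag in the raw HTML the way a crawler does. A minimal sketch (the carousel markup and URLs below are invented for illustration):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collects the href of every <a> tag in the raw HTML, visible or not."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# A scrolling carousel: only one slide is visible at a time in the browser,
# but all three links are present in the source code simultaneously.
html = """
<div class="carousel">
  <a href="/story-1"><img src="/img/1.jpg" alt="Story 1"></a>
  <a href="/story-2"><img src="/img/2.jpg" alt="Story 2"></a>
  <a href="/story-3"><img src="/img/3.jpg" alt="Story 3"></a>
</div>
"""
counter = LinkCounter()
counter.feed(html)
print(len(counter.links))  # counts 3, even though only one slide shows at once
```

A crawler reads the source, not the rendered page, which is why link-count tools report more links than you can see.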
-
Thank you for this. I have just looked at my source code and I am very puzzled. I have been trying to understand why tools are showing links on my home page that are not there, so I did what you did and counted the links in the source code, and there are more links there than on the home page. I do not know how this is happening.
Example:
Tuesday 2 July. ANDY’S FEELING THE HEAT. As Andy worries about the prospect of losing his house, amidst all the problems he...
The relevant part of the source looks like this:
<div class="nspArt nspCol3" style="padding:0 20px 20px 0;clear:both;">
  <h4 class="nspHeader tleft fleft"><a href="/emmerdale/emmerdale-debbie-does-not-know-who-to-trust" title="Emmerdale Debbie Does Not Know Who To Trust">Emmerdale Debbie Does Not Know Who To Trust</a></h4>
  <a href="/emmerdale/emmerdale-debbie-does-not-know-who-to-trust" class="nspImageWrapper tleft fleft" style="margin:6px 6px 9px 0;"><img class="nspImage tleft fleft" src="/images/soap/emmerdalrobbie4.jpg" alt="Emmerdale Debbie Does Not Know Who To Trust" style="width:140px;height:82px;" /></a>
  <p class="nspText tleft fleft">...
The above is part of the source code, so I do not understand why these links are showing as being on my home page when they are not visible on my home page. This brings me to the question: the tools I have used to find out how many links are on my home page are showing more than are actually on the home page, so I need to get this sorted.
-
I'm seeing 200+ links in the source code for the home-page. From a usability standpoint, my first instinct is that this is overwhelming. If you're a big brand or major news site, you might be able to pull it off, but most people just can't digest this many options on a site they aren't familiar with, and you risk that they'll just give up and leave.
From an SEO standpoint, though, it's mainly an issue of dilution. The more pages you link to, the less internal PageRank gets passed to each page. If you have a lot to go around, that may be fine, but for most sites, you'll spread yourself too thin. Essentially, you're telling Google that all of these pages are important. A hierarchical approach can help signal the most important pages and can help Google prioritize (both in terms of crawl and ranking).
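The dilution can be sketched with toy arithmetic: in the classic PageRank model, the equity a page can pass is split across its outgoing links, so (all else being equal) each link on a 250-link page carries a fifth of what it would on a 50-link page. This is an illustration of the principle only, not Google's actual formula:

```python
# Toy model of internal PageRank dilution: the equity a page can pass
# is divided among its outgoing links (ignoring damping and iteration).
def equity_per_link(page_equity: float, num_links: int) -> float:
    return page_equity / num_links

homepage = 1.0
print(equity_per_link(homepage, 50))   # equity passed per linked page
print(equity_per_link(homepage, 250))  # five times less per linked page
```

This is why a hierarchical structure helps: linking the homepage to a few section pages, which in turn link to articles, concentrates equity on the pages you signal as most important.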
I don't believe that there's a hard cut-off at this point (the 100-link rule was based in technical limitations that no longer exist), but I do agree with Thomas that crawler fatigue is a real issue that can negatively impact your site.
-
Just wanted to give you some information here. I have used the Screaming Frog tool and it is showing links from other pages rather than just my main page, which looks like the reason the other tool is showing over 240 links.
Do you know the reason for this, or is it a case of Google following all the links on my site from the home page and counting them all as links?
There is no way I can get under 100 if Google is counting all the links on my site.
-
Thomas,
Do you have some resources that give details confirming it will in fact hurt your rank?
Thank you for pointing out Dr. Pete's post at http://moz.com/blog/how-many-links-is-too-many regarding how many links is too many, as that is a good discussion about this report.
-
I have now counted 148 links on my page after stopping the images from being linked, but the tool is showing me 234, so I need to find out where these extra links are coming from.
-
Thank you for the great advice. I am going to get started on this straight away, and please do PM me the trusted names, as I really want to get this right. We had the site redeveloped and moved onto the new Joomla 3.0, and since it was updated we have lost rankings.
For example, we were on page one for "lifestyle magazine" and we are now on page 8; we were on page one for "gastric band hypnotherapy" and are now out of the top 50; we were on page one for "showbiz gossip" and are again out of the top 50. I do not know what the developer has done or the reason behind this, but I need to find out.
Many thanks for all the great advice.
-
I verified your results: using your URL, you have approximately 273 links on one page of your website. This should be changed as soon as possible.
I tested your site; 273 links found on http://www.in2town.co.uk/
http://www.feedthebot.com/tools/linkcount/result.php?url=http%3A%2F%2Fwww.in2town.co.uk%2F
http://www.feedthebot.com/tools/spider/test.php?url=http%3A%2F%2Fwww.in2town.co.uk
I also tested whether you were showing up with author tags or any rich snippets; these are crucial for ranking with Google today, and you can see the results yourself with the tool below. I would recommend an SEO as well as a developer.
Google:
http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.in2town.co.uk
If you need help beyond this, I am more than happy to assist. Sincerely, Thomas
-
Yes, in fact I believe it is hurting your rank. I am talking about having over 100 internal links per page (just internal links, not links from other websites pointing towards yours). The problem is that with more than 100 internal links, Google may conclude that the site is not constructed well or simply has poor architecture. If you have around 200 internal links per page on average, or even on one page, Googlebot may not spend the time and CPU power to crawl it all; it has to crawl all the sites on the web, so it delegates a certain amount of crawl budget to each website. With that many internal links, you are also not going to pass link juice through your website efficiently.
Think of it like Pac-Man: if he eats too many of the little blocks, he stops moving and goes to the next webpage. You would not want that to happen, because then you would have essentially only half of your site really read by Google. Google does not apply this to absolutely every website; if you are extremely highly ranked (I mean a PageRank of about 7+), you will most likely get off the hook. If your site is not that highly ranked, however, you are going to see a negative outcome, with parts of your site not being crawled by Google as efficiently or effectively as every other site with fewer than 100 links.
First off, here is a tool you can use at no cost. It works on Mac, Windows, and Linux, is free for up to 500 pages, and will tell you how many internal links you have on your website (and much more) so you can double-check it.
Here are the links to that tool and other articles regarding this:
http://www.screamingfrog.co.uk/seo-spider/
A second, web-based tool you can use to check your internal links:
http://www.feedthebot.com/tools/linkcount/
Here is Matt Cutts:
http://www.mattcutts.com/blog/how-many-links-per-page/
Moz blog:
http://moz.com/blog/how-many-links-is-too-many
Feed the Bot, another article on the importance of keeping internal links under 100:
http://www.feedthebot.com/howmanylinks.html
I definitely sympathize with you on this. However, your best bet is to call a qualified developer. If you like, I can give you some trusted names. A great developer to help you with this is Greg Reindel:
http://www.gregreindel.com/
Other options are webdevstudios.com, and if you want to go real high-end, happycog.com is an outstanding company.
I hope this helps. Sincerely, Thomas
-
Thank you for this. I will keep an eye on which links are not being clicked as much and remove them.
-
How do you mean I am hurting my rank? I will look and see if there are any links that I can remove. Do you feel I should not have any links on the images? So instead of people being able to click the image to go to the article, do you think it should just be the title they click on?
-
Unless you have a PageRank of 7+, Googlebot will not take the time to index more than 100 to 150 links. Best to stay safe and keep them under 100. You need a web developer to help you; keeping it as it is will hurt your rank. I hope this was of help,
Thomas
-
Also check out this thread which discusses the same question: http://moz.com/community/q/too-many-on-page-links-31
-
On-page links (or internal links) are links pointing from Page A (in this case your homepage) to any other page on your site.
Currently, your homepage is linking to many other pages within your site. But I think this is a normal practice. Take a look at any other popular site and you'll see the same thing (ex. http://www.huffingtonpost.com/). What I would suggest though, is to analyze user behavior and identify what links are being clicked on the most. If some of your links are never clicked on, is it really useful to have them there?
Howard
-
Yes, it is: www.in2town.co.uk
-
Can we see the page?