Too many on page links
-
Our home page (and about 1,400 of our other pages) has well over 100 links, beyond the recommended amount. Our competitors have fewer on-page links (to other pages on their site) and far more link popularity, so we are trying to figure out the best solution for this without hurting our site's conversions and usability.
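If you want to audit your own pages against a link budget, here's a minimal sketch using only Python's standard library (the `LinkCounter` class and `count_links` function names are made up for illustration, not part of any tool mentioned here):

```python
# Hypothetical example: count <a href> links in a page's HTML using only
# the Python standard library, to audit pages against an internal link budget.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts anchor tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only anchors with an href are actual links; <a name="..."> is not.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    """Return the number of hyperlinks in an HTML string."""
    parser = LinkCounter()
    parser.feed(html)
    return parser.count
```

You could feed this the HTML of each page (fetched however you like) and flag pages where the count crosses whatever threshold you decide on.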
-
We have a similar problem. We have a JavaScript dropdown nav at the top which brings the user to every product category on the site (100+), but we also have the main categories listed in the left column. So we have used "nofollow" on the JavaScript dropdown nav, and also on other pages like the shopping cart etc., leaving the spider to use the main categories listed in the left column to crawl the site. But I'm not sure how effective or useful that would be.
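The setup described above, sketched in markup (the URLs and link text here are placeholders, not from the actual site):

```html
<!-- Dropdown nav link the spider should not follow (placeholder URL) -->
<a href="/category/widgets" rel="nofollow">Widgets</a>

<!-- Page-level alternative, e.g. in the <head> of the shopping cart page -->
<meta name="robots" content="noindex, nofollow">
```

Note the difference: `rel="nofollow"` applies to a single link, while the robots meta tag applies to every link on (and the indexing of) the page it sits on.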
-
FYI guys we removed about 100 of the internal links and rankings jumped 3 big positions overnight. They had not moved in over 2 months. Coincidence? I think not.
-
I've seen big-name "expert SEOs" say that 150 links should be fine on a large site, so 100 isn't a hard and fast rule; they still think you should keep the count at a reasonable level. Like many things in SEO, it's a guideline.
-
Thanks guys. What about that JavaScript dropdown (view all departments)? Should I make those noindex-nofollow?
-
I would start by identifying the specific pages you want link juice to go to most, and those that you don't need juice to go to. That should give you a starting place to see what links to reduce or cut.
-
For you guys, I would really recommend focusing on off-page ranking factors rather than on-page ranking factors. On-page factors usually account for only about 30% of SEO; off-page factors account for about 70%.
You need more links, a strong social media presence, and to be publishing tons of high-quality content about trade shows, trade show marketing, etc.
Your on-page SEO looks great - get rocking on off-page!
-
Thanks, yes, that is the point of what I'm trying to accomplish. We want to retain more of the link juice that the home page has. However, I'm not sure which of the elements on our site should be tweaked. We have a JavaScript dropdown, the left-side navigation, and all the products and tabs in the center of the home page. I think the JavaScript dropdown is still being crawled and counted, but I'm not sure.
-
100 isn't really a hard and fast recommendation any more. I do not think your page has too many links to be user-friendly. From an SEO standpoint, see these resources:
http://www.mattcutts.com/blog/how-many-links-per-page/
http://www.youtube.com/watch?v=l6g5hoBYlf0
I don't think you necessarily need to reduce the links on the page. However, I would definitely limit links to less important pages so that you maximize the link juice that flows to your important pages.
-
The "too many outgoing links" guideline (fewer than 100 links per page) was originally created because Google could only index 101KB of a page, so they needed some sort of rule to help its users. However, that limit is no longer in place, and the guideline has been removed. I would suggest you don't put 50,000 links on a page, but your website looked fine.
Wow, I love your website - can you send me an email to alhallinan@gmail.com with some info?