Number of links per page?
-
I'm confused by the number of links we should put on a page. Our site has high domain authority, but the SEOmoz tools and others, plus Google WMT, suggest far fewer links than other sites carry - look at Dailymail.co.uk or the Huffington Post, for example. Our site is www.worldtravelguide.net, and I'm thinking specifically about /destinations and each continent page like /europe. Our site has thousands of pages, and trying to create an effective internal linking structure within a limit of 150 or so links is nearly impossible; it ends up producing too many navigational pages. We were hit hard by Panda (even though all our content is original, professionally written, and frequently updated) in favour of bigger brands, and since Google says sites should be designed for users rather than for SEO, these two ideals conflict. Does anyone have any data on what the link limit is? Any other tips or observations would be gratefully received. Thanks, John
-
Bing suggests a maximum of 250 links per page; beyond that, the extras may be ignored and the linked pages not indexed. Even 250 allows a very flat architecture: if the home page linked to 250 submenu pages, each of which linked to 250 more, that would put 250 × 250 = 62,500 pages only two clicks from the home page.
http://perthseocompany.com.au/seo/reports/violation/the-page-contains-too-many-hyperlinks
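The fan-out arithmetic above can be sketched quickly. This is only an illustration of the maths; the function name and the assumption of a uniform link budget on every page (with every link pointing to a distinct new page) are mine, not Bing's:

```python
# With a budget of L links per page, L**d pages sit exactly d clicks
# from the home page, assuming every link targets a distinct new page.

def pages_within_clicks(links_per_page: int, max_clicks: int) -> int:
    """Total distinct pages reachable in 1..max_clicks clicks from home."""
    return sum(links_per_page ** d for d in range(1, max_clicks + 1))

print(250 ** 2)                     # pages exactly 2 clicks deep: 62500
print(pages_within_clicks(250, 2))  # pages within 1 or 2 clicks: 62750
```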
Related Questions
-
Does adding new pages, new slugs, or new URLs on a site affect rankings and visibility?
Hi reader, I have decided to add new pages to my site. If I add new URLs, I feel like I have to submit the sitemap again. My question is: does submitting the sitemap again with new slugs or URLs affect visibility in SERPs? If yes, how do I minimize the impact?
Web Design | SIMON-CULL
-
H Tags for an Events Page
I wanted to get the thoughts of people here about how best to structure an events listing page for SEO. I have a list of events, all with dates, event titles, location name, city, and zip. What I do currently is listed below. I also show a version of how I could revise it, but it would require me to duplicate the event date on the page. Any ideas, suggestions, or best-practice examples you can point me to would be greatly appreciated.

Current structure:

[State] Events - H1 Tag
Friday, December 5, 2014 - H2 Tag
Event Title 1 - H3
Location Name, City, State - P Tags
Event Title 2 - H3
Location Name, City, State - P Tags

I was wondering if I would see better results by doing the following instead. The benefit I see of this approach is that the event titles become H2 tags instead of H3 tags; the con is duplicating the event dates.

Revised structure:

[State] Events - H1 Tag
Event Title 1 - H2
Friday, December 5, 2014
Location Name, City, State - P Tags
Event Title 2 - H2
Friday, December 5, 2014
Location Name, City, State - P Tags

Thanks, Anthony
Web Design | abiondo
-
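A minimal HTML sketch of the revised structure Anthony describes. The `article` wrappers, class names, and the `[State]` placeholder are illustrative assumptions, not from the original post:

```html
<!-- Revised events listing: titles promoted to H2, date repeated per event -->
<h1>[State] Events</h1>

<article class="event">
  <h2>Event Title 1</h2>
  <p class="date">Friday, December 5, 2014</p>
  <p class="venue">Location Name, City, State</p>
</article>

<article class="event">
  <h2>Event Title 2</h2>
  <p class="date">Friday, December 5, 2014</p>
  <p class="venue">Location Name, City, State</p>
</article>
```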
Footer link back to developer's domain
I have read a lot suggesting that you should either not put an attribution link in the footer of a client's site or that you should nofollow it. But I have a slightly different take on the question: how does this work at large scale? Are these manual penalties, or are they automatic? By large scale, I am talking about big CMS programs such as WordPress, Joomla, and the like. They all have links back to their own sites in the footers of their default templates. Is this bad? Does it not really matter at the scale of companies such as these?
Web Design | LesleyPaone
-
What does it mean that "too many links" shows up in my report, but I'm not seeing them?
I've noticed that on the crawl report for my site, www.imageworkscreative.com, "too many links" is showing up as a chronic problem. Reviewing the pages cited as having this issue, I don't see more than 100 links. I've read that websites sometimes unintentionally cloak their links, and I am concerned that this might be happening on my site. Some example pages from my crawl report are: http://www.imageworkscreative.com/blog/, http://www.imageworkscreative.com/blog/10-steps-seo-and-sem-success/index.html, and http://www.imageworkscreative.com/blog/business-objectives-vs-user-experience/index.html. Am I having a cloaking issue, or is something else going on here? Any insight is appreciated!
Web Design | ScottImageWorks
-
Page Title Optimization
I am reviewing the optimization on my site and it appears that my page titles follow this method: PAGE_NAME | KEYWORD in CITY ST - COMPANY_NAME I am pretty well optimized for "KEYWORD in CITY ST" but am wondering if I should drop it from all page titles except for the pages that actually deal with that keyword. What are your thoughts on optimizing?
Web Design | nusani
-
How can we improve our e-commerce site architecture to help best preserve Page Authority?
Today I installed the SEOmoz toolbar for Firefox (very cool, highly recommended). I was comparing our site, http://www.ccisolutions.com, to this competitor: http://www.uniquesquared.com. For the most part, the deeper I go in our site, the more the page authority drops. We have a few exceptions where the page authority of a subcategory page is actually better than that of the category page one level up. In comparison, when I was looking at http://www.uniquesquared.com I noticed that their page authority stays at "21" on every single category page I visit. Are you seeing what I'm seeing? Is this potentially a problem with the toolbar, or is there something significantly different about their site architecture that allows them to maintain that PA across all category and subcategory pages? Is there something fundamentally wrong with our (http://www.ccisolutions.com) site architecture? I understand that we have longer URLs, but this is an old store with a lot of SKUs, so we have decided not to remove the /category/ and /product/ segments from the URLs, because the 301 redirects that would result wouldn't pass all of the authority they've built up over the years. Interested to know viewpoints on the site architecture and how it might be improved. Thanks!
Web Design | danatanseo
-
Solutions for too many links on a page (ecommerce)?
Hello Mozzers, Most ecommerce websites I've come across have four main link sections:
Main nav - About, Contact, etc.
Side nav - list of categories + products
Footer - useful links, etc.
Promotional area - promoting best sellers / latest products
This ends up totalling anything from 200 to 500 links. I was wondering: is there a reasonable solution for hiding some of the links, or should I just ignore the warning? Thanks, Dan
Web Design | Sparkstone
-
How is link juice split between navigation?
Hey All, I am trying to understand link juice as it relates to duplicate navigation. Take, for example, a site whose main navigation is contained in dropdowns holding 50 links (fully crawlable and indexable); that navigation is then repeated in the footer, so you have a total of 100 links with the same anchor text and URLs. Will the link juice be divided among those 100 links and passed to the corresponding pages, or does the "first link rule" still apply, so that only half of the link juice is passed?

What I am getting at is this: if there were only one navigation menu and the page were passing 50 link juice units, each of the subpages would be passed 1 link juice unit, right? But if the menu is duplicated, the available link juice is divided by 100, so only 0.5 units are passed through each link. However, because there are two links pointing to the same page, is there a net of 1 unit?

We have several sites that do this for UX reasons, but I am trying to figure out how badly this could be hurting us in page sculpting and passing juice to our subpages. Thanks for your help! Cheers.
Web Design | prima-253509
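The arithmetic in the question can be played with as a toy model. To be clear, this encodes only the naive even-split assumption the asker describes ("juice units" divided evenly across link slots, shares to the same target summing) - it is not a claim about how Google actually computes PageRank:

```python
from collections import Counter

def juice_per_target(links, total_juice=50.0):
    """Split a page's juice evenly across every link slot, then sum
    the shares that land on the same target URL."""
    share = total_juice / len(links)
    totals = Counter()
    for url in links:
        totals[url] += share
    return totals

menu = [f"/page-{i}" for i in range(50)]

single = juice_per_target(menu)          # one nav menu: 50 links
doubled = juice_per_target(menu + menu)  # header + footer: 100 links

print(single["/page-0"])   # 1.0 unit per subpage
print(doubled["/page-0"])  # 0.5 + 0.5 = 1.0 unit net per subpage
```

Under this model the duplicated menu nets out to the same juice per target page, whereas a strict "first link only" rule counting just one of the 100 slots would leave 0.5 units per subpage - which is exactly the uncertainty the question raises.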