"Too many links" - PageRank question
-
This question seems to come up a lot.
I have a 70-page flat site. For ease of navigation, I want to link every page to one another.
I built a pure-CSS dropdown menu with categories, each expanding to its subpages. Made, implemented, remade smartphone-friendly. Hurray.
I thought this was an SEO principle: ensuring good site navigation and good internal linking. Not forcing your users to hit "back". Not forcing your users to jump through hoops.
But unless I've misread http://www.seomoz.org/blog/how-many-links-is-too-many, this is something that's indirectly penalised by Google, because a site with 70 links from its homepage lets each subpage inherit only about 1/80th of its PageRank.
So it's good site navigation vs. subpages that are invisible on Google.
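For reference, the dilution being described falls out of the classic simplified PageRank model, in which a page passes a damped share of its rank evenly across its outbound links. A quick sketch with illustrative numbers (Google's real computation is far more involved):

```python
# Simplified PageRank model: a page passes d * PR / outlinks to each
# page it links to, where d is the damping factor (commonly 0.85).
def share_per_link(page_rank: float, outlinks: int, d: float = 0.85) -> float:
    """Rank passed along each individual link under the classic model."""
    return d * page_rank / outlinks

# Homepage with 70 navigation links: each subpage gets ~1.2% of its rank.
flat = share_per_link(1.0, 70)

# Add 10 more links (footer, legal pages, etc.) and each share shrinks.
diluted = share_per_link(1.0, 80)
```

Every extra link on the page divides the same damped budget further, which is all the "too many links" warning is really pointing at.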
-
I think pretty much any JavaScript menu would be obfuscated from Google... but that's a bit of a grey-hat approach. I've been reading some interesting articles about PageRank sculpting and whether it's still possible.
-
James,
I'm with Mat. I believe in user experience. There is a way around the "too many links" warning: you need a developer who understands jQuery, and the issue gets resolved that way. I had an insane number of "too many links" warnings; we now have fewer than 100.
Chad
-
If that is what makes sense then do it.
Adding a second tier of structure would let you direct more rank to certain areas (tier-1 pages all get an equal share of rank, so putting more links in some categories than others would push more PageRank towards those). However, the overall effect is that less rank reaches the bottom pages.
Personally, I would put user experience first. If linking to all 70 pages in the menu makes sense, then do that. What is the point in ranking a site that people don't want to use because it's a pain to navigate? Then use cross-linking to add greater emphasis to the pages you want to reinforce.
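To put rough numbers on that trade-off, here's a sketch under the simplified PageRank model (0.85 damping factor per hop; the 7x10 tier split is hypothetical):

```python
D = 0.85  # damping factor applied at each hop in the simplified model

# Flat navigation: homepage links to all 70 pages directly (one hop).
flat_share = D * 1.0 / 70

# Tiered navigation: homepage links to 7 category pages, and each
# category page links on to 10 bottom pages (two hops).
tier1_share = D * 1.0 / 7
tiered_share = D * tier1_share / 10

# Damping is applied twice over two hops, so each bottom page ends up
# with less rank than in the flat layout -- the cost of the extra tier.
```

The upside, as noted above, is control: making some categories larger or smaller than others steers more of that rank towards the areas you want to reinforce.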
Related Questions
-
Google indexed "Lorem Ipsum" content on an unfinished website
Hi guys. So I recently created a new WordPress site and started developing the homepage. I completely forgot to disallow robots to prevent Google from indexing it and the homepage of my site got quickly indexed with all the Lorem ipsum and some plagiarized content from sites of my competitors. What do I do now? I’m afraid that this might spoil my SEO strategy and devalue my site in the eyes of Google from the very beginning. Should I ask Google to remove the homepage using the removal tool in Google Webmaster Tools and ask it to recrawl the page after adding the unique content? Thank you so much for your replies.
Intermediate & Advanced SEO | | Ibis150 -
Best way to "Prune" bad content from large sites?
I am in the process of pruning my sites for low-quality/thin content. The issue is that I have multiple sites with 40k+ pages and need a more efficient way of finding the low-quality content than looking at each page individually. Is there an ideal way to find the pages that are worth noindexing that will speed up the process but not potentially harm any valuable pages? The current plan of action is to pull data from analytics: if a URL hasn't brought any traffic in the last 12 months, then it is safe to assume it is a page that is not beneficial to the site. My concern is that some of these pages might have links pointing to them and I want to make sure we don't lose that link juice. But, assuming we just noindex the pages, we should still have the authority pass along... and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Any recommendations on the best way to prune content efficiently on sites with hundreds of thousands of pages? Also, is there a benefit to noindexing the pages vs. deleting them? What is the preferred method, and why?
Intermediate & Advanced SEO | | atomiconline0 -
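As a starting point for the analytics-driven plan described above, here's a minimal sketch that filters a traffic export for zero-traffic URLs. The CSV column names (`url`, `sessions`) are assumptions; adjust them to whatever your analytics tool actually exports:

```python
import csv

def pages_to_prune(analytics_csv: str, min_sessions: int = 1) -> list:
    """Return URLs whose session count is below the threshold.

    Assumes a CSV export with 'url' and 'sessions' columns covering
    the last 12 months -- rename to match your analytics export.
    """
    prune = []
    with open(analytics_csv, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["sessions"]) < min_sessions:
                prune.append(row["url"])
    return prune
```

Before noindexing anything on the resulting list, it's worth cross-checking it against a backlink export so that pages with external links pointing at them get a second look.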
"Unnatural links to your site" manual action by Google
Hi, My site has been hit by an "Unnatural links to your site" manual action penalty and I've just received a decline on my 2nd reconsideration request, after disavowing even more links than I did in the first request. I went over all the links to my site in WMT with an SEO specialist and we both thought things had been resolved, but apparently they weren't. I'd appreciate any help with lifting the penalty and getting my site back to its former rankings; it ranked well before, and the timing couldn't have been worse. Thanks, Yael
Intermediate & Advanced SEO | | ishais0 -
Pagination and matching title tags - does it matter when using rel="prev" and "next" attributes?
I'm looking at a site with the rel="prev" and "next" HTML attributes in place, to deal with pagination. However, the pages in each paginated category have identical page titles - is this an issue? Rand gives an example of how he'd vary page titles here, to prevent problems, though I'm not entirely sure whether this advice applies to sites with the rel="prev" and "next" HTML attributes in place: https://moz.com/blog/pagination-best-practices-for-seo-user-experience Any advice would be welcome - many thanks, Luke
Intermediate & Advanced SEO | | McTaggart0 -
Does Disavowing Links Negate Anchor Text, or Just Negates Link Juice
I'm not so sure that disavowing links also discounts the anchor text from those links, because nofollow links absolutely still pass anchor text values, and disavowing links is supposed to be akin to nofollowing them. I wonder because there's a potential client I'm working on an RFP for: they have tons of spammy directory links, all using keyword-rich anchor text, and they lost 98% of their traffic in Penguin 1.0 and haven't recovered. I want to know what I'm getting into. And if I just disavow those links, I'm thinking it won't help the anchor text ratio issues. Can anyone confirm?
Intermediate & Advanced SEO | | MiguelSalcido0 -
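For what it's worth, the mechanics of the disavow file itself are straightforward, even though whether Google also discounts the anchor text of disavowed links isn't officially documented. A minimal sketch for generating the file (the `domain:` prefix and `#` comment lines follow the documented disavow format; the domain list here is made up):

```python
def build_disavow_file(spam_domains, path="disavow.txt"):
    """Write a disavow file: '#' comment lines plus 'domain:' lines."""
    lines = ["# Spammy directories with keyword-rich anchor text"]
    lines += [f"domain:{d}" for d in sorted(set(spam_domains))]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines

# Hypothetical domains pulled from a link audit:
entries = build_disavow_file(["spam-directory.example", "links.example"])
```

Using `domain:` entries rather than individual URLs catches every link from a spammy directory, which matters when the same site links from hundreds of paginated pages.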
Best Way to Fix Duplicate Content Issues on Blog If URLs are Set to "No-Index"
Greetings Moz Community: I purchased an SEMrush subscription recently and used it to run a site audit. The audit detected 168 duplicate content issues, mostly relating to blog post tags. I suspect these issues may be due to canonical tags not being set up correctly. My developer claims that since these blog URLs are set to "noindex", these issues do not need to be corrected. My instinct would be to avoid any risk with potential duplicate content and to set up canonicalization correctly. In addition, even if these pages are set to "noindex", they are passing PageRank. Furthermore, I don't know why a reputable company like SEMrush would report these errors if in fact they are not errors. So my question is: do we need to do anything with the error pages if they are already set to "noindex"? Incidentally, the site URL is www.nyc-officespace-leader.com. I am attaching a copy of the SEMrush audit. Thanks, Alan
Intermediate & Advanced SEO | | Kingalan10 -
Link Juice + multiple links pointing to the same page
Scenario
The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us. Additionally, within the body content we write about various shoe types. We create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes. In this simple example, we have 2 instances of the same link pointing to the same URL location. We have 4 unique links. In total we have 5 on-page links.
Question
How many links would Google count as part of the link juice model? How would the link juice be weighted in terms of percentages? Would changing the anchor text in the body content to say "fashion shoes" have a different impact? Any other advice or best practice would be appreciated. Thanks, Mark
Intermediate & Advanced SEO | | Mark_Ch0 -
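Under the textbook PageRank model, the arithmetic for the scenario above looks like this. Note that how Google actually consolidates duplicate links and their anchor text is not publicly documented, so this is only the simplified model:

```python
from collections import Counter

# Classic model: every link instance passes an equal share, so a URL
# linked twice from the page receives a double share.  (Google's real
# handling of duplicate links/anchors is not publicly documented.)
def rank_shares(link_targets, d=0.85, page_rank=1.0):
    per_link = d * page_rank / len(link_targets)
    return {url: n * per_link for url, n in Counter(link_targets).items()}

# The scenario from the question: 5 on-page links, 4 unique targets,
# with /shoes linked from both the menu and the body copy.
on_page_links = ["/", "/shoes", "/about", "/contact", "/shoes"]
shares = rank_shares(on_page_links)  # /shoes gets 2/5 of the passed rank
```

In this model the second link to /shoes does add weight; whether its anchor text is also counted (versus only the first link's anchor) is exactly the kind of detail Google has never confirmed.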
Spammy Link Profile Questions. What do you think?
I'm trying to dilute the link profile for a website, but I have a couple of questions on the best way to achieve this. Current link profile, www.mysitename.com Keyword 1 Keyword 2 Keyword 3 Keyword 4 Keyword 5 Keyword 6 Keyword 7 Keyword 8 Keyword 9 Keyword 10 Keyword 12 Keyword 13 Keyword 14 Keyword 14 Keyword 15 mysitename.com Desired link profile, www.mysitename.com mysitename.com www.mysitename.com http://www. mysitename.com/ My Site Name http://mysitename.com Click Here my site name More Info mysitename.com/ www.mysitename.com/ Keyword 1 Keyword 2 Keyword 3 Keyword 4 Keyword 4 Keyword 5 Questions 1. Do you think Google looks at this on a domain level, or does this need to be done with every page on the site? 2. What would be a good way to build links to these pages quickly? I need to build lots of links to dilute the profile. I was considering Dripable or a similar service, but decided I really don't want to create more spam. What would you do? 3. What would you say the threshold for anchor text is? I have read from different sources that at least 40% - 60% of links should be branded, URL, or generic anchor links. Do you think this is accurate?
Intermediate & Advanced SEO | | 858-SEO0