"Too Many On-Page Links" Issue
-
I'm being docked for "too many on-page links" on every page of the site, and I believe it's because the drop-down nav has about 130 links in it. We have a few levels of drop-downs, so you can reach any page from the main page. The site is here: http://www.ibethel.org/
Is what I'm doing just bad practice, and should the drop-downs give less information? Or is there something different I should do with the links? Maybe a nofollow on the last tier of the drop-down?
-
Marcus and Alan offered some very good advice.
I've visited your website before and had a good user experience.
I'd only add two suggestions.
The Itineraries tab could link directly (one click) to the Itineraries landing page. It's easy to locate individual names in the left nav and the main events/conferences on the page.
Get Google Page Speed for Firefox. I love the wealth of information and fixes it offers. I like to copy and paste the page results into Notepad. You can see scores for each issue, which helps prioritize what to fix first.
HTH
-
Some great stuff there - thumbs up!
-
While having all those links in the main nav site-wide isn't necessarily helpful to SEO, I routinely deal with clients that have twice as many links in their drop-downs. So I agree with Marcus - it shouldn't be causing you serious problems.
In addition to Marcus's suggestion, I would recommend a couple of things to work on:
1. Sub-navigation
Like you have in the Staff section (http://www.ibethel.org/staff) - by adding sub-navigation (links within a section that appear on each page of that section, and only on those pages) you reinforce content relationships.
2. Referrer Code in URLs - Duplicate Content
You've got links right on the site whose URLs are appended to track the referrer source. On the home page, look at the graphic call-out box (the large graphic with 3 smaller graphics to the right of it). These links point to internal pages but stick "?referrer=front_page" on the end. And since your graphics rotate, every page linked from this area generates its own duplicate-content conflict. While some programmers say it's fine, it's not. You could mitigate this by pushing canonical tags across the entire site; however, best practice is to eliminate this referrer-tracking method completely - Google only uses canonical tags as a signal, and not all search engines support them.
3. Site Speed
I ran a couple of site-speed tests - you've got some serious crawl-speed issues. Your home-page images are huge files, and you take a second hit from lots of JavaScript load time. More than one check came in at over 13 seconds - that's not user speed, it's crawlability by search bots. I highly recommend looking further into that. If it takes search bots too long to crawl a single page, some of the links on that page won't be found during any single crawl - and this alone could be part of your problem. Look to have those images compressed without losing quality, and have your JavaScript files combined and minified.
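As a rough illustration of the server-side half of this, here is a hedged nginx sketch (assuming the site runs, or could run, behind nginx - the config is illustrative, not taken from the actual server):

```nginx
# Compress text assets on the fly so HTML/CSS/JS transfer faster.
gzip on;
gzip_types text/css application/javascript;

# Let browsers and proxies cache the heavy images once they've been compressed.
location ~* \.(jpe?g|png|gif)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```

Combined with actually re-compressing the source images, this cuts both user load time and crawl cost.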
4. Store Issues
Your category pages (http://store.ibethel.org/Childrens-Resources/c236/index.html, for example) have no paragraph-based descriptions, so you're missing an important way to optimize those category pages for SEO.
5. Content Organization
Also in the store - your Books section's organization is a bit messed up, so your breadcrumbs are broken. For example, on this sub-category page, http://store.ibethel.org/Books-Books/c173_25/index.html, the breadcrumb shows Store Home > Catalog > Books > Books when it should be Store Home > Books. And when I follow the top-level "Books" link in the breadcrumb, I get an empty page: http://store.ibethel.org/Books/c173/index.html
6. Store Page Titles
In the store, when a category has more than one page of products and I click through to the 2nd or 3rd page, I get the same page title. Every page title should be unique, so the simplest fix is to append the page number: Music, then Music | Page 2, then Music | Page 3, and so on.
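That pagination rule is trivial to express in a template helper. A minimal sketch in Python (the function name is hypothetical - whatever templating layer the store uses would have its own equivalent):

```python
def page_title(base: str, page: int) -> str:
    """Build a unique <title> for paginated category pages.

    Page 1 keeps the bare category name; later pages get the
    page number appended so every URL carries a distinct title.
    """
    return base if page <= 1 else f"{base} | Page {page}"
```

So page_title("Music", 1) yields "Music", while page_title("Music", 3) yields "Music | Page 3".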
7. Stronger Titles
Your page titles in the store could be much stronger. A page title of "Music" isn't going to let your site rank well, given how competitive that single word is. You'd be better off with titles like:
Christian Music | Online Christian Music Downloads
Or for "Childrens Resources" a better page Title might be:
Christian Parenting | Christian Education Resources
8. Store Home Page Duplicate
Both http://store.ibethel.org/index.html and http://store.ibethel.org/ work, so that's duplicate content. Set up the index.html version with a 301 redirect pointing to http://store.ibethel.org/.
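If the store runs on Apache, that redirect can live in .htaccess via mod_rewrite. A hedged sketch (assuming Apache, and that index.html is the duplicate to retire - adjust for the actual server setup):

```apache
RewriteEngine On
# Only fire on direct browser/bot requests for /index.html, not on the
# internal DirectoryIndex rewrite, so "/" can still serve the content.
RewriteCond %{THE_REQUEST} \s/index\.html[?\s] [NC]
RewriteRule ^index\.html$ http://store.ibethel.org/ [R=301,L]
```

A 301 (rather than 302) passes the duplicate's link equity to the surviving URL.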
-
Hey Blairr
Can you explain the problem a little more, as I can't really see anything wrong with the site?
In your case, the 130 nav links are not the end of the world, as you don't have a massive site (category, sub-category, product, etc.). Also, I have done various searches for your title tag and the name of your church, and you have great visibility in those results: a 1st-place home-page listing, a Google Maps listing, sitelinks, and various other pages. So nothing to worry about there.
I also did a search for a snippet of text from one of your deep testimonial pages - "but I trust it will be encouraging to all. In March" - and what was interesting was that it returned three pages, two of which were from another site called healingherald.org.
So you may find that these pages are not going to rank so well, as this content already exists on another site, which is ranking above you for it.
If you have ranking issues, I would investigate these first and, depending on who owns the content, either remove these duplicate pages from the index or look into using a canonical URL pointing to the other site (or from the other site to you, if it is your content).
Certainly, that is where I would start rather than the navigation: while there is a general 100-links-per-page guideline, I would not obsess about it in the overall hierarchy of this site.
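For reference, the link count behind that guideline is easy to sanity-check yourself. A minimal sketch using only Python's standard-library html.parser (count_links is a hypothetical helper, and 100 is a guideline, not a hard limit):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a> tags that actually carry an href attribute."""

    def __init__(self) -> None:
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count
```

Feed it the raw HTML of any page (e.g. fetched with urllib) and compare the result against the ~100 figure.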
Hope that helps!
Marcus