How can I fix "Too Many On Page Links"?
-
One of the warnings from SEOmoz says that we have "too many on-page links" on a series of pages on my website.
The pages it's flagging are my printing sample pages. I'm assuming that it's because of my left navigation.
You can see an example here: http://www.3000doorhangers.com/door-hanger-design-samples/deck-and-fence-door-hanger-samples/
Any suggestions on how to fix this warning?
Thanks!
-
I had the same problem, and on some pages I still do. As mentioned above, try to reduce low-value links, such as the ones in the footer, which may not receive much attention. Analyse the page with In-Page Analytics and you will see whether those links are worth keeping!
-
How many links does it have? Anything under 300 is nothing to be concerned about, in my experience. Google does recognise the need for some sites - particularly large ecommerce sites - to have lots of links for navigation. As a rule of thumb (imo), if it aids natural customer navigation, it's fine. If it's overtly spammy and a deliberate attempt to curry favour in the SERPs, it's not fine - and apart from anything else, it will ultimately dilute what you're doing by appearing on every page.
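If you want the raw number before deciding whether to worry, a quick way is to parse the page's HTML and tally the anchor tags. A minimal Python sketch (standard library only; it only sees links in the served HTML, not anything injected by JavaScript):

```python
# Count the <a href> links on a single page.
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)


url = "http://www.3000doorhangers.com/door-hanger-design-samples/deck-and-fence-door-hanger-samples/"
html = urlopen(url).read().decode("utf-8", errors="ignore")

counter = LinkCounter()
counter.feed(html)
print(f"{len(counter.hrefs)} links found on {url}")
```

If the total comes back in the low hundreds and most of those links are genuine navigation, that matches the "nothing to worry about" threshold above.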
-
Hi,
I can see that many of the same links appear in both the top dropdown menu and the footer.
You could reduce the number of links on the page by removing the duplicated ones. Personally, I would trim the links in the footer (there's a quick sketch below for spotting the duplicates).
BR,
Ceran
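Following up on the duplicated-links point, here is a small variant of the counting sketch above that lists every href appearing more than once on the page, so you can see exactly what the dropdown menu and footer repeat. Same assumptions: standard library only, served HTML only.

```python
# List hrefs that appear more than once on the page (e.g. menu + footer duplicates).
from collections import Counter
from html.parser import HTMLParser
from urllib.request import urlopen


class HrefCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)


url = "http://www.3000doorhangers.com/door-hanger-design-samples/deck-and-fence-door-hanger-samples/"
collector = HrefCollector()
collector.feed(urlopen(url).read().decode("utf-8", errors="ignore"))

for href, count in Counter(collector.hrefs).most_common():
    if count == 1:
        break  # most_common() is sorted, so everything after this point is unique
    print(f"{count}x {href}")
```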
Related Questions
-
Can I data highlight pages that already have schema?
Hi all, I have pages with schema on them, but there are some gaps. Rather than ask my dev team or wait for the changes to be made, can I use the data highlighting tool in GSC to fill in these gaps? Will it let me add these, and will Google generally consider both the schema and the highlighted data? To note: if I use GSC to highlight data and then test the page in Google's Structured Data Testing Tool, the highlighted data won't show, so I understand it may be difficult to test whether it's working or not. Any advice would be appreciated. Thanks!
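For what it's worth, Data Highlighter data lives only inside Google's systems and adds nothing to the page, which is partly why other testing tools can't see it. If the gaps are small, adding the missing properties directly as JSON-LD is often less effort than it sounds. A hypothetical sketch only - the product, brand and offer values below are placeholders, not from the real pages:

```python
# Hypothetical example of filling schema "gaps" directly in markup:
# build the missing properties and emit them as a JSON-LD block
# that could be pasted into the page template.
import json

patch = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",  # should match what the existing markup already says
    "brand": {"@type": "Brand", "name": "Example Brand"},  # a missing property being filled in
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "GBP",
        "availability": "https://schema.org/InStock",
    },
}

print('<script type="application/ld+json">')
print(json.dumps(patch, indent=2))
print("</script>")
```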
Intermediate & Advanced SEO | KJH-HAC
-
Pages are being dropped from index after a few days - AngularJS site serving "_escaped_fragment_"
My URL is: https://plentific.com/
Hi guys, About us: We are running an AngularJS SPA for property search. Being an SPA and an entirely JavaScript application has proven to be an SEO nightmare, as you can imagine. We are currently implementing the _escaped_fragment_ approach and serving a pre-rendered "_escaped_fragment_" version using PhantomJS. Unfortunately, pre-rendering of the pages takes some time and, even worse, on separate occasions the pre-rendering fails and the page appears to be empty.
The problem: When I manually submit pages to Google using the Fetch as Google tool, they get indexed and actually rank quite well for a few days, and after that they just get dropped from the index. Not lower in the rankings, but totally dropped. Even the Google cache returns a 404.
The question:
1.) Could this be because of serving an "_escaped_fragment_" version to the bots (bear in mind it is identical to the user-visible one)?
2.) Could it be that using an API to get our results leads to us being considered "duplicate content"? And shouldn't that just result in a lower SERP position instead of a drop?
3.) Could this be a technical problem with how we serve the content, or does Google simply not trust sites served this way?
Thank you very much! Pavel Velinov, SEO at Plentific.com
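One pattern that can help with the empty-snapshot part of this: never serve a failed prerender to the crawler at all - return a 503 so it retries later, instead of letting a blank page get indexed and then dropped. A rough sketch only, assuming a Flask front end and a render_with_phantomjs() helper standing in for the real prerender step (both are assumptions, not the actual Plentific setup):

```python
# Serve prerendered snapshots for ?_escaped_fragment_= requests,
# but never hand the crawler an empty page if prerendering failed.
from flask import Flask, abort, request

app = Flask(__name__)


def render_with_phantomjs(path):
    # Placeholder for the real PhantomJS prerender call.
    # Returning "" here simulates a failed / empty prerender.
    return ""


@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def serve(path):
    if "_escaped_fragment_" in request.args:
        snapshot = render_with_phantomjs(path)
        if not snapshot or len(snapshot) < 500:
            # Empty or suspiciously short snapshot: tell the bot to retry later
            # rather than indexing a blank document.
            abort(503)
        return snapshot
    # Normal visitors get the AngularJS application shell.
    return app.send_static_file("index.html")
```

The 500-character cut-off is arbitrary; the point is simply to catch snapshots that obviously didn't render before Googlebot sees them.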
Intermediate & Advanced SEO | emre.kazan
-
How can I prevent duplicate pages being indexed because of load balancer (hosting)?
The site that I am optimising has a problem with duplicate pages being indexed as a result of the load balancer (which is required and set up by the hosting company). The load balancer passes the site through to two different URLs: www.domain.com and www2.domain.com. Somehow, Google has indexed the same URLs twice (which I was obviously hoping it wouldn't) - the first on www and the second on www2. The hosting is a mirror image of each other (www and www2), meaning I can't upload a robots.txt to the root of www2.domain.com disallowing all. Also, I can't add a canonical tag into the website header of www2.domain.com pointing the individual URLs through to www.domain.com, etc. Any suggestions as to how I can resolve this issue would be greatly appreciated!
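Since the two hostnames serve identical code, one way to handle it without touching robots.txt is at the application layer: inspect the Host header on every request and 301 anything that arrives on www2 over to the www equivalent. A sketch of the idea as WSGI middleware - the hostnames are the placeholders from the question, the rest is assumed, and the same logic could live in the web server or load balancer config if you have access to it:

```python
# WSGI middleware sketch: 301-redirect any request that arrives on a
# non-canonical hostname (e.g. www2.domain.com) to the www equivalent,
# preserving path and query string.
CANONICAL_HOST = "www.domain.com"


def canonical_host(app):
    def middleware(environ, start_response):
        host = environ.get("HTTP_HOST", "").split(":")[0].lower()
        if host and host != CANONICAL_HOST:
            scheme = environ.get("wsgi.url_scheme", "http")
            location = f"{scheme}://{CANONICAL_HOST}{environ.get('PATH_INFO', '/')}"
            query = environ.get("QUERY_STRING", "")
            if query:
                location += "?" + query
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        return app(environ, start_response)

    return middleware


def site(environ, start_response):
    # Stand-in for the real application.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"served from the canonical host\n"]


application = canonical_host(site)
```

An absolute rel=canonical pointing at the www URLs in the shared templates gets you much of the same benefit, since both hosts render identical pages.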
Intermediate & Advanced SEO | iam-sold
-
Fix broken external links on noindex, follow pages no one visits?
Would you take the time to fix broken external links on pages of your site that are noindex, follow and that no one ever visits? The only reason to do it would be to present a tidier site to Google, but would it really care if those pages are noindex/follow? The thing that makes it a non-trivial amount of work is that there are hundreds of these on a large site. Do you think Google cares, given they're noindex/follow? I know the safe answer is always to fix everything, but really it has to be weighed against the likely benefit and other projects, with a limited amount of time to work with. Best... Mike
Intermediate & Advanced SEO | 94501
-
Is this the "Google Dance"?
We just did a site redesign, and removed the noindex, etc. about 10 days ago. Over the last 24 hours, I've gotten some of my top keywords on the first page, but now they are gone, a few hours later. I assume this is typical?
Intermediate & Advanced SEO | CsmBill
-
The same "About" page on multiple WordPress microsites
Hello there, I have over 10 WordPress websites that all have the same "About" page because they belong to the same company. I have concerns that this will adversely affect my sites, and I'm looking for the best way to deal with it. I was either going to remove the "About" page with Google Webmaster Tools and robots.txt, or use the canonical meta tag on that page. Any thoughts?
Intermediate & Advanced SEO | SpaMedica
-
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image - definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, so that googlebot-image can still pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and still getting the images picked up... is this possible? Thanks! Leona
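A commonly suggested pattern for exactly this case is to give Googlebot-Image its own explicit group that allows the folder, since without one the image crawler falls back to the generic Googlebot rules and would be blocked too - worth confirming in Google's own robots.txt tester. As a rough local sanity check, Python's robotparser can evaluate a draft (caveat: its matching is cruder than Google's most-specific-group rule - it takes the first group whose name appears in the user-agent string - so keep the Googlebot-Image group first):

```python
# Sanity-check a draft robots.txt against the two user agents in question.
from urllib.robotparser import RobotFileParser

draft = """\
User-agent: Googlebot-Image
Allow: /community/photos/

User-agent: Googlebot
Disallow: /community/photos/
"""

parser = RobotFileParser()
parser.parse(draft.splitlines())

url = "http://domain.com/community/photos/~username~/picture111111.aspx"
for agent in ("Googlebot", "Googlebot-Image"):
    verdict = "allowed" if parser.can_fetch(agent, url) else "blocked"
    print(f"{agent}: {verdict}")
```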
Intermediate & Advanced SEO | HD_Leona
-
Bad neighborhood linking - anyone can share experience how significant it can impact rankings?
SEOMoz community, If you have followed our latest Q&A posts, you know by now that we have been suffering for the last 8 months from a severe Google penalty we are still trying to resolve. Our international portfolio of sports properties has suffered significant ranking losses across the board. While we have been tediously trying to troubleshoot the problem for a while now, we might be onto a hot lead. We realized that one of the properties outside of our key properties - but a site that our key properties link to heavily (100+ outgoing links per property) - seems to have received a significant Google penalty, in the sense that it has been completely delisted from the Google index and has lost all its PageRank (PR4). While we are baffled to see this sort of delisting, we are hopeful that it might be the core of the issues we have experienced, i.e. that our key properties have been devalued due to heavy linking to a bad-neighborhood site. My questions to the community are two-fold: 1.) Can anyone share any experience as to whether a high number of external links to one bad-neighborhood domain can cause significant ranking drops - from being ranked top 3 for a competitive keyword to being ranked around 140? The busted site has a large set of high-quality external links. 2.) If we swap domains, is there any way to port over any link juice, or will the penalty be passed along? If that is the case, I assume the best approach would be to reach out to all the link authorities and have them link to the new domain instead of the busted site? Thanks /Thomas
Intermediate & Advanced SEO | tomypro