What does it mean that "too many links" show up in my report - but I'm not seeing them?
-
I've noticed that on the crawl report for my site, www.imageworkscreative.com, "too many links" is showing up as a chronic problem.
Reviewing the pages cited as having this issue, I don't see more than 100 links. I've read that websites sometimes unintentionally cloak their links, and I am concerned that this might be happening on my site.
Some example pages from my crawl report are:
http://www.imageworkscreative.com/blog/, http://www.imageworkscreative.com/blog/10-steps-seo-and-sem-success/index.html, and http://www.imageworkscreative.com/blog/business-objectives-vs-user-experience/index.html.
Am I having a cloaking issue or is something else going on here? Any insight is appreciated!
-
Thanks, everyone! I appreciate the help!
-
If you read the guidance in the on-page optimization tool, it is inconsistent with the crawl tool:
"Avoid Excessive Internal Links
Employing an excessive quantity of internal-pointing links may not directly harm the value of a page, but it can influence the quantity of link juice sent through those links and dilute its ability to help get link targets crawled, indexed, and ranked.
Recommendation: Scale down the number of internal links to fewer than 100 (preferably), and, at a minimum, fewer than 300"
That said, the 100-links rule is a "Warning" (yellow) and not an "Error" (red). It is still confusing.
Here is also a Matt Cutts video that refutes the 100-link rule:
http://www.youtube.com/watch?v=l6g5hoBYlf0
Seems like Moz needs to update its messaging around this item.
-
Yeah Mike is right on as usual here.
I just want to point out a quick way to find out how many actual links are sitting on any given page (keep in mind this won't be exact, but it'll be close).
USING CHROME:
- Right click the page and select "View Source"
- Hit CTRL+F
- Type in "<a href"
Boom. You'll have yourself a number of results, and that's how many links you have, cloaked or not, give or take.
This is easier to look at, I feel, and a fun little tip (maybe obvious, sorry if so).
Good luck!
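If you'd rather script that count than eyeball CTRL+F results, here's a rough sketch of the same idea using only the Python standard library. The sample HTML is made up for illustration; in practice you'd feed in the page source you fetched or saved from View Source.

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that actually carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only count anchors with an href; bare <a name="..."> anchors aren't links.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# Hypothetical page source, standing in for the real View Source output.
html = """
<html><body>
  <a href="/blog/">Blog</a>
  <a href="/about/">About</a>
  <a name="anchor-only">Not a link</a>
</body></html>
"""

counter = LinkCounter()
counter.feed(html)
print(counter.count)  # prints 2 for this sample
```

Like the CTRL+F trick, this counts every anchor in the markup, cloaked or not, which is exactly what makes it useful for comparing against a crawl report's number.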
-
Hi Jess,
Using Screaming Frog, it looks like your /blog page actually has 131 links. If you add up your footer (30), plus links to your homepage (6), plus pagination (9), plus Link Building and Content article (5), and your Alex Bogusky Video article (6) - you already have 50+ and that is not including top and side navigation, as well as the rest of the articles on your page.
Matt Cutts sums things up really well in this article saying:
"...Google will index more than 100K of a page, but there’s still a good reason to recommend keeping to under a hundred links or so: the user experience. If you’re showing well over 100 links per page, you could be overwhelming your users and giving them a bad experience. A page might look good to you until you put on your “user hat” and see what it looks like to a new visitor.
But in some cases, it might make sense to have more than a hundred links. Does Google automatically consider a page spam if your page has over 100 links? No, not at all. The “100 links” recommendation is in the “Design and content” guidelines section, and it’s the Quality guidelines that contain the things that we consider webspam (stuff like hidden text, doorway pages, installing malware, etc.). Can pages with over 100 links be spammy? Sure, especially if those links are hidden or keyword-stuffed. But pages with lots of links are not automatically considered spammy by Google.
So how might Google treat pages with well over a hundred links? If you end up with hundreds of links on a page, Google might choose not to follow or to index all those links. At any rate, you’re dividing the PageRank of that page between hundreds of links, so each link is only going to pass along a minuscule amount of PageRank anyway. Users often dislike link-heavy pages too, so before you go overboard putting a ton of links on a page, ask yourself what the purpose of the page is and whether it works well for the user experience."
Hope this helps.
Mike
-
I agree with Linda. It looks like you only have 60 or so hyperlinks, so you should be okay there. But I think there were something like 120 or so @imports.
-
If you look at your source, there are a lot of @import and JavaScript URLs; perhaps this is what is being picked up.
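To check whether that's the case, you can pull the @import rules and script URLs out of the source separately from the real anchors and compare the counts. This is a quick sketch with made-up sample source; the regexes are deliberately loose and won't handle every HTML edge case.

```python
import re

# Hypothetical snippet of page source; in practice you'd paste in the real thing.
source = """
<style>
@import url("/css/base.css");
@import url("/css/blog.css");
</style>
<script src="/js/analytics.js"></script>
<a href="/blog/">Blog</a>
"""

# CSS @import targets, e.g. @import url("/css/base.css");
imports = re.findall(r'@import\s+url\(["\']?([^"\')]+)', source)
# External script URLs, e.g. <script src="...">
script_srcs = re.findall(r'<script[^>]+src=["\']([^"\']+)', source)
# Actual hyperlinks
hrefs = re.findall(r'<a\s[^>]*href=["\']([^"\']+)', source)

print(len(imports), len(script_srcs), len(hrefs))  # 2 imports, 1 script, 1 anchor
```

If the @import plus script count is large while the anchor count is modest, that would support the theory that the crawler is flagging resource URLs rather than hyperlinks.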