"Link_count" in the SEOMOZ crawl report
-
What is the advisable "Link_count" that you see in the Pro crawl report?
-
Hi NYC,
I am guessing you are referring to the "Too many on-page Links" warning in the crawl report?
The accepted best practice is no more than 100 links on a page, but keep in mind that what you are seeing is a best-practice recommendation, not an "error". So if there is no reasonable way to reduce the number of links without degrading the user experience, you may simply have to accept that the page can't comply with best practice in that case.
If, on the other hand, you see that there are unnecessary links on the page, you can reduce the number and this may even improve the experience for the user.
In general, if the links are predominantly from menus and necessary elements like buy buttons etc, then I would not be too worried.
If the page is full of internal text links trying to pass link juice with anchor text, then this would be something to fix. Two reasons:
- It is not a good experience for the user as it makes the whole page look and feel really "spammy".
- The amount of link juice passed by a link that is one of more than a hundred on the page is so minuscule that it is pointless anyway.
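If you want to audit this yourself rather than wait for the crawl report, a quick sketch using only Python's standard library is below. It counts anchor tags that carry an `href` and flags pages over the 100-link guideline discussed above (the threshold is the best-practice figure, not a hard rule):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href="..."> links in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only count anchors that actually carry an href attribute;
        # a bare <a name="..."> fragment anchor is not a link.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def link_count(html: str, threshold: int = 100) -> tuple[int, bool]:
    """Return (number of links, whether it exceeds the threshold)."""
    parser = LinkCounter()
    parser.feed(html)
    return parser.count, parser.count > threshold

# Example: a tiny page with two real links and one named anchor
html = '<p><a href="/a">one</a> <a href="/b">two</a> <a name="x">anchor</a></p>'
print(link_count(html))  # (2, False)
```

Run against a saved copy of the flagged page, this gives you the same number the crawl report is reacting to, so you can see exactly how far over (or under) the guideline you are.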
Hope that helps,
Sha
Related Questions
-
Both links with ".html" and without are working. Is that a problem?
The default format of my URLs ends with ".html", which I know is not a problem. But both versions, with ".html" and without, are working. Is that a critical problem or not, and how do I solve it?
Technical SEO | Mohamed_Samer
Webmaster tools reporting spurious errors?
For the past three or so months, Webmaster Tools has been reporting 404 errors on my pages... The odd thing is that I can't figure out what it is seeing. Here is an example of a link it claims is a 404: antiquebanknotes/nationalcurrency/rare/1895-Ten-Dollar-Bill.aspx This is strange because it's a malformed URL. It says it's linked from this page: http://www.antiquebanknotes.com/antiquebanknotes/rare/1882-twenty-dollar-bill.aspx which is a URL that doesn't exist. The bolded portion of this URL shouldn't be there. Can anyone give me an idea of what is happening here? Kind regards, Greg
Technical SEO | Banknotes
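One common cause of exactly this pattern (not necessarily what is happening on Greg's site, but worth checking) is a relative `href` that is missing its leading slash: crawlers resolve it against the current page's directory, duplicating a path segment and producing URLs that never existed. A sketch with Python's standard library, using made-up paths rather than the questioner's:

```python
from urllib.parse import urljoin

# A page deep in the site contains <a href="shop/item.html"> instead of
# <a href="/shop/item.html">. The crawler resolves the link relative to
# the page's directory, duplicating the path segment:
base = "http://example.com/shop/page.html"
broken = urljoin(base, "shop/item.html")
print(broken)  # http://example.com/shop/shop/item.html -> a 404

# With the leading slash the link resolves from the site root as intended:
fixed = urljoin(base, "/shop/item.html")
print(fixed)  # http://example.com/shop/item.html
```

Searching the page templates for `href` values that start with a path segment instead of `/` or a full scheme will usually turn up the culprit.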
Handling "legitimate" duplicate content in an online shop.
The scenario: Online shop selling consumables for machinery. Consumable range A (CA) contains consumables w, x, y, z. The individual consumables are not a problem; it is the consumables groups I'm having problems with. The problem: Several machines use the same range of consumables, i.e. the Machine A (MA) consumables page contains the list (CA) with the contents w, x, y, z, and the Machine B (MB) consumables page contains exactly the same list (CA) with contents w, x, y, z. Machine A page = Machine B page = Consumables range A page. Some people will search Google for the consumables by the range name (CA). Most people will search by individual machine (MA consumables, MB consumables etc). If I use canonical tags on the machine consumables pages (MA + MB) pointing to the consumables range page (CA), then I'm never going to rank for the machine pages, which would represent a huge potential loss of search traffic. However, if I don't use canonical tags, then all the pages get slammed as duplicate content. For somebody who owns machine A, a page titled "Machine A consumables" with the list of consumables is exactly what they are looking for, and it makes sense to serve it to them in that format. However, for somebody who owns machine B, it only makes sense for the page to be titled "Machine B consumables", even though the content is exactly the same. The question: What is the best way to handle this from both a user and a search engine perspective?
Technical SEO | Serpstone
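For reference, the canonical tag being weighed up in the question is a single line in the `<head>` of each machine page. The URL here is a purely illustrative placeholder, not the questioner's actual site:

```html
<!-- On both the Machine A and Machine B consumables pages -->
<link rel="canonical" href="https://www.example.com/consumables/range-a/" />
```

As the question itself notes, this consolidates ranking signals onto the range page at the cost of the machine-specific pages ranking in their own right, which is exactly the trade-off being asked about.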
SEOMOZ and non-duplicate duplicate content
Hi all, Looking through the lovely SEOMOZ report, by far its biggest complaint is perceived duplicate content. That's hard to avoid given the nature of eCommerce sites, which ostensibly list products in a consistent framework. Most advice about duplicate content is about canonicalisation, but that's not really relevant when you have two different products being perceived as the same. Thing is, I might have ignored it, but Google ignores about 40% of our sitemap for, I suspect, the same reason. Basically I don't want us to appear "spammy". Actually, we do go to a lot of trouble to photograph and write a little flavour text for each product (in progress). I guess my question is: given over 700 products, why would 300ish of them be considered duplicates and the remaining not? Here is a URL and one of its "duplicates" according to the SEOMOZ report: http://www.1010direct.com/DGV-DD1165-970-53/details.aspx http://www.1010direct.com/TDV-019-GOLD-50/details.aspx Thanks for any help
Technical SEO | fretts
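Crawlers typically flag a pair of pages as near-duplicates when the shared page furniture (header, navigation, footer) dwarfs the unique product copy, which is why products with short descriptions get caught while ones with more flavour text do not. A rough way to eyeball which pairs are at risk, sketched with Python's `difflib` (this is an illustration of the idea, not SEOMOZ's actual similarity algorithm, and the page text here is invented):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1]; higher means the two texts share more content."""
    return SequenceMatcher(None, a, b).ratio()

# Invented template text shared by every product page on a hypothetical shop
boilerplate = "Home | Shop | About | Free delivery on all orders | Copyright 1010"
page_a = boilerplate + " Elegant gold dress watch."
page_b = boilerplate + " Classic dress watch, gold."

# Shared template text dominates, so the full pages look nearly identical
print(round(similarity(page_a, page_b), 2))

# With the template stripped out, only the short product copy is compared
print(round(similarity("Elegant gold dress watch.", "Classic dress watch, gold."), 2))
```

The longer and more distinctive the per-product text, the lower the full-page ratio drops, which matches the advice to keep writing unique flavour text for each product.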
"Spam emails" : ranking drop?
Hello, Is it possible that a website gets penalised by Google because your hosting company blocked you from sending emails? Basically, I got a message from my hosting company saying they were blocking me from sending emails from our server and domain because too many had mistakes or were complained about. The same day, we dropped from 2nd on a keyword to about 600th while still being ranked for other keywords. The drop was for our main keyword. Can the fact that we sent "bad emails" be related to a ranking drop? For the record, those were confirmation emails for account creation; they were legit, not spam. That's off-topic, though.
Technical SEO | EndeR-
We are still seeing duplicate content on SEOmoz even though we have marked those pages as "noindex, follow." Any ideas why?
We have many pages on our website that have been set to "noindex, follow". However, SEOmoz is flagging them as duplicate content. Why is that?
Technical SEO | cmaseattle
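For reference, the tag in question goes in each page's `<head>` (this is a generic snippet, not taken from the questioner's site). Note that a crawler still has to fetch and parse a noindexed page in order to see the directive at all, which is why a third-party crawl tool may still report on such pages even though search engines will keep them out of their index:

```html
<!-- Tells search engines not to index this page, but to follow its links -->
<meta name="robots" content="noindex, follow" />
```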
301 Redirect "wildcard" question
I have been looking at the SEOmoz redirect guide for some advice, but I can't seem to find the answer: http://www.seomoz.org/learn-seo/redirection I have lots of URLs from a previous version of the site that look like the following: sitename.com/-c-25.html?sort=2d&page=1 sitename.com/-c-25.html?sort=3a&page=1 etc. I want to write a redirect so that whenever a URL with "-c-25.html" in it is requested, it redirects to a specified page, regardless of what comes after the question mark. These URLs were created by our previous ecommerce software. The 'c' is for category, and each page of the category created a different URL. I want to do this so I can redirect all of these URLs to the appropriate new category page in a single redirect. Thanks for any help.
Technical SEO | craigycraig
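Assuming an Apache server (the question doesn't say which server is in use), a sketch of the kind of rule being asked for is below; `/new-category-page/` is a placeholder for the real destination. A plain `RedirectMatch` pattern never sees the query string, and mod_alias re-appends the original query string to the target, so mod_rewrite with the `QSD` (query string discard) flag, available in Apache 2.4+, is the usual way to redirect regardless of what comes after the question mark:

```apache
# .htaccess — send any request for -c-25.html, whatever its query string,
# to the new category page with a permanent (301) redirect.
RewriteEngine On
RewriteRule ^-c-25\.html$ /new-category-page/ [R=301,L,QSD]
```

One such rule per old category would map each set of sort/page variations onto its single new category page.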