"link_count" column in Crawl Diagnostics report
-
On the Crawl Diagnostics report, does "link_count" represent external links (links pointing to this URL), internal links, or both?
-
Rock and roll!
Glad you got it all figured out, Glenn.
Mike
-
OK. I think I get it
For the URL in question, the "link_count", Title, and Meta Description exactly match the custom 404 page, so it looks like there is no real page at this URL. It was picked up in the crawl because a link to it exists on the "referrer" page.
If I get them to correct the referrer page, this should be good.
(First day using SEOMoz Pro)
Thanks very much for your help!
-
The site is returning a custom 404 page with a 200 status. That is why SEOmoz and Screaming Frog are reporting a 200.
You need to configure that page to return a 404 status, or fix the page.
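The behavior described above is often called a "soft 404": the server shows the visitor a "not found" template but still answers with HTTP 200, so crawlers record the page as fine. Here is a minimal, self-contained sketch of the problem; the local server below is a stand-in for a misconfigured site (the URL path is just an illustration), not SEOmoz's or Screaming Frog's actual code.

```python
# Demonstrate a "soft 404": the error template is served with status 200,
# so any crawler that trusts the status code records the page as OK.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

NOT_FOUND_BODY = b"<title>Page not found</title><h1>Page not found</h1>"

class SoftNotFoundHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Misconfigured: the custom error page is sent with status 200
        # instead of 404.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(NOT_FOUND_BODY)

    def log_message(self, *args):
        pass  # silence per-request logging

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), SoftNotFoundHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/home/contact_us.asp" % server.server_port
with urlopen(url) as resp:
    status, body = resp.status, resp.read()
server.shutdown()

print(status)                        # 200, even though the page is "missing"
print(b"not found" in body.lower())  # True: the body is the 404 template
```

Checking both the status code and whether the body matches the known 404 template (as Glenn did with Title and Meta Description) is how you catch these.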
This article will hopefully shed some light on your situation.
Mike
-
Could you give an example of a URL that goes to a 404?
Edit: Never mind, I see one above.
-
Mike - Yep. Screaming Frog also shows a 200 status code, so I have to assume the page exists, although I'm not sure why I'm directed to a 404 page...
So basically, I think you and George answered my original question: "link_count" represents the links on a page pointing to other internal and external pages.
I would appreciate any thoughts on why I'm ending up on a 404 page, though...
-
No, Mike. This is a client's site. An example of these URLs is: http://www.teamflexo.com/home/contact_us.asp, which shows a link count of 43.
Good thought, though; I'll take a look at this in Screaming Frog.
-
Are we talking about your gfwebsoft website that you have listed in your profile?
Using Screaming Frog, the only 404 status codes I am seeing are links on the homepage, contact, costs, about, testimonials, and services pages that point to your Facebook page.
Do you have specific URLs you can share that are 404ing?
Mike
-
If these are on-page links, then I have another question...
I had originally assumed that if a page showed up in Crawl Diagnostics, it must actually exist (as opposed to being a URL in a backlink somewhere). But there are several URLs showing a "link_count" of 40+ that go directly to a 404 page when you visit them, even though the "http_status_code" in the diagnostics report shows 200.
Any theories that could help me understand this?
Tx, Glenn
-
It refers to the number of followed links on the page pointing to other pages on your site or other sites.
Source: Using MozBar to compare numbers.
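A rough sketch of what a metric like this does: parse the page's HTML and count every followed hyperlink, internal and external alike. This is an illustration of the idea using only the standard library, not SEOmoz's actual implementation; the sample HTML and hostnames are hypothetical.

```python
# Count followed <a href> links on a page, split into internal vs external
# by comparing each link's host to the site's own host.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href or "nofollow" in (attrs.get("rel") or ""):
            return  # skip anchors without href and nofollowed links
        host = urlparse(href).netloc
        if host and host != self.site_host:
            self.external += 1
        else:
            self.internal += 1  # relative URLs and same-host URLs

html = """
<a href="/about.asp">About</a>
<a href="http://www.teamflexo.com/home/contact_us.asp">Contact</a>
<a href="http://www.facebook.com/teamflexo">Facebook</a>
<a href="http://example.com/ad" rel="nofollow">Sponsored</a>
"""
counter = LinkCounter("www.teamflexo.com")
counter.feed(html)
print(counter.internal + counter.external)  # 3 followed links in total
```

Under this reading, a "link_count" of 43 simply means the page carries 43 followed links, regardless of where they point.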
-
Hi Glenn,
It looks like those numbers represent the number of hyperlinks (internal and external) on that specific page.
I was able to validate this by taking URLs with a link_count of 100+ and verifying the same numbers in the Too Many On-Page Links report in the SEOmoz Crawl Diagnostics.
Hope this helps.
Mike