Hi Amanda,
I don't think the problem is with the tool. The tool is simply reporting what the crawler sees.
Since you said "an actual blog page, which I am attempting to analyze as a dedicated campaign for my blog", my guess is that you have set up a separate campaign for what you think of as the "actual blog" and designated that campaign as a subdomain?
If that is the case, then presumably you also have a campaign set up for what you think of as the main site. So, as Ryan and Brian mentioned, you have two copies of all of your blog content sitting in two different locations on your server, and each copy is being crawled by a different campaign under a different set of URLs.
The 100 links per page limit is a recommended rule of thumb to protect usability and avoid diluting the value of the individual links on the page. So, as both Ryan and Brian advised, it is much more important right now to deal with the duplication issues on your site.
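If you want to confirm the duplication for yourself before fixing it, a quick script can fetch each page at both addresses and compare the responses. This is just a minimal sketch; the example.com hostnames and paths are placeholders, so substitute your own main-site and blog subdomain URLs.

```python
import hashlib
import urllib.request

# Placeholder URL pairs -- replace with your own main-site
# and subdomain addresses for the same blog content.
URL_PAIRS = [
    ("http://www.example.com/blog/", "http://blog.example.com/"),
    ("http://www.example.com/blog/some-post/", "http://blog.example.com/some-post/"),
]

def page_hash(url):
    """Fetch a URL and return a hash of its body, so two copies
    of the same page can be compared without eyeballing them."""
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

for main_url, sub_url in URL_PAIRS:
    if page_hash(main_url) == page_hash(sub_url):
        print(f"DUPLICATE: {main_url} and {sub_url} serve identical content")
    else:
        print(f"differs:   {main_url} vs {sub_url}")
```

Once you've confirmed it, the usual fix is to serve the content from only one location and 301-redirect the other (or point the duplicate at the original with a rel=canonical tag), so the crawler only ever sees one set of URLs.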
Hope that helps,
Sha