Link Architecture - Xenu Link Sleuth vs. Manual Observation Confusion
-
Hi,
I have been asked to complete some SEO contracting work for an e-commerce store.
The navigation looked a bit messy, so I decided to investigate it first.
a) Manual Observation
Within the catalogue view, I loaded up the page source, hit Ctrl-F, and searched for "href". It turns out there are 750-odd links on this page, and most of the other sub-catalogue and product pages also have about 750 links.
Ouch! My SEO knowledge is telling me this is non-optimal.
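For what it's worth, the same check can be scripted so it doesn't have to be repeated by hand on every page. A rough Python sketch, with a placeholder URL standing in for the real store:

```python
# Rough sketch of the manual check above, scripted so it can be run
# across many pages. The URL below is a placeholder, not the real store.
from html.parser import HTMLParser
from urllib.request import urlopen

class HrefCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Count every <a> tag carrying an href - roughly what Ctrl-F on
        # "href" surfaces in the page source.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

url = "https://example-store.com/catalogue"  # placeholder URL
parser = HrefCounter()
parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
print(f"{url}: {parser.count} links")
```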
b) Link Sleuth
I crawled the site with Xenu Link Sleuth and found 10,000+ pages. I exported the results into Open Calc and ran a pivot table to count the number of pages per 'site level'. The results looked like this:
Level Pages
0 1
1 42
2 860
3 3268
Now this looks more like a pyramid. I think this is because Link Sleuth only reads one 'layer' of the nav bar at a time - it doesn't 'hover' and read the rest of the nav bar (unlike what can be found by searching for "href" in the page source).
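For context, my rough mental model of how a breadth-first crawler ends up with those 'levels' is that each page gets the length of the shortest click path from the start page. A small Python sketch over a made-up link graph, purely for illustration:

```python
# Sketch of how a crawler assigns "levels": breadth-first from the start
# page, so each page's level is the shortest click path to it.
# The toy link graph below is entirely made up.
from collections import deque

links = {                      # page -> pages it links to (hypothetical)
    "home": ["cat-a", "cat-b"],
    "cat-a": ["home", "cat-b", "prod-1", "prod-2"],
    "cat-b": ["home", "cat-a", "prod-3"],
    "prod-1": ["home", "cat-a"],
    "prod-2": ["home", "cat-a"],
    "prod-3": ["home", "cat-b"],
}

levels = {"home": 0}
queue = deque(["home"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in levels:           # first (shortest) path wins
            levels[target] = levels[page] + 1
            queue.append(target)

counts = {}
for level in levels.values():
    counts[level] = counts.get(level, 0) + 1
print(counts)   # e.g. {0: 1, 1: 2, 2: 3}
```

Even when every page carries many outlinks, the per-level counts come out pyramid-shaped, because each page is only counted at its shallowest depth.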
Question: How are search spiders going to read the site? Like in (a) or in (b)?
Thank you!
-
Well, external links to a page account for roughly 80% of its ranking weight, and those have nothing to do with your internal nav link structure. But yes, internal juice will flow 1/750th to each page that the nav structure points to.
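To put rough numbers on that, here's a minimal sketch (Python, a PageRank-style split with a 0.85 damping factor assumed, equity figures made up) of how whatever equity a page holds gets divided across its outlinks:

```python
# Simplified illustration of the "1/750th" point: whatever equity a page
# has, each of its outlinks passes on an equal share (PageRank-style split,
# 0.85 damping factor assumed, numbers made up).
incoming_equity = 1.0     # pretend the catalogue page holds 1 unit of equity
damping = 0.85
outlinks = 750

passed_per_link = damping * incoming_equity / outlinks
print(f"Each of the {outlinks} linked pages receives ~{passed_per_link:.5f}")

# Compare with a leaner nav of, say, 50 links per page:
print(f"With 50 outlinks each would receive ~{damping * incoming_equity / 50:.5f}")
```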
-
From an SEO perspective, what about inbound links to the catalogue page? Won't the link power be spread over 750 links, making the sub-sub-sub-catalogue pages just as powerful as the sub-catalogue pages?
It just seems "the spread of juice" will not be pyramidal.
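To illustrate what I mean, here's a toy comparison (Python, a made-up seven-page site, plain power-iteration PageRank with 0.85 damping - purely illustrative, not our real structure): a nav where every page links to every other page versus a strict pyramid:

```python
# Toy comparison: if every page carries the same full nav, internal equity
# spreads almost evenly; a strict pyramid concentrates it higher up.
# Seven hypothetical pages, plain power-iteration PageRank, 0.85 damping.
def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            share = damping * rank[page] / len(outs)
            for target in outs:
                new[target] += share
        rank = new
    return rank

pages = ["home", "cat-a", "cat-b", "p1", "p2", "p3", "p4"]
flat_nav = {p: [q for q in pages if q != p] for p in pages}  # full nav on every page
pyramid = {
    "home": ["cat-a", "cat-b"],
    "cat-a": ["home", "p1", "p2"],
    "cat-b": ["home", "p3", "p4"],
    "p1": ["cat-a"], "p2": ["cat-a"], "p3": ["cat-b"], "p4": ["cat-b"],
}

for name, graph in [("flat nav", flat_nav), ("pyramid", pyramid)]:
    ranks = pagerank(graph)
    print(name, {p: round(r, 3) for p, r in ranks.items()})
```

In the flat-nav version every page ends up with roughly the same internal score, while the pyramid pushes more of it to the upper levels - which is the "non-pyramidal spread" I'm worried about.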
-
Yikes. 750 links (even if split into sections so you don't see them all at once) on one page is not 'human-friendly' (as I know you know). Is that really necessary? I looked at the site, and having navigation sub-menus that go four levels deep is a usability issue for people who aren't great with a mouse. Maybe look at eBay's menu structure; they certainly have a large database of products in many categories and don't resort to hundreds of links per page just for navigation.
But from an SEO point of view, I'm not sure the 750 links are hurting your on-page ranking for a phrase that the page is otherwise optimized for. If you had a page optimized for "widgets", did all the correct on-page things for 'widgets', and then had external links pointing to the page with anchor text of 'widgets', I'm not sure how much you'd be penalized for having 750 links on that page. Given that on-page factors are only about 20% of the ranking equation anyway, I'm not sure it's a huge deal from an SEO/ranking point of view. It's more of a human usability thing, IMO.