SEOMOZ Diagram question
-
Hi,
On this SEOMOZ help page (http://www.seomoz.org/learn-seo/internal-link) the diagram explaining the optimal link structure (image also attached) has me a little confused.
From the homepage, if the bot crawls down the right-hand link first, will it not just hit a dead end where it can't crawl any further and disappear?
Or will it hit the end of the structure, crawl back up to the homepage, follow down another link, and repeat the process until all pages are indexed?
Cheers
-
In a vacuum, yes. However, hopefully you'll be linking in and out anyway. Like most things in SEO, it is good to understand the principle without being a slave to it.
If one area is picking up lots of links then fantastic. You could link back around the site to spread that link equity. Better still - try to ensure it is your money pages that are getting the incoming links!
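To make "spreading link equity" concrete, here is a toy PageRank-style calculation on the pyramid diagram, comparing the pure silo against a version where the bottom pages link back around the site. This is a hypothetical illustration with a simplified damping model, not how Google actually computes rankings:

```python
def pagerank(graph, damping=0.85, iters=50):
    """Iterative PageRank on a dict of page -> list of outbound links."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, links in graph.items():
            if not links:
                # Dead end: spread its rank evenly across all pages
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in links:
                    new[q] += damping * rank[p] / len(links)
        rank = new
    return rank

# Pyramid-shaped silo: home -> two level-2 pages -> two level-3 dead ends
silo = {"home": ["a", "b"], "a": ["a1"], "b": ["b1"], "a1": [], "b1": []}
# Same site, but the level-3 pages link back to the homepage
linked = dict(silo, a1=["home"], b1=["home"])

# Linking back around the site lifts the homepage's share of equity
print(pagerank(silo)["home"] < pagerank(linked)["home"])  # True
```

The exact numbers are meaningless; the point is only that internal links route equity back into circulation instead of letting it pool at dead ends.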
-
Great reply thanks very much, that made sense.
This is the optimal structure SEO-wise, but not the best from a user-experience point of view. What kind of problems would interlinking level 2 cause?
Also, if level 3 on the left somehow picked up lots of inbound links, are you not locking juice into one silo?
I have read a little about this, and Rand mentions interlinking where relevant to unlock some of this juice and pass it about a little across silos.
But then do you not just end up with what you were trying not to do in the first place?
Thanks again for the great reply.
-
Nice question.
Search engine bots are many-headed beasts. When they read a page they will note what links are on that page and add them to their list to crawl. They might then follow several of them (or none at all) and come back later and start with the next URL on their list.
Instead of thinking of the bot like a visitor who is deciding where to go next, think of pouring sand into the top. It'll flow down every connected route.
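The sand analogy corresponds to a breadth-first crawl frontier: note every link on a page, queue the unseen ones, and come back to them later. A toy sketch (the site graph below is a hypothetical stand-in for the pyramid diagram, not a real crawler):

```python
from collections import deque

def crawl(start_url, get_links):
    """Breadth-first crawl: queue every newly seen link and
    return pages in the order they were visited."""
    frontier = deque([start_url])   # URLs waiting to be fetched
    seen = {start_url}              # avoid re-crawling
    order = []
    while frontier:
        url = frontier.popleft()
        order.append(url)
        for link in get_links(url):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return order

# Toy site shaped like the pyramid: home links to two level-2
# pages, each of which links to a level-3 "dead end".
site = {"home": ["a", "b"], "a": ["a1"], "b": ["b1"], "a1": [], "b1": []}
print(crawl("home", lambda u: site.get(u, [])))
# ['home', 'a', 'b', 'a1', 'b1'] -- every page is reached
# even though a1 and b1 link nowhere
```

A dead end stops one "grain of sand", not the crawl: the bot simply resumes from the next URL on its list.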
Related Questions
-
Question About Permalink Showing Up in Search Results
Does Google determine how your permalink shows up in the search results or is that a setting on our end? I noticed most of our competitors have their permalink show up in their snippet results. Ours shows "knowledgebase" instead. I think seeing the keywords in the permalink helps with conversions. https://screencast.com/t/fyFyNaWayajx
Intermediate & Advanced SEO | LindsayE0
-
HTTPS - implementation question
Hello, I am looking at a site on which they haven't 301'd http to https, so each URL is there whether you have http or https at the beginning. Why would a site owner not 301 to https? Is there any logical reason not to use 301? This particular website is simply using a canonical tag to point to the https version of each URL.
Intermediate & Advanced SEO | McTaggart0
-
To subdomain or to subfolder, that is the question.
Hi All, so I have a client that has two restaurants that they want two sites for. Right now they have one site for their two locations, and it ranks pretty well for some bigger keywords for their style of food. With them wanting two sites, I'm struggling over whether we should just build them within one site and use separate folders (restaurant.com/location1 & restaurant.com/location2) with a landing page sending you to each, or whether we should split it into subdomains. The content will be roughly the same and the menus are identical; I think each branch is just owned by a different family member, so they want their own site. I keep leaning towards building it all into one site, but I'm not sure. Any ideas?
Intermediate & Advanced SEO | insitemoz10
-
Local Listing Question
We will be starting local SEO efforts on a medical practice that has 4 locations & 15 doctors at each location (so 60 listings total). I will submit each doctor & each location to InfoGroup, LocalEze, Acxiom & Factual. Also, I will only submit each location (not the doctors) to Google. The problem I'm seeing is that each listing would have the same exact phone number - it all goes to one main routing center. What kind of problems could come of this? Do we need separate phone numbers for each of the four locations (at the very least)?
Intermediate & Advanced SEO | JohnWeb120
-
Htaccess 301 regex question
I need some help with a regex for htaccess. I want to 301 redirect this:
http://olddomain.com/oldsubdir/fruit.aspx
to this:
https://www.newdomain.com/newsubdir/FRUIT
Changes:
- different protocol (http -> https)
- add 'www.'
- different domain (olddomain and newdomain are constants)
- different subdirectory (oldsubdir and newsubdir are constants)
- 'fruit' is a variable (which will contain only letters [a-zA-Z]); is it possible to make 'fruit' UPPER case on the redirect (so 'fruit' -> 'FRUIT')?
- remove '.aspx'
I think it's something like this (placed in the .htaccess file in the root directory of olddomain):
RedirectMatch 301 /oldsubdir/(.*).aspx https://www.newdomain.com/newsubdir/$1
Thanks.
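For what it's worth, RedirectMatch alone cannot change the case of the captured text. Apache's mod_rewrite can, via the built-in int:toupper RewriteMap, but that map has to be declared in the server or virtual-host config, not in .htaccess. A sketch along those lines, assuming mod_rewrite is enabled and you have access to the server config (the map name `uppercase` is arbitrary):

```apache
# Server/virtual-host config (RewriteMap is not allowed in .htaccess):
RewriteMap uppercase int:toupper

# .htaccess in the root directory of olddomain.com:
RewriteEngine On
RewriteRule ^oldsubdir/([a-zA-Z]+)\.aspx$ https://www.newdomain.com/newsubdir/${uppercase:$1} [R=301,L]
```

If you can't touch the server config, the uppercasing step isn't possible in .htaccess alone; the rest of the redirect (protocol, www, domain, subdirectory, dropping .aspx) works as a plain RewriteRule or RedirectMatch.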
Intermediate & Advanced SEO | scanlin0
-
SEOMOZ crawler is still crawling a subdomain despite disallow
This is for our client with a subdomain. We only want to analyze their main website, as this is the one we want to SEO. The subdomain is not optimized, so we know it's bound to have lots of errors. We added the disallow code when we started and it was working fine: we only saw the errors for the main domain, and we were able to fix them. However, just a month ago the errors and warnings spiked up, and the errors we saw were for the subdomain. As far as our web guys are concerned, the disallow code is still there and was not touched:
User-agent: rogerbot
Disallow: /
We would like to know if there's anything we might have unintentionally changed, or something we need to do, so that the SEOMOZ crawler will stop going through the subdomain. Any help is greatly appreciated!
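As a side note, a quick way to sanity-check how a robots.txt is interpreted is Python's standard-library robotparser. A sketch, parsing the rules above locally (the subdomain URL is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Parse the subdomain's robots.txt rules as given in the question
rp = RobotFileParser()
rp.parse([
    "User-agent: rogerbot",
    "Disallow: /",
])

# rogerbot should be blocked everywhere on the subdomain...
print(rp.can_fetch("rogerbot", "http://sub.example.com/any-page"))   # False
# ...while agents without a matching record are unaffected
print(rp.can_fetch("googlebot", "http://sub.example.com/any-page"))  # True
```

If the live file parses the same way, the problem is more likely the crawler's fetch of robots.txt (redirects, errors, caching) than the rules themselves.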
Intermediate & Advanced SEO | TheNorthernOffice790
-
Archive or no archive?... That is the question!
When running a classified site, what is best practice for what to do with expired ads? Should they stay on the site with a sold stamp perhaps? Or should they be moved to an archive subdomain, with the original URL 301 redirecting to the new archive ad? I'm kinda thinking the second option but I suppose the only issue with this is you would have to have a consistent flow of new ads on the site to prevent categories from getting too thin. Thoughts on this and any other/better solutions would be much appreciated. Thanks.
Intermediate & Advanced SEO | Sayers0
-
Question about "launching to G" a new site with 500000 pages
Hey experts, how are you doing? Hope everything is ok! I'm about to launch a new website; the code is almost done. Totally fresh new domain. The site will have around 500,000 pages, fully optimized internally of course. I've got my tactics to make G "travel" over my site to get things indexed. The problem is: release it in "giant mode", or release it "thin" and increase the pages over time? What do you recommend? Release it to the big G at once and let them find the 500k pages (would they think this is SPAM or something like that)? Or release like 1k/2k per day? Anybody know a good approach to improve my chances of success here? Any word will be appreciated. Thanks!
Intermediate & Advanced SEO | azaiats20