SEOMOZ Diagram question
-
Hi,
On this SEOMOZ help page (http://www.seomoz.org/learn-seo/internal-link) the diagram explaining the optimal link structure (image also attached) has me a little confused.
From the homepage, if the bot crawls down the right-hand link first, won't it just hit a dead end where it can't crawl any further and disappear?
OR... will it hit the end of the structure, crawl back to the homepage, follow down another link, and then just repeat the process until all the pages are indexed?
Cheers
-
In a vacuum, yes. However, hopefully you'll be linking in and out anyway. Like most things in SEO, it's good to understand the principle without being a slave to it.
If one area is picking up lots of links, then fantastic. You could link back around the site to spread that link equity. Better still, try to ensure it's your money pages that are getting the incoming links!
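To make the "link back around the site to spread equity" point concrete, here's a rough sketch using a simplified PageRank iteration over a tiny hypothetical site graph (not Google's actual algorithm, and the page names are made up). It compares a well-linked page that dead-ends against the same page linking back to the homepage:

```python
# Illustrative sketch only: a simplified PageRank iteration over a
# hypothetical three-page site, showing how linking a well-linked page
# back into the site spreads its equity around.
DAMPING = 0.85

def pagerank(graph, iterations=50):
    """graph: dict mapping page -> list of pages it links to."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, links in graph.items():
            if not links:
                # A dead-end page: its rank leaks evenly to all pages.
                for p in pages:
                    new[p] += DAMPING * rank[page] / len(pages)
            else:
                for target in links:
                    new[target] += DAMPING * rank[page] / len(links)
        rank = new
    return rank

# "money" picks up the inbound equity but links nowhere: a dead end.
dead_end = {"home": ["money", "other"], "money": [], "other": []}
# Same site, but "money" links back to the homepage.
linked_back = {"home": ["money", "other"], "money": ["home"], "other": []}

print(pagerank(dead_end)["home"])
print(pagerank(linked_back)["home"])
```

Under this toy model the homepage ends up with noticeably more rank when the money page links back, which is the intuition behind spreading equity around rather than letting it pool in one spot.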
-
Great reply thanks very much, that made sense.
This is the optimal structure SEO-wise, but from a user-experience point of view it's not the best. What kind of problems would interlinking level 2 cause?
Also, if level 3 on the left somehow picked up lots of inbound links, are you not locking juice into one silo?
I have read a little about this, and Rand mentions interlinking where relevant to unlock some of this juice and pass it about a little across silos.
But then don't you just end up with what you were trying not to do in the first place?
Thanks again for the great reply.
-
Nice question.
Search engine bots are many-headed beasts. When they read a page, they note which links are on that page and add them to their list to crawl. They might then follow several of them (or none at all) and come back later, starting with the next URL on their list.
Instead of thinking of the bot as a visitor deciding where to go next, think of pouring sand into the top. It'll flow down every connected route.
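The "list to crawl" idea above can be sketched as a simple frontier queue. This is a rough illustration (real crawlers are distributed and far more complex), with a hypothetical pyramid-shaped site standing in for the diagram:

```python
# Sketch of a crawl frontier: discovered URLs go on a queue, so every
# page reachable from the start gets visited regardless of which
# branch the bot happens to follow first. The site graph is hypothetical.
from collections import deque

def crawl(site, start):
    """site: dict mapping URL -> list of URLs linked from that page."""
    frontier = deque([start])
    seen = {start}
    visited = []
    while frontier:
        url = frontier.popleft()          # take the next URL on the list
        visited.append(url)               # "fetch" the page
        for link in site.get(url, []):    # note the links on the page
            if link not in seen:
                seen.add(link)
                frontier.append(link)     # queue it to crawl later
    return visited

# A pyramid like the diagram: homepage -> category pages -> detail pages.
site = {
    "/": ["/cat-a", "/cat-b"],
    "/cat-a": ["/a-1", "/a-2"],
    "/cat-b": ["/b-1", "/b-2"],
}

print(crawl(site, "/"))
```

A dead-end page like `/b-2` doesn't strand the bot: the frontier still holds every other URL discovered along the way, so nothing reachable is missed.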