Is it still a rule that Google will only index pages up to three tiers deep? Or has this changed?
-
I haven't looked into this in a while; it used to be that you didn't want to bury pages more than three clicks from the main page. What is the rule now for getting deep pages indexed?
-
Great, thanks everybody.
-
Google prioritizes by the importance content plays on your site (i.e., how prominent it is in your navigation and hierarchy), but given time ... they crawl as much of your site as possible.
So the short answer is no from a crawling standpoint, but from a ranking standpoint ... it's a serious consideration. Of course, if you link to all your pages just to push them up, then it's a nightmare for visitors and you dilute the PageRank flow to the key pages, so it's a balancing act.
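To put a rough number on that dilution point, here is a toy sketch (invented site structures and a simplified PageRank, nothing like Google's actual calculation): it compares a homepage that links to every page against one that links only to a couple of key hub pages.

# Toy PageRank over two invented link graphs, to illustrate how linking to
# everything from the homepage dilutes the share flowing to key pages.
def pagerank(graph, damping=0.85, iterations=50):
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, links in graph.items():
            if links:
                share = damping * rank[page] / len(links)
                for target in links:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Structure A (flat): homepage links to 2 key pages plus 20 minor pages.
flat = {"home": ["key1", "key2"] + [f"minor{i}" for i in range(20)]}
for page in list(flat["home"]):
    flat[page] = ["home"]  # every page links back to the homepage

# Structure B (tiered): homepage links only to the key pages; minor pages sit under key1.
tiered = {"home": ["key1", "key2"],
          "key1": ["home"] + [f"minor{i}" for i in range(20)],
          "key2": ["home"]}
for i in range(20):
    tiered[f"minor{i}"] = ["key1"]

for name, graph in (("flat", flat), ("tiered", tiered)):
    print(name, "key1 score:", round(pagerank(graph)["key1"], 4))

In the flat version the key page gets roughly a 1/22 share of whatever the homepage passes along; in the tiered version it gets half of it, plus whatever the minor pages send back up.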
-
That's still a good rule of thumb. The easier a page is to get to, the more likely it is that Google will index and rank it.
In Rand's Visual Guide to Keyword Targeting and On-Page Optimization he suggests that a well-optimized page should be reachable in no more than 4 clicks from any other page on the site. I would assume that means a well-optimized page should be within 3 clicks of the homepage.
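If you want to audit this on your own site, one rough approach is a breadth-first crawl from the homepage that records how many clicks each internal URL takes to reach. A minimal stdlib sketch (the start URL is a placeholder; a real crawl would also need politeness delays, robots.txt handling and better URL normalization):

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import urllib.request

START = "https://www.example.com/"  # placeholder homepage

class LinkParser(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def click_depths(start, max_pages=200):
    """Breadth-first crawl recording the click depth of each internal page."""
    site = urlparse(start).netloc
    depths = {start: 0}
    queue = deque([start])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="ignore")
        except OSError:
            continue
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == site and absolute not in depths:
                depths[absolute] = depths[url] + 1
                queue.append(absolute)
    return depths

# Flag anything deeper than three clicks from the homepage.
for page, depth in sorted(click_depths(START).items(), key=lambda kv: kv[1]):
    if depth > 3:
        print(depth, page)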
-
In my experience, Google indexes everything on your site over time. The logic is to place your most important pages in the upper tiers, because Googlebot won't get to everything at once.
Related Questions
-
Need only tens of pages to be indexed out of hundreds: Robots.txt is Okay for Google to proceed with?
Hi all, We have 2 subdomains with hundreds of pages, of which we only need 50 important pages to get indexed. Unfortunately the CMS of these subdomains is very old and does not support deploying a "noindex" tag at the page level. So we are planning to block the entire sites in robots.txt and allow only the 50 pages needed. But we are not sure if this is the right approach, as Google has been suggesting to rely mostly on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file. Thanks
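For what it's worth, Google does support Allow rules alongside a blanket Disallow, with the most specific (longest) matching rule winning. Below is a rough sketch of that pattern plus a quick sanity check using Python's urllib.robotparser; the paths and hostname are made up, and keep in mind (as the question notes) that robots.txt controls crawling, not indexing.

import urllib.robotparser

# Hypothetical rules: carve out the pages that must stay crawlable, then block the rest.
# Allow lines come first so order-sensitive parsers agree with Google's longest-match logic.
ROBOTS_LINES = [
    "User-agent: *",
    "Allow: /important-page-1/",
    "Allow: /important-page-2/",
    "Allow: /landing/key-offer/",
    "Disallow: /",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_LINES)

# Spot-check a few URLs before deploying (hostname and paths are placeholders).
for url in [
    "https://sub1.example.com/important-page-1/",
    "https://sub1.example.com/some-old-cms-page/",
]:
    print(url, "->", "crawlable" if rp.can_fetch("*", url) else "blocked")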
Algorithm Updates | vtmoz
-
How to hide our duplicate pages from SERP? Best practice to increase visibility to new pages?
Hi all, We have 4 pages in total about the same topic and similar keywords. These pages are on our main domain and on subdomains too. As the subdomain pages are years old and have been receiving visits from the SERP, they stick to 1st position. But we have recently created new pages on our main domain which we are expecting to rank in 1st position. I am planning to hide the subdomain pages from the SERP using "Remove URLs" for some days to increase visibility for the new pages on the main domain. Is this the right and best practice to proceed with? Thanks
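One alternative worth noting: the Remove URLs tool is generally a temporary hiding measure rather than a way to pass ranking signals. A more common pattern for consolidating duplicates is a cross-domain rel=canonical (or a 301 redirect) from each old subdomain page to its new main-domain equivalent. A minimal sketch with made-up URLs, placed in the head of the old subdomain page:

<link rel="canonical" href="https://www.example.com/new-topic-page/">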
Algorithm Updates | vtmoz
-
Best and easiest Google Depersonalization method
Hello, Moz hasn't written anything about depersonalization for years. This article has methods, but I don't know if they are valid anymore. What's an easy, effective way to depersonalize Google search these days? I would just log out of Google, but that shows different ranking results than Moz's rank tracker for one of our main keywords, so I don't know if that method is correct. Thanks
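One rough, low-tech sketch: search from a private/incognito window and pin the locale explicitly in the URL. The snippet below only builds such a URL; gl, hl and num are standard Google search parameters, while pws=0 is an older "personalization off" parameter that Google may no longer honor, so treat that part as an assumption.

import urllib.parse

params = {
    "q": "your main keyword",  # placeholder query
    "gl": "us",                # country to emulate
    "hl": "en",                # interface language
    "num": "20",               # results per page
    "pws": "0",                # historical "personalized web search off" flag; may be ignored today
}
print("https://www.google.com/search?" + urllib.parse.urlencode(params))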
Algorithm Updates | BobGW
-
Can Google penalize a country keyword
Hello again guys, Thank you for your previous help with www.kids-academy.co.uk - we are slowly getting there! I wanted to ask something I cannot seem to find an answer to: can Google penalize you by country? By this I mean:
Search term "Nursery franchise UAE" - Page 1
Search term "Nursery franchise UK" - Nowhere to be found!
The page in question (well, a section of the site) has been optimised for the UK; however, as they do have a sister site in the UAE, it mentions those areas too. The pages I have been working on are now ranking reasonably well to say there is a long way to go, but for long-tailed keywords NOT including anything to do with the UK. There are no naughty backlinks with anchor text to do with the UK, the server is hosted in the UK, and it is a .co.uk URL (no geotagging, but I would like to know if this is of any use with this type of URL; everything says no, but it can't harm, can it?) - is it possible Google, due to bad practices in the past, has slapped a penalty on the specific keyword area? Not something I have come across previously, but I am scratching my head over here! Time for a brew break 😄 Thanks in advance guys! Leanne
Algorithm Updates | LeanneSEO
-
Drop in Page Indexing, Small rise in Search Queries
Hello, I have a news-based website, so I am creating multiple new posts daily. I changed a lot of the site and got rid of old, potentially duplicate content back in Feb, and had a sharp drop in pages indexed. I know this was because I removed a lot of pages, though. However, I still have a good 20,000+ pages on my site, and my indexing has dropped a further three times since then: from 9,000 to 2,000 a couple of months ago, and then slowly down since April to just 133. It doesn't seem to have affected my search queries yet, but surely it will if it continues. I am really confused as to how this might happen and how to turn it around. We don't use any dodgy SEO tricks either.
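One way to start narrowing this down is to walk your own sitemap and flag URLs that return something other than a clean 200, or that carry a noindex header, since either would explain pages dropping out. A rough stdlib-only sketch (the sitemap URL is a placeholder; meta-robots tags inside the HTML would need a separate GET-and-parse pass):

import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    urls = [loc.text for loc in ET.fromstring(resp.read()).findall(".//sm:loc", NS)]

for url in urls[:100]:  # sample only; a 20,000+ page site needs batching and crawl delays
    try:
        with urllib.request.urlopen(urllib.request.Request(url, method="HEAD")) as r:
            status, robots_header = r.status, r.headers.get("X-Robots-Tag", "")
    except urllib.error.HTTPError as e:
        status, robots_header = e.code, ""
    except OSError:
        continue
    if status != 200 or "noindex" in robots_header.lower():
        print(status, robots_header or "-", url)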
Algorithm Updates | luwhosjack
-
Domain Change: Leave The Old Domain Homepage Up
We are going to be redesigning our website and switching to a new domain. I think we will set up a permanent 301 redirect from each page of the old domain to a page on the new domain. We would like to leave the old domain homepage up with all content removed and have a link pointing to the new domain. Is there any SEO harm to leaving the old domain homepage up? Thank you! Jessie
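A rough sketch of what the page-to-page redirects could look like in the old domain's Apache .htaccess (domain names are placeholders, and this assumes the new site keeps the same URL paths; otherwise you would map URLs individually). The (.+) pattern requires at least one character in the path, so the old homepage itself is deliberately left un-redirected, matching the plan above:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
# Redirect every path except the bare homepage to the same path on the new domain
RewriteRule ^(.+)$ https://www.new-domain.com/$1 [L,R=301]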
Algorithm Updates | JessieT
-
Google Dropped 3,000+ Pages due to 301 Moved !! Freaking Out !!
We may be the only people stupid enough to accidentally prevent the Google bot from indexing our site. In our htaccess file someone recently wrote the following statement:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
It's almost funny because it was a rewrite that rewrites back to itself... We found in webmaster tools that the site was not able to be indexed by the Google bot due to not detecting the robots.txt file. We didn't have one before, as we didn't really have much that needed to be excluded. However, we have added one now, for kicks really. The robots.txt file, though, was never the problem with regard to the bot accessing the site. Rather, it was the rewrite statement above that was blocking it. We tested the site, not knowing what the deal was, so we went under webmaster tools, then Health, and then selected "Fetch as Google" to fetch the website. This was our way of manually requesting the site be re-indexed so we could see what was happening. After doing so we clicked on status and it provided the following:
HTTP/1.1 301 Moved Permanently
Content-Length: 250
Content-Type: text/html
Location: http://www.mystie.com/
Server: Microsoft-IIS/7.5
MicrosoftOfficeWebServer: 5.0_Pub
MS-Author-Via: MS-FP/4.0
X-Powered-By: ASP.NET
Date: Wed, 22 Aug 2012 02:27:49 GMT
Connection: close
301 Moved Permanently - The document has moved here.
We changed the screwed-up rewrite mistake in the htaccess file that found its way in there, but now our issue is that all of our pages have been severely penalized with regard to where they are now ranking compared to just before the incident. We are essentially freaking out because we don't know the real consequences of this, and if, or how long, it will take for certain pages to regain their prior ranks. Typical pages went down anywhere between 9-40 positions on high-volume search terms. So, to say the least, our company is already discussing the possibility of fairly large layoffs based on what we anticipate with regard to the drop in traffic. This sucks because these are people's lives, but then again a business must make money, and if you sell less you have to cut the overhead, and the easiest one is payroll. I'm on a team with three other people that I work with to keep the SEO side up to snuff as much as we can, and we sell high-ticket items, so the potential effects if Google doesn't restore matters could be significant. My question is: what would you guys do? Is there any way we can contact Google about such a matter? If you can, I've never seen such a thing. I'm sure the pages that are missing from the index now might make their way back in, but what will their rank look like next time, and with that type of rewrite, has it permanently affected every page site-wide, including those that are still in the index but severely affected by the incident? Would love to see things bounce back quick, but I don't know what to expect and neither do my counterparts. Thanks for any speculation, suggestions or insights of any kind!!!
Algorithm Updates | David_C
-
Do you think Google is destroying search?
I've seen garbage in google results for some time now, but it seems to be getting worse. I was just searching for a line of text that was in one of our stories from 2009. I just wanted to check that story and I didn't have a direct link.
So I did the search and I found one copy of the story, but it wasn't on our site. I knew that it was on the other site as well as ours, because the writer writes for both publications.
What I expected to see was the two results, one above the other, depending on which one had more links or better on-page for the query. What I got didn't really surprise me, but I was annoyed. In #1 position was the other site. That was OK by me, but ours wasn't there at all. I'm almost used to that now (not happy about it and trying to change it, but not doing well at all, even after 18 months of trying).
What really made me angry was the garbage results that followed. One site, a wordpress blog, has tag pages and category pages being indexed. I didn't count them all but my guess is about 200 results from this blog, one after the other, most of them tag pages, with the same content on every one of them. Then the tag pages stopped and it started with dated archive pages, dozens of them. There were other sites, some with just one entry, some with dozens of tag pages. After that, porn sites, hundreds of them. I got right to the very end - 100 pages of 10 results per page.
That blog seems to have done everything wrong, yet it has interesting stats. It is a PR6, yet Alexa ranks it 25,680,321. It has the same text in every headline. Most of the headlines are very short. It has all of the category and tag and archive pages indexed. There is a link to the designer's website on every page. There is a blogroll on every page, with links out to 50 sites. None of the pages appear to have a description. There are dozens of empty H2 tags and the H1 tag is 80% through the document. Yet google lists all of this stuff in the results.
I don't remember the last time I saw 100 pages of results; it hasn't happened in a very long time. Is this something new that google is doing? What about the multiple tag and category pages in results - is this just a special thing google is doing to upset me, or are you seeing it too?
I did eventually find my page, but not in that list. I found it by using site:mysite.com in the search box.
Algorithm Updates | loopyal