Crawl Depth improvements
-
Hi
I'm checking the crawl depth report in SEMrush and looking at pages that are 4+ clicks away.
I have a lot of product pages which fall into this category. Does anyone know the impact of this? Will they never be found by Google?
If there is anything in there I want to rank, I'm guessing the course of action is to move the page so it takes fewer clicks to get there?
How important are crawl budget and depth for SEO? I'm just starting to look into this subject.
Thank you
-
Hey Becky,
Those pages will be found by Google as long as links somewhere on your site point to them. In terms of crawl budget, the deeper a page sits, the more time Google needs to spend crawling your site to reach it.
However, with proper internal linking you should be able to significantly lower the number of clicks. So the next step would be adding some links with relevant anchor text. After you do this, watch your analytics and let me know whether it had any impact.
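To make the click-count idea concrete: crawl depth is just the shortest click path from the homepage, which you can model with a breadth-first search over your internal link graph. A minimal sketch (the site structure below is made up for illustration):

```python
from collections import deque

def crawl_depths(link_graph, home="/"):
    """Breadth-first search from the homepage: a page's crawl depth is
    the minimum number of clicks needed to reach it from home."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:  # first visit = shortest click path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Made-up site: one product hangs off a deep category chain,
# the other is linked from a page one click from home.
site = {
    "/": ["/category", "/sale"],
    "/category": ["/category/shoes"],
    "/category/shoes": ["/category/shoes/red"],
    "/category/shoes/red": ["/product-1"],
    "/sale": ["/product-2"],
}
print(crawl_depths(site)["/product-1"])  # 4 clicks deep
print(crawl_depths(site)["/product-2"])  # 2 clicks deep
```

Adding a single link from the homepage (or any depth-1 page) to /product-1 would drop it to depth 1 or 2, which is exactly what the internal-linking advice does.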
Hope it helps. Cheers, Martin
Related Questions
-
Removing a massive number of noindex, follow pages that are not crawled
Hi, We have stackable filters on some of our pages (i.e.: ?filter1=a&filter2=b&etc.). Those stacked-filter pages are "noindex, follow". They were created in order to facilitate the indexation of the items listed on them. After analysing the logs we know that the search engines do not crawl those stacked-filter pages. Would blocking those pages (by loading their links via AJAX, for example) help our crawl rate or not? In other words, does removing links that are already not crawled help the crawl rate of the rest of our pages? My assumption here is that search engines see those links but discard them because those pages are too deep in our architecture, and that by removing them we would help search engines focus on the rest of our pages. We don't want to waste our efforts removing those links if there will be no impact. Thanks
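For what it's worth, if you do decide to keep crawlers off the stacked-filter URLs outright, a robots.txt pattern rule is simpler than reworking the links into AJAX. A sketch only, using the parameter names from the hypothetical example above (Google honors the * wildcard; verify rules in Search Console's robots.txt tester before deploying):

```
# Hypothetical rules matching the ?filter1=a&filter2=b pattern above.
User-agent: *
Disallow: /*?filter
Disallow: /*&filter
```

Note the trade-off: once a URL is blocked this way, crawlers can no longer read its "noindex, follow" tag or follow the links on it.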
Intermediate & Advanced SEO | Digitics
-
Crawled page count in Search Console
Hi guys, I'm working on a project (premium-hookahs.nl) where I've stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console. History: due to technical difficulties this webshop didn't always noindex filter pages, resulting in thousands of duplicated pages. In reality this webshop has fewer than 1,000 individual pages. At this point we took the following steps to resolve this: noindexed the filter pages, excluded those filter pages in Search Console and robots.txt, and canonicalized the filter pages to the relevant category pages. However, this didn't result in Google crawling fewer pages. Although the implementation wasn't always sound (technical problems during updates), I'm sure this setup has been the same for the last two weeks. Personally I expected a drop in crawled pages, but they are still sky high. I can't imagine Google visits this site 40 times a day. To complicate the situation: we're running an experiment to gain positions on around 250 long-tail searches. A few filters will be indexed (size, color, number of hoses, and flavors) and three of them can be combined. This results in around 250 extra pages. Meta titles, descriptions, H1s, and texts are unique as well. Questions: - Excluding pages in robots.txt should result in Google not crawling them, right? - Is this number of crawled pages normal for a website with around 1,000 unique pages? - What am I missing?
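As a quick sanity check on the robots.txt step, Python's built-in robotparser can tell you whether a given URL is blocked by a rule. A minimal sketch (the rule and URLs are hypothetical; note that Python's parser only does plain prefix matching, NOT Google-style * wildcards, so wildcard rules should be tested in Search Console's robots.txt tester instead):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Feed the rules in directly instead of fetching, so the check is reproducible.
rp.parse([
    "User-agent: *",
    "Disallow: /filter/",  # hypothetical prefix rule for illustration
])

print(rp.can_fetch("Googlebot", "https://example.com/filter/size-42"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/shoes"))           # True
```

Remember that a disallowed URL can still sit in the index, and Google can no longer see its noindex or canonical once it's blocked, so the drop in crawled pages and the drop in indexed pages are separate questions.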
Intermediate & Advanced SEO | Bob_van_Biezen
-
How reliable is the link depth info from Xenu?
Hi everyone! I searched existing Q & A and couldn't find an answer to this question. Here is the scenario: The site is: http://www.ccisolutions.com I am seeing instances of category pages being identified as 8 levels deep. For example, this one: http://www.ccisolutions.com/StoreFront/category/B8I This URL redirects to http://www.ccisolutions.com/StoreFront/category/headphones - which Xenu identifies as being only 1 level deep. Xenu does not seem to be recognizing that the first URL 301-redirects to the second. Is this normal for the way Xenu typically reports? If so, why is the first URL indicated to be so much further down in the structure? Is this an indication of site architecture problems? Or is it an indication of problems with how our 301-redirects are being handled? Both? Thanks in advance for your thoughts!
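Xenu's depth number depends on how it treats redirects, so it can help to resolve the chain yourself before trusting the report. A toy sketch (the redirect map is hypothetical, mirroring the URLs in the question):

```python
def resolve_redirects(url, redirect_map, max_hops=10):
    """Follow redirects in a {from: to} map until the chain ends;
    return (final_url, hop_count)."""
    hops = 0
    while url in redirect_map and hops < max_hops:
        url = redirect_map[url]
        hops += 1
    return url, hops

# Hypothetical 301 map mirroring the URLs in the question.
redirects = {
    "http://www.ccisolutions.com/StoreFront/category/B8I":
        "http://www.ccisolutions.com/StoreFront/category/headphones",
}
final, hops = resolve_redirects(
    "http://www.ccisolutions.com/StoreFront/category/B8I", redirects)
print(final)  # the /headphones URL
print(hops)   # 1
```

If a crawler reports the pre-redirect URL at its position in the link structure rather than collapsing it into the target, you get exactly the 8-versus-1 discrepancy described above.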
Intermediate & Advanced SEO | danatanseo
-
Meta noindex and robots.txt - Optimizing Crawl Budget
Hi, Some time ago, a few thousand pages got into Google's index - they were "product pop-up" pages, exact duplicates of the actual product page but as a "quick view". So I deleted them via GWT and also put a meta noindex on these pop-up overlays to stop them being indexed and causing duplicate content issues. They are no longer within the index as far as I can see - I do a site:www.mydomain.com/ajax search and nothing appears. So can I block these off now with robots.txt to optimize my crawl budget? Thanks
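If the pop-up pages really are all out of the index, a single robots.txt rule would stop Googlebot from spending any more budget on them. A sketch, assuming they all live under /ajax as the site: search above suggests (only do this once deindexing is confirmed, since a blocked URL can never have its meta noindex re-read):

```
# Assumes the quick-view pages all live under /ajax,
# as the site:www.mydomain.com/ajax search suggests.
User-agent: *
Disallow: /ajax
```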
Intermediate & Advanced SEO | bjs2010
-
Improvement in rankings, then site disappears from SERPs?
Yesterday I saw significant improvements in rankings, and the first page of results for several keywords appeared to have changed and included my site. Today the results appear to have reverted back and my site is nowhere to be seen. At least before I was in the top 100, but now I'm nowhere?
Intermediate & Advanced SEO | SamCUK
-
How can we improve rankings for category pages
Hi Everyone, I have a dog breeder site I'm working on and I was wondering if I could get some tips and ideas on things to do to help the "category" pages rank better in search engines. Let's say I have "xyz" breed category page which has listings of all dog breeders who offer that particular breed, in this case "xyz". I have certain breeder profile listings which rank higher for those terms that the category page should be ranking for. So I'm guessing Google thinks those breeder profile pages are more relevant for those terms. Especially if well optimized. I know thin content may be my problem here, but one of our competitors dominates the rankings for relevant keywords with no content on their category pages. What do you all suggest?
Intermediate & Advanced SEO | rsanchez
-
How to improve Google Places Account
Hello, A friend of mine has a Google Places account. Her website is http://debidonner.com/ You can see her Places account listed under "life coach scottsdale arizona". I've attached a screenshot of her categories. I just added "NLP Coach" and "Personal Coach" today, so results are only for the other categories. We're looking for ways to improve it. We've had 43 clicks for driving directions, 4 map clicks, and 0 website clicks (how good is that, by the way). Thanks!
Intermediate & Advanced SEO | BobGW
-
I'm pulling my hair out trying to figure out why Google stopped crawling... any help is appreciated
This is going to be kind of long, simply because there is a background to the domain name that is not typical for anybody in the world, really, and I'm not sure if it's possible that it was penalized or ranked lower because of that or not. Because of that, I'm going to include it in the hope that, given the full picture, some nice soul in the world who has more knowledge in this than me sees something or knows something and can point me in the right direction. Our site has been around for a few years. At one point the domain was seized by Homeland Security ICE, and then they had to give it back in Dec., which sparked a lot of the SOPA/PIPA stuff and we became the poster child, so to speak. The site had previously been up since 2008, but due to that whole mess the site was down for 13 months on the dreaded seized server with a scary warning graphic and site title, which quite obviously caused a bunch of 404 errors and who knows what else damage to anything we'd had before that as far as PageRank and incoming links. We had a lot of incoming links from high-quality sites. We were advised upon getting the domain back to pretty much scrap all the old content that was on the site prior and just start fresh... which we did. Googlebot started crawling slowly, but then as we started getting back into the swing of things people started linking to us, some with high PageRank; we were getting indexed quite frequently and ranking high on search results in our niche. Then something happened on March 4th. We had arguably our best day with Google traffic - we'd been linked back by places like Huff Post etc. for content in our niche - and the next day it was literally a freefall. I've attached a screenshot from Webmaster Tools so you can see how drastic it was.
I went crazy trying to figure out what was wrong, searching obsessively through Webmaster Tools looking for any indication of a problem. I searched the site on Google (site:dajaz1.com) and what comes up is page 2, page 3, page 45, page 46. It's also taken to indexing our category and tag pages and even our search pages. I've now set those all to noindex, follow, but when I look at where the Googlebots are on the site, they're on the categories, pages, author pages, and tags. Some of our links are still getting indexed, but doing a search just for our site name, we're ranking below many of the media sites that have written about our legal issues, when a month ago we were at least the top result for our own name. I've racked my brain trying to figure out the issue. I've disabled plugins, and I'm on fetch as Googlebot all the time making sure our stuff is at least coming back as 200 (we had 2 days where we were getting 403 errors due to a Super Cache issue, but once fixed Googlebot returned like it never left). I've literally watched 1000 videos, read 100 forums, added SEO plugins, and tried to optimize the site to the point I'm worried I'm overdoing it... and still they've barely begun to crawl. As you can see, there is some activity in the last 2-3 days, but even after submitting a new sitemap once I changed the theme out of desperation, it's only indexed 16. I've looked for errors all through Webmaster Tools and I can't find anything to tell me why that happened, how to fix it, or how to get Googlebot to like us again. I'm pulling my hair out here. The links we have incoming are high-quality links like Huffington Post, Spin, Complex, etc. Those haven't slowed down at all, and we do outgoing links to sites we trust that are high quality as well. I've got interns working on how they're writing titles and such, I've gone through and attempted to fix duplicate pages and titles, and I've been going through and rewriting meta description tags. What am I missing?
I'm pulling my hair out trying to figure out what the issue is. Eternally grateful for any help provided.
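One way to audit which templates actually carry the noindex tag is to scan the rendered HTML for the meta robots directive using only the standard library. A minimal sketch (the sample markup is made up for illustration):

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Capture the content of a <meta name="robots"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "")

def robots_directive(html):
    """Return the meta robots directive of a page, or None if absent."""
    p = MetaRobotsParser()
    p.feed(html)
    return p.robots

# Made-up category-page markup for illustration.
category_page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(robots_directive(category_page))                  # noindex, follow
print(robots_directive("<html><head></head></html>"))   # None
```

Running this over one URL per template (category, tag, author, search) confirms the noindex actually shipped everywhere it was intended, independent of what any plugin claims.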
Intermediate & Advanced SEO | malady