Crawl Depth improvements
-
Hi
I'm checking the crawl depth report in Semrush and looking at pages that are 4+ clicks away.
I have a lot of product pages that fall into this category. Does anyone know the impact of this? Will they never be found by Google?
If there is anything in there I want to rank, I'm guessing the course of action is to move the page so it takes fewer clicks to get there?
How important are crawl budget and depth for SEO? I'm just starting to look into this subject.
Thank you
-
Hey Becky,
Those pages will be found by Google as long as you have links pointing to them somewhere on your site. In terms of crawl budget, the greater the page depth, the more time Google needs to spend crawling your site.
However, with proper internal linking you should be able to significantly reduce the number of clicks. So the next step would be adding some internal links with relevant anchor text. After you do this, watch your analytics and let me know if it has any impact.
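If you want to sanity-check the depth numbers outside of Semrush, here is a rough Python sketch (just an illustration, not a Semrush feature) that crawls your site breadth-first from the homepage and reports how many clicks each internal page is from it. The start URL and the page cap are placeholders you would swap for your own site.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder: your homepage
MAX_PAGES = 200                         # keep the sketch small

def crawl_depths(start_url, max_pages=MAX_PAGES):
    """Breadth-first crawl: depth = minimum number of clicks from the homepage."""
    domain = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])

    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that fail to load

        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            # stay on the same site and only record the shallowest path to each page
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)

    return depths

if __name__ == "__main__":
    for url, depth in sorted(crawl_depths(START_URL).items(), key=lambda item: item[1]):
        if depth >= 4:  # the pages a crawl-depth report would flag as 4+ clicks deep
            print(depth, url)
```

Because it's a breadth-first crawl, every new internal link you add from a shallow page immediately shows up as a lower depth for the pages it points to.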
Hope it helps. Cheers, Martin
Related Questions
-
What happens to crawled URLs subsequently blocked by robots.txt?
We have a very large store with 278,146 individual product pages. Since these are all various sizes and packaging quantities of fewer than 200 product categories, my feeling is that Google would be better off making sure our category pages are indexed. I would like to block all product pages via robots.txt until we are sure all category pages are indexed, then unblock them. Our product pages rarely change and have no ratings or product reviews, so there is little reason for a search engine to revisit a product page. The sales team is afraid that blocking a previously indexed product page will result in it being removed from the Google index, and would prefer to submit the categories by hand, 10 per day, via requested crawling. Which is the better practice?
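A side note, not from the original question: before deploying a blanket Disallow, the proposed rule can be dry-run against sample URLs with Python's standard-library robots.txt parser. The /product/ and /category/ paths below are assumptions about the store's URL structure.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule: block product pages, leave category pages crawlable.
# The /product/ and /category/ paths are assumptions about the store's URL structure.
PROPOSED_ROBOTS_TXT = """\
User-agent: *
Disallow: /product/
"""

TEST_URLS = [
    "https://www.example-store.com/category/hex-bolts/",            # should stay crawlable
    "https://www.example-store.com/product/hex-bolt-m8-200-pack/",  # should be blocked
]

parser = RobotFileParser()
parser.parse(PROPOSED_ROBOTS_TXT.splitlines())

for url in TEST_URLS:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED':8} {url}")
```

Worth remembering that Disallow only stops crawling; it does not by itself remove already-indexed product pages from the index, so they can linger there (usually without a snippet) until Google drops them.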
Intermediate & Advanced SEO | AspenFasteners
-
Crawled page count in Search Console
Hi guys, I'm working on a project (premium-hookahs.nl) where I have stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console. History: due to technical difficulties this webshop didn't always noindex filter pages, resulting in thousands of duplicated pages. In reality this webshop has fewer than 1,000 individual pages. At this point we took the following steps to resolve this: noindex the filter pages, exclude those filter pages in Search Console and robots.txt, and canonicalize the filter pages to the relevant category pages. This, however, didn't result in Google crawling fewer pages. Although the implementation wasn't always sound (technical problems during updates), I'm sure this setup has been the same for the last two weeks. Personally I expected a drop in crawled pages, but they are still sky high. I can't imagine Google visits this site 40 times a day. To complicate the situation: we're running an experiment to gain positions on around 250 long-tail searches. A few filters will be indexed (size, color, number of hoses, and flavors) and three of them can be combined. This results in around 250 extra pages. Meta titles, descriptions, h1s, and texts are unique as well. Questions:
- Excluding pages in robots.txt should result in Google not crawling them, right?
- Is this number of crawled pages normal for a website with around 1,000 unique pages?
- What am I missing?
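One common gotcha that might explain the numbers (not stated in the question): if a filter page is disallowed in robots.txt, Googlebot cannot fetch it, so it never sees the noindex or the canonical on that page. A quick way to spot-check a sample of filter URLs is a small script like the sketch below; the domain and the example filter URLs are placeholders.

```python
import requests
from bs4 import BeautifulSoup
from urllib.robotparser import RobotFileParser

SITE = "https://www.example-shop.nl"  # placeholder domain
FILTER_URLS = [                       # hypothetical filter pages to spot-check
    f"{SITE}/hookahs?color=blue",
    f"{SITE}/hookahs?color=blue&hoses=2",
]

# Load the live robots.txt so we can see whether Googlebot may fetch each URL at all.
robots = RobotFileParser()
robots.set_url(f"{SITE}/robots.txt")
robots.read()

for url in FILTER_URLS:
    blocked = not robots.can_fetch("Googlebot", url)

    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    meta = soup.find("meta", attrs={"name": "robots"})
    noindex = bool(meta and "noindex" in meta.get("content", "").lower())

    canonical = soup.find("link", rel="canonical")
    canonical_href = canonical.get("href") if canonical else None

    print(url)
    print(f"  blocked by robots.txt: {blocked}  noindex: {noindex}  canonical: {canonical_href}")
```

If a URL comes back as both blocked and noindexed, the two directives are working against each other: the noindex can only take effect on pages Google is allowed to crawl.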
Intermediate & Advanced SEO | Bob_van_Biezen
-
Crawl budget
I am a believer in this concept: showing Google fewer pages will increase their importance. Here is my question: I manage a website with millions of pages and high organic traffic (lower than before). I believe that too many pages are being crawled. There are pages that I do not need Google to crawl and follow. Noindex,follow does not save the crawl budget mentioned above. Deleting those pages is not possible. Any advice will be appreciated. If I disallow those pages, I am missing out on pages that help my important pages.
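Not part of the question itself, but on a site this large one practical starting point is to measure where crawl budget actually goes by counting Googlebot requests per URL section in the server access log. A rough Python sketch, assuming a combined-format log file named access.log (the filename and the path-bucketing are assumptions):

```python
import re
from collections import Counter
from urllib.parse import urlparse

LOG_FILE = "access.log"  # placeholder: your server's access log, combined log format assumed

# Pull the requested path and the user agent out of each combined-format log line.
LINE_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*" \d{3} .*"(?P<agent>[^"]*)"\s*$')

hits = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        match = LINE_RE.search(line)
        # Note: user-agent strings can be spoofed; a serious audit would also
        # verify Googlebot IPs via reverse DNS.
        if not match or "Googlebot" not in match.group("agent"):
            continue
        # Bucket by first path segment, e.g. /product/, /category/, /search/
        path = urlparse(match.group("path")).path
        first_segment = path.strip("/").split("/")[0]
        bucket = f"/{first_segment}/" if first_segment else "/"
        hits[bucket] += 1

for bucket, count in hits.most_common(20):
    print(f"{count:8d}  {bucket}")
```

Seeing which sections eat the most Googlebot requests makes it easier to decide which ones are worth disallowing despite the internal links they carry.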
Intermediate & Advanced SEO | ciznerguy
-
Need assistance in improving SEO of website
Dear SEO expert, we run the website www.guitarmonk.com. Moz has flagged some errors on our website, especially duplicate content. We would also welcome whatever additional suggestions you think would be good for the website for certain relatively important keywords. Regards
Intermediate & Advanced SEO | Guitarmonk
-
Significant Google crawl errors
We've got a site that, like clockwork, continuously encounters server errors when Google crawls it. Since the end of last year it will go a week fine, then have two straight weeks of a 70%-100% error rate when Google tries to crawl it. During this time you can still put the URL in and go to the site, but spider simulators return a 404 error. Just this morning we had another error message; I did a fetch and resubmit, and magically now it's back. We changed servers in January to GoDaddy because the previous host (Tronics) kept getting hacked. It's built in plain HTML, so I'm wondering if it's something in the code maybe? http://www.campteam.com/
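A quick diagnostic that might narrow this down (just a sketch, not from the thread): request the page once with a browser user-agent and once with a Googlebot user-agent and compare the responses. If the browser gets a 200 while the crawler user-agent gets a 404 or 5xx, the server or a security/firewall rule is treating bots differently, which would match the "works in a browser, fails in spider simulators" symptom.

```python
import requests

URL = "http://www.campteam.com/"  # the site mentioned in the question

# Compare how the server answers a browser versus a crawler user-agent.
USER_AGENTS = {
    "browser":   "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, user_agent in USER_AGENTS.items():
    try:
        resp = requests.get(URL, headers={"User-Agent": user_agent}, timeout=15)
        print(f"{name:10} -> HTTP {resp.status_code}, final URL: {resp.url}")
    except requests.RequestException as exc:
        print(f"{name:10} -> request failed: {exc}")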
Intermediate & Advanced SEO | GregWalt
-
How to improve ranking of deep pages?
While this may sound like an obvious or stupid question at first... let me explain. We are an e-commerce website that sells one type of item nationally; for the sake of an example similar to us, you can think of an e-commerce site that sells movie theater tickets in cities and towns across the country. Our home page ranks very well for the appropriate keywords, and some of our state and city pages rank very well for local searches. However, while some state and city pages rank well for their respective local searches, others rank poorly, with some not even in the top 50 for their respective keywords. My question is that we aren't clear why some pages rank well while others won't, when the competition looks similar for those local searches. And in today's Panda/Penguin era we are unsure how to get more of these state/city pages ranking better. For the record, we are quite strict about on-page SEO, 99% of our 5,600 pages are crawled, and we have minimal SEO errors from the SEOMoz crawls. Can anyone provide some feedback and thoughts?
Intermediate & Advanced SEO | CTSupp
-
Page crawling check after modifications without waiting 7 days
I have made modifications to my site and uploaded them, so I want to check the remaining errors, but Moz crawls the website once every 7 days. Is there any way to check before that? Thank you
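Moz's scheduled crawl aside, a few of the basics can be re-checked immediately with a small script of your own. A rough sketch (the URL list is a placeholder for the pages you changed) that flags non-200 responses, missing titles, and missing meta descriptions:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical list of the pages you just modified; swap in your own URLs.
CHANGED_URLS = [
    "https://www.example.com/",
    "https://www.example.com/some-updated-page/",
]

for url in CHANGED_URLS:
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue

    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    description = soup.find("meta", attrs={"name": "description"})

    issues = []
    if resp.status_code != 200:
        issues.append(f"status {resp.status_code}")
    if not title:
        issues.append("missing <title>")
    if not (description and description.get("content", "").strip()):
        issues.append("missing meta description")

    print(f"{url}: {', '.join(issues) if issues else 'looks OK'}")
```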
Intermediate & Advanced SEO | innofidelity