Do search engines crawl links on 404 pages?
-
I'm currently in the process of redesigning my site's 404 page. I know there are all sorts of best practices from a UX standpoint, but what about search engines? Since these pages are roadblocks in the crawl process, I was wondering if there's a way to help the search engine continue its crawl.
Does putting links to "recent posts" or something along those lines allow the bot to continue on its way, or does the crawl stop at that point because the 404 HTTP status code is returned in the header response?
-
Okay, thanks Alan!
-
Hi Brad
Sorry I have only just come back to you. It was late night here in the UK, but it looks like Alan has already answered your question.
Have you tested your 404 page with Fetch as Google in Webmaster Tools? You should see that it can read the links on your 404 page, and as such it will continue crawling them, as Alan has said.
So, in my opinion, what is a benefit to a user will also be a benefit to Google crawling your site.
-
Sorry, yes, it should crawl the links. Googlebot has historically done exactly that.
But you can prove it to yourself by doing what I said, and then report back.
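One way to see what a crawler sees is to extract the anchor links from the HTML your 404 template returns. A minimal sketch using only the Python standard library; the sample HTML and paths below are placeholders standing in for your real 404 page body:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags, i.e. the links a crawler would see."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical 404 body; in practice, fetch your real 404 page and inspect
# both the HTML and the status code it returns.
sample_404_html = """
<html><body>
  <h1>Page not found</h1>
  <p>Try one of our recent posts:</p>
  <a href="/blog/post-one">Post one</a>
  <a href="/blog/post-two">Post two</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(sample_404_html)
print(parser.links)  # prints ['/blog/post-one', '/blog/post-two']
```

If the links appear in the served HTML (not injected later by JavaScript), they are visible to the crawler regardless of the status code.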
-
"Yes" as in it will continue crawling, or "yes" as in it will stop the crawl?
-
Yes, it will continue crawling. You can test it by creating a page that is linked from nowhere else except your 404 page, and then check your logs or analytics to see whether the bot reaches it.
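The log check above can be automated. A rough sketch, assuming a canary page at a made-up path `/404-canary-page` that is linked only from the 404 template; the log lines are illustrative common-log-format samples, not real traffic:

```python
# Hypothetical access-log lines; replace with lines read from your server log.
log_lines = [
    '66.249.66.1 - - [10/Mar/2014:06:25:24 +0000] "GET /404-canary-page HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [10/Mar/2014:06:26:01 +0000] "GET /blog/ HTTP/1.1" 200 8192 "-" "Mozilla/5.0"',
]

CANARY = "/404-canary-page"  # assumed path, linked only from the 404 page

def googlebot_hit_canary(lines):
    """Return True if any log line shows Googlebot requesting the canary URL."""
    return any(CANARY in line and "Googlebot" in line for line in lines)

print(googlebot_hit_canary(log_lines))  # prints True
```

A hit on the canary from a Googlebot user agent means the crawler followed a link that only existed on the 404 page. (For rigor you would also verify the IP with a reverse DNS lookup, since user agents can be spoofed.)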
-
Hey Matt,
Thanks for the reply. I'm aware of all the best practice stuff, but thanks for sending it through. It didn't quite answer my question, so let me rephrase...
Will a bot follow a hyperlink (like the example below) on a 404 page, or will it stop the crawl on that page (not on the whole site) because the header response code is a 404?
-
Hi Brad
Firstly, it is great from a usability point of view to have a custom 404 page. I would link it to your most popular content and maybe add a search feature to the page to help visitors find the content that is missing. I have come across some nice 404 pages that include a very concise sitemap to help the visitor navigate the site. To prevent Google from indexing your 404 page, you need to make sure it returns an actual 404 HTTP status code.
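That last point, a friendly error page that still sends a real 404 status, can be sketched in a few lines. A minimal stdlib WSGI example; the paths, page bodies, and link targets are all placeholders, not a production implementation:

```python
# Assumed site content; in practice this lookup would be routing in your framework.
KNOWN_PAGES = {"/": b"<h1>Home</h1>"}

def app(environ, start_response):
    """Serve known pages with 200; everything else gets a helpful page with a 404 status."""
    path = environ.get("PATH_INFO", "/")
    if path in KNOWN_PAGES:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [KNOWN_PAGES[path]]
    # Friendly body for humans, but the status line is still 404, so search
    # engines know not to index the error page itself.
    body = (b"<h1>Page not found</h1>"
            b'<p>Popular content: <a href="/">Home</a></p>')
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [body]
```

The common mistake is the reverse of this: a pretty "not found" template served with `200 OK`, which is what gets error pages indexed (or flagged as soft 404s).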
To understand how Googlebot crawls your site, I would look at the following post from Google themselves: https://support.google.com/webmasters/answer/182072?hl=en
Rather than being concerned about a 404 page having links on it to keep the crawl going, make sure you have an XML sitemap that you have submitted to Google via Webmaster Tools, as this will help your crawl process.
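For reference, a sitemap is just an XML file in the sitemaps.org format. A small sketch that generates one with the standard library; the example.com URLs are obviously placeholders for your real page list:

```python
import xml.etree.ElementTree as ET

# Illustrative URL list; in practice this would come from your CMS or database.
urls = ["http://example.com/", "http://example.com/blog/post-one"]

def build_sitemap(urls):
    """Build a minimal sitemap.xml document as a string."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        u = ET.SubElement(urlset, "url")
        ET.SubElement(u, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(urls))
```

The full format also supports optional `lastmod`, `changefreq`, and `priority` elements per URL, but `loc` alone is a valid sitemap.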
Googlebot allots a set amount of time to crawling your site, and it doesn't just stop crawling because it encounters a 404 error. However, make sure that you monitor Google Webmaster Tools and take care of any reported 404s, with 301 redirects for instance if a page has changed location. You will notice that Googlebot reports 404 errors on the days it finds them, and these are often multiple 404 errors encountered in a single visit to your site. Keeping an eye on this and keeping it up to date will make your site as crawl efficient as possible, which is clearly what you are after, as we all are.
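The 301-for-moved-pages idea boils down to a lookup from old paths to new ones. A toy sketch (the paths are invented; in practice these rules would live in your web server or framework config as permanent redirects):

```python
# Hypothetical map of moved URLs to their new locations.
REDIRECTS = {
    "/old-page": "/new-page",
    "/2013/summer-sale": "/promotions",
}

def resolve(path):
    """Return (status, location) for a requested path: 301 if moved, else 404."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None

print(resolve("/old-page"))  # prints (301, '/new-page')
print(resolve("/gone"))      # prints (404, None)
```

A 301 passes visitors and link equity through to the new location, while a plain 404 is still the honest answer for pages that are gone with no replacement.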
I thought this would also be interesting reading in relation to this - http://googlewebmastercentral.blogspot.co.uk/2011/05/do-404s-hurt-my-site.html
Hope this helps