How to handle "Not found" crawl errors?
-
I'm using Google Webmaster Tools and can see "Not found" crawl errors. I have set up a custom 404 page for all broken links. You can see my custom 404 page here:
http://www.vistastores.com/404
But I have a question about it.
Do I need to set up 301 redirects for the broken links found in Google Webmaster Tools?
-
I agree with Ben on this one. There are plenty of 404s caused by scraper sites that aren't, and won't be, worth my time, especially on big sites.
Also, redirects aren't the only tool available. There are plenty of other ways to fix GWT 404 errors, particularly if there is a fundamental problem aside from the link in question.
-
Hi Commerce, I came across a blog post on this topic on Google's Webmaster Central blog; it covers most of the questions around 404 errors.
Generally speaking:
- If these are pages that you removed, then the 404 HTTP result code is fine.
- If these are pages that changed addresses, then you should 301 redirect to the new addresses. How you do this depends on your setup; for Apache servers you may be able to use the .htaccess file for this (see the sketch just after this list).
- Unless these are pages that used to receive a lot of traffic from search, these 404s won't be the reason for a drop in your site's traffic. Google understands that the web changes and that URLs disappear; that is not a reason for Google to stop showing your site.
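For example, here is a minimal .htaccess sketch, assuming an Apache server with mod_alias and mod_rewrite enabled; the paths shown are placeholders, not real URLs on your site:

```apache
# Redirect one moved page to its new address (mod_alias).
Redirect 301 /old-product-page /new-product-page

# Or redirect a whole renamed section with a single pattern (mod_rewrite).
RewriteEngine On
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]
```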
So my recommendation would be to check the URLs that are listed as 404 crawl errors. If any are important, then set up redirects to the appropriate new URLs as soon as you can. If none of them are important, then keep this in mind as something worth cleaning up when you have time, but focus on the rest of your site first. Often drastic drops in traffic are due more to the general quality of the website, so that's what I'd recommend working on first.
For more details, refer to How to Fix Crawl Errors.
I hope this answers your query.
-
Makes sense - in which case the homepage might not be the best place for you.
Another option for the custom 404, which works well in certain circumstances, is to add a dynamic element to it.
For example, we know the referring URL references product XYZ, which may now be unavailable, but perhaps we can dynamically pull other relevant products into the 404 page.
That's something I am looking to do with hotels that become unavailable: pull a dynamic element into the 404 page that recommends some other hotels close by (see the sketch below).
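In case it helps, here's a minimal sketch of that idea in Python (Flask). The in-memory CATALOGUE and the word-matching heuristic are hypothetical stand-ins; a real shop would query its product database or search index instead:

```python
# A dynamic 404 handler that suggests live pages related to the dead URL.
from flask import Flask, request

app = Flask(__name__)

CATALOGUE = [  # hypothetical live products; replace with a real lookup
    {"name": "9 ft Market Umbrella", "url": "/market-umbrella-9ft"},
    {"name": "Offset Patio Umbrella", "url": "/offset-patio-umbrella"},
    {"name": "Patio Umbrella Base", "url": "/patio-umbrella-base"},
]

def find_related_products(path, limit=3):
    """Score live products by how many words of the dead URL's slug
    appear in their names, and return the closest matches."""
    words = set(path.strip("/").lower().replace("-", " ").split())
    scored = [(sum(w in p["name"].lower() for w in words), p) for p in CATALOGUE]
    return [p for score, p in sorted(scored, key=lambda sp: -sp[0]) if score > 0][:limit]

@app.errorhandler(404)
def dynamic_not_found(error):
    # Use the dead URL itself as a hint to what the visitor wanted,
    # and offer live alternatives instead of a dead end.
    items = "".join(f'<li><a href="{p["url"]}">{p["name"]}</a></li>'
                    for p in find_related_products(request.path))
    suggestions = f"<p>You might be looking for:</p><ul>{items}</ul>" if items else ""
    return f"<h1>Sorry, that page is gone.</h1>{suggestions}", 404
```

The same pattern works for the hotels case: derive keywords from the requested URL, then recommend nearby properties that are still bookable.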
-
Well, I would have to disagree with that principle. Sometimes you have to think a little broader than just SEO and ask yourself if it really makes commercial sense to redirect everything.
That's why I put a financial cost against each unique redirect. At the end of the day, it requires someone to action it, and that person's time has a cost; it may be better allocated to work that will actually drive business uplift or improve customer experience.
Each to their own, of course, but I see a lot of SEOs who don't think big picture, and they end up using up developer resource on work that then has no impact. It just p!sses people off, in my experience.
-
Hi Ben,
I agree with you that some links are not worth redirecting. However, in my experience a dead link never comes alone. Often there is some kind of reason that the link was created, and there might be others you don't know about.
For this reason I usually recommend redirecting all broken links, even if the individual link is not worth the trouble. Obviously there are exceptions to this rule, but most of the time it's worth your trouble.
Sven
-
Good to know! But I have had a very bad experience redirecting a strong page to the home page. I removed many product pages for market umbrellas from my website and redirected them to the home page, because I don't have a specific landing page or inner-level page for them. As a result, I can see ranking changes for specific keywords: my home page is now ranking well for the keyword "market umbrellas", because many external pages linked to my product pages with that keyword. It also creates a negative ranking impact on the keyword I'm actually targeting with my home page.
-
Yeah, which is basically what Kane is saying as well. If you don't have an appropriate internal page, then you could send the 301 redirect to your homepage, or if it was a specific product, you might want to redirect it to the parent or child category.
If it's a particularly strong URL that has been linked to from many good external sources, then you might consider adding a replacement content page and redirecting to that.
Ben
-
Hi Ben,
I get your point. If my page is linked from an external page that has good value (good PageRank or a heavy amount of traffic), then I should redirect it to a specific internal page to preserve the flow of PageRank. Right?
-
Hopefully I am understanding your question correctly here....
The main benefit of the custom 404 page, aside from the obvious improvement to user experience, is that you provide additional links into content that otherwise wouldn't necessarily be available to the search bots.
In essence, if you just had a standard 404 error page, you'd send the search bots to a dead page where their only decision would be to leave the domain and go elsewhere.
Regarding setting up 301 redirects, I like to associate a cost with each one. Imagine the time it will take you or someone else to set each redirect up (say $5 per redirect). Then consider the following:
Is the URL that is 404ing worth redirecting?
(1) Does it hold some residual SEO value? That is, is it present on external sites and driving link equity? If so, can you redirect that equity somewhere more valuable?
(2) Is the URL present on an external site and driving referral traffic? If so, do you have a new content page that will still match the user's intent?
If the URLs that are 404ing have no real link equity associated with them, and/or you don't have a genuinely useful page to redirect the user to, then I would just let them hit the 404 page (see the sketch below for one way to run that triage).
If in doubt, put yourself in a user's boots and ask whether the set-up you have would offer a valuable experience. There's no point redirecting a user to something totally irrelevant to their original intent; it'll just p!ss them off most of the time and increase your bounce rate.
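To make that concrete, here's a rough sketch of the triage in Python. The CSV file names, their column headers, and the $5 figure are illustrative assumptions, not the export format of any particular tool:

```python
# Cross-reference 404 URLs against backlink and referral exports to
# decide which ones are worth the cost of a redirect.
import csv

REDIRECT_COST = 5.00  # assumed cost, in dollars, of setting up one redirect

def load_urls(path, column):
    """Read one column of URLs from a CSV export into a set."""
    with open(path, newline="") as f:
        return {row[column] for row in csv.DictReader(f)}

errors = load_urls("gwt_crawl_errors.csv", "URL")                    # 404s from GWT
linked = load_urls("external_backlinks.csv", "Target URL")           # hold link equity
referred = load_urls("referral_landing_pages.csv", "Landing Page")   # get real visits

worth_redirecting = sorted(errors & (linked | referred))
leave_as_404 = sorted(errors - (linked | referred))

print(f"{len(worth_redirecting)} URLs carry link equity or referral traffic;")
print(f"budget roughly ${len(worth_redirecting) * REDIRECT_COST:.2f} to redirect them.")
for url in leave_as_404:
    print("let it 404:", url)
```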
-
If there is a link pointing at that 404 page, then I will almost always 301 it to regain that link value. If I control the source of the link, I'll change that instead. If the link is from a spammy or junky website, I don't worry about it.
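If you do set those 301s up, a quick sketch like this (Python with the requests library; the URL list is a placeholder) can confirm that each old URL now permanently redirects somewhere live:

```python
# Verify that each old URL returns a 301 and that the chain ends on a 200.
import requests

OLD_URLS = [
    "http://www.example.com/removed-product-1",
    "http://www.example.com/removed-product-2",
]

for url in OLD_URLS:
    # Fetch without following redirects so the raw status code is visible.
    first = requests.get(url, allow_redirects=False, timeout=10)
    if first.status_code == 301:
        final = requests.get(url, timeout=10)  # now follow the whole chain
        print(f"{url} -> 301 -> {final.url} ({final.status_code})")
    else:
        print(f"{url} returned {first.status_code}, expected 301")
```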
Here is a worthwhile article on how to go about fixing GWT crawl errors:
http://www.seomoz.org/blog/how-to-fix-crawl-errors-in-google-webmaster-tools
I would suggest adding more content to your 404 page. Try to help people find what they're looking for by suggesting common pages, product segments, etc.