How should I handle "Not found" crawl errors?
-
I'm using Google Webmaster Tools and can see "Not found" crawl errors. I have set up a custom 404 page for all broken links. You can see my custom 404 page here:
http://www.vistastores.com/404
But I have a question about it:
Do I need to set up 301 redirects for the broken links found in Google Webmaster Tools?
-
I agree with Ben on this one. There are plenty of 404s caused by scraper sites that aren't worth my time, especially on big sites.
Also, redirects aren't the only tool available. There are plenty of other ways to fix GWT 404 errors, particularly if there is a fundamental problem beyond the individual link in question.
-
Hi Commerce, I recently came across a blog post on this topic on Google's Webmaster Central blog; it covers most of the questions around 404 errors.
Generally speaking:
- If these are pages that you removed, then the 404 HTTP result code is fine.
- If these are pages that changed addresses, then you should 301 redirect them to the new addresses. How you do this depends on your setup; on Apache servers you may be able to use the .htaccess file for this.
- Unless these are pages that used to receive a lot of traffic from search, these 404s won't be the reason for your site's traffic dropping like that. Google understands that the web changes and that URLs disappear - that is not a reason for Google to stop showing your site.
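On an Apache server, the .htaccess approach mentioned above might look like the sketch below. The paths are hypothetical examples, and whether mod_alias and mod_rewrite are available depends on your hosting:

```apache
# Permanently redirect a removed page to its replacement (mod_alias)
Redirect 301 /old-product-page /new-product-page

# Or redirect a whole retired category to its successor (mod_rewrite)
RewriteEngine On
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]
```

The `R=301` flag makes the redirect permanent, which is what passes link equity; a plain `Redirect` without the code defaults to a temporary 302.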
So my recommendation would be to check the URLs that are listed as 404 crawl errors. If any are important, then set up redirects to the appropriate new URLs as soon as you can. If none of them are important, then keep this in mind as something worth cleaning up when you have time, but focus on the rest of your site first. Often drastic drops in traffic are due more to the general quality of the website, so that's what I'd recommend working on first.
For more details, refer to How to Fix Crawl Errors.
I hope that answers your query.
-
Makes sense - in which case the homepage might not be the best place for you.
Another option for the custom 404 which works well in certain circumstances is to add a dynamic element to it.
For example, we know the referring URL has reference to product XYZ which may now be unavailable, but perhaps we can dynamically pull in other relevant products into the 404 page.
That's something I'm looking to do with hotels that become unavailable - pulling a dynamic element into the 404 page that recommends some other hotels close by.
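A dynamic 404 along those lines could be sketched as follows. This is an illustration, not production code: the catalog, the URL slugs, and the keyword-overlap scoring are all hypothetical assumptions, and a real implementation would query your product or hotel database.

```python
# Sketch: given the requested (dead) URL path, suggest live catalog
# items whose URL slugs share keywords with it. The catalog below is
# a hypothetical stand-in for a real product/hotel database.
CATALOG = {
    "/hotels/grand-plaza-london": "Grand Plaza London",
    "/hotels/riverside-inn-london": "Riverside Inn London",
    "/hotels/alpine-lodge-zurich": "Alpine Lodge Zurich",
}

def suggest_alternatives(dead_path, catalog, limit=3):
    """Rank live pages by keyword overlap with the 404'd path."""
    wanted = set(dead_path.strip("/").replace("-", "/").split("/"))
    scored = []
    for path, title in catalog.items():
        words = set(path.strip("/").replace("-", "/").split("/"))
        overlap = len(wanted & words)
        if overlap:
            scored.append((overlap, path, title))
    scored.sort(reverse=True)  # highest keyword overlap first
    return [(path, title) for _, path, title in scored[:limit]]

# A request for a retired London hotel surfaces the other London hotels first.
for path, title in suggest_alternatives("/hotels/royal-court-london", CATALOG):
    print(path, "->", title)
```

The 404 template would then render these suggestions instead of a dead end, keeping both users and crawlers on the site.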
-
Well, I would have to disagree with that principle. Sometimes you have to think a little more broadly than just SEO and ask yourself whether it really makes commercial sense to redirect everything.
That's why I put a financial cost against each unique redirect. At the end of the day it requires someone to action it, and that person's time has a cost that may be better allocated to work that will actually drive business uplift or improve customer experience.
Each to their own, of course, but I see a lot of SEOs who don't think big picture and end up using developer resource on work that then has no impact. It just p!sses people off, in my experience.
-
Hi Ben,
I agree with you that some links are not worth redirecting. However, in my experience a dead link never comes alone. Often there is some kind of reason that the link was created, and there might be others you don't know about.
For this reason I usually recommend redirecting all broken links, even if the individual link is not worth the trouble. Obviously there are exceptions to this rule, but most of the time it's worth your trouble.
Sven
-
Good to know! But I've had a very bad experience redirecting strong pages to the home page. I removed many product pages for market umbrellas from my website and redirected them to the home page, because I didn't have a specific landing page or inner-level page for them. As a result, I'm seeing ranking changes for specific keywords: my home page now ranks well for the "market umbrellas" keyword, because so many external pages linked to my product pages with that anchor text. It also hurts rankings for the keyword I'm actually targeting with my home page.
-
Yeah, which is basically what Kane is saying as well. If you don't have an appropriate internal page, then you could send the 301 redirect to your homepage, or if it was a specific product you might want to redirect it to the parent/child category.
If it's a particularly strong URL that has been linked to from many good external sources, then you might consider adding a replacement content page and redirecting to that.
Ben
-
Hi Ben,
I get your point. If my page is linked from an external page with good value (good PageRank or a heavy amount of traffic), then I should redirect it to a specific internal page to preserve the flow of PageRank. Right?
-
Hopefully I'm understanding your question correctly here...
The main benefit of a custom 404 page, aside from the obvious improvement to user experience, is that you provide additional links into content that otherwise wouldn't necessarily be available to the search bots.
In essence, if you just had a standard 404 error page, you'd send the search bots to a dead page where their only option would be to leave the domain and go elsewhere.
Regarding setting up 301 redirects, I like to associate a cost with each one. Imagine the time it will take you or someone else to set each redirect up (say, $5 per redirect). Then consider whether the URL that is 404'ing is worth redirecting:
(1) Does it hold some residual SEO value (i.e., is it present on external sites that are driving link equity)? If so, can you redirect that equity somewhere more valuable?
(2) Is the URL present on an external site driving referral traffic? If so, do you have a new content page that will still match the user's intent?
If the URLs that are 404'ing have no real link equity associated with them, and/or you don't have a genuinely useful page to redirect the user to, then I would just let them hit the 404 page.
If in doubt, put yourself in the user's shoes and ask yourself whether the set-up you have would offer a valuable experience. There's no point redirecting a user to something totally irrelevant to the original intent - it'll just p!ss them off most of the time and increase your bounce rate.
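That cost-versus-value test can be sketched as a simple triage script. Everything here is illustrative: the field names and the decision rule are assumptions standing in for data you'd pull from your backlink and analytics tools.

```python
def triage(crawl_errors):
    """Decide, per 404'd URL: set up a 301, or let it keep returning 404."""
    decisions = []
    for err in crawl_errors:
        # Only pay the (~$5) redirect cost when the URL carries link
        # equity or referral traffic AND a genuinely relevant target exists.
        has_value = err["external_links"] > 0 or err["referral_visits"] > 0
        has_target = err["replacement_url"] is not None
        action = "301 redirect" if (has_value and has_target) else "serve 404"
        decisions.append((err["url"], action))
    return decisions

# Hypothetical crawl-error data, as exported from GWT plus link tools:
crawl_errors = [
    {"url": "/old-category/widget", "external_links": 12,
     "referral_visits": 40, "replacement_url": "/widgets"},
    {"url": "/tmp/scraped-junk", "external_links": 0,
     "referral_visits": 0, "replacement_url": None},
]
for url, action in triage(crawl_errors):
    print(url, "->", action)
```

The linked page gets its equity preserved; the scraper junk is left to 404, exactly as described above.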
-
If there is a link pointing at that 404'ing URL, then I will almost always 301 it to regain that link value. If I control the source of the link, I'll change that instead. If the link is from a spammy or junky website, I don't worry about it.
Here is a worthwhile article on how to go about fixing GWT crawl errors:
http://www.seomoz.org/blog/how-to-fix-crawl-errors-in-google-webmaster-tools
I would suggest adding more content to your 404 page. Try to help people find what they're looking for by suggesting common pages, product segments, etc.
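One related check worth making while you're improving the 404 page: make sure it actually returns an HTTP 404 status rather than a 200, since an error page that answers 200 is treated as a "soft 404" and can get indexed. The sketch below stubs out the fetching; the statuses shown are hypothetical, and you'd swap in a real HTTP client.

```python
def classify(status_code):
    """Bucket a crawl-error URL by the status code it actually returns."""
    if status_code == 200:
        return "soft 404 - make the error page return a real 404 status"
    if status_code in (301, 302):
        return "already redirected"
    if status_code == 404:
        return "true 404 - triage for a redirect or leave it"
    return "other - investigate"

# Hypothetical observed statuses, as if fetched with an HTTP client:
observed = {"/404": 200, "/old-product": 404, "/moved-page": 301}
for path, code in observed.items():
    print(path, "->", classify(code))
```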