Crawl Diagnostics Report 500 error
-
How can I find out what is causing my website to return 500 errors, and how do I locate and fix the problem?
-
500 errors can be caused by a multitude of reasons, and for non-technical users they can be very hard to track down and fix.
The first thing I would check is whether it's a recurring problem in Google Webmaster Tools or a one-time issue. These errors will show up in GWT for a long time, but if the problem isn't recurring, it's probably nothing you need to worry about.
That said, I assumed you found the problems in GWT, when you may have found them using the SEOmoz crawl report. Either way, you should log into Google Webmaster Tools, open the Crawl Errors report, and see whether Google is experiencing the same problems.
Sometimes 500 errors are caused by over-aggressive robots and/or improperly configured servers that can't handle the load. In this case, a simple crawl delay directive in your robots.txt file may do the trick. It would look something like this:
User-agent: *
Crawl-delay: 5
This would request that robots wait at least 5 seconds between page requests. Note, however, that this doesn't necessarily solve the underlying problem of why your server was returning 500s in the first place.
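If you want to sanity-check a crawl-delay directive before deploying it, here's a minimal sketch (not part of the original advice) using Python's standard-library robots.txt parser, which has exposed Crawl-delay since Python 3.6:

```python
from urllib.robotparser import RobotFileParser

# The same rules suggested above, as robots.txt lines.
rules = [
    "User-agent: *",
    "Crawl-delay: 5",
]

parser = RobotFileParser()
parser.parse(rules)

# The delay (in seconds) that well-behaved crawlers should honor.
print(parser.crawl_delay("*"))  # -> 5

# With no Disallow rules, any URL remains fetchable.
print(parser.can_fetch("*", "http://example.com/any-page"))  # -> True
```

Note that Crawl-delay is a de facto extension honored by some crawlers, not part of the core robots.txt standard, so always confirm how your target bots treat it.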
You may need to consult your hosting provider for advice. For example, Bluehost has this excellent article on dealing with 500 errors from their servers: https://my.bluehost.com/cgi/help/594
Hope this helps! Best of luck with your SEO.
-
Thank you Corey for your advice. I can see which links they are in Google Webmaster Tools, but I can't reproduce the error and don't know the best way to fix it.
-
Thomas, thank you so much for your advice, and Keri, thanks for offering help.
My problem is that I can't reproduce the 500 error, so the host can't help me figure out how to fix it.
Any help?
-
Hey Keri, how are you? Merry Christmas! I believe that 500 errors are almost always server-related, and unless he tells me about the host, or some other strange, unique problem with the computer's registry, I don't have enough to go on. It would be interesting to find out what it is. All the best, Tom
-
Hi Yoseph,
Did you get this figured out, or would you still like some assistance?
-
HTTP Error 500 is an Internal Server Error. It's a server-side error, which means there's a problem either with your web server or with the code it's trying to interpret. It may not happen in 100% of cases, so you may not always see it yourself, but it prevents the page from loading. Obviously, that's bad for both search engines and users.
Your best bet in tracking down this error is to go through your web server's error logs. Or, if you can replicate the error on the live site, you could enable error reporting and see what errors appear. That should tell you how to fix the issue, whatever it may be.
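If the error only appears to crawlers, one way to try to replicate it is to request the flagged URL the way a bot would. Here's a hedged, illustrative Python sketch (the user-agent string and URL are examples, not from this thread) that fetches a page with a crawler-style user-agent and reports the HTTP status code:

```python
import urllib.request
import urllib.error

def check_status(url, user_agent="Googlebot/2.1 (+http://www.google.com/bot.html)"):
    """Fetch a URL as a crawler would and return the HTTP status code."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.getcode()
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses raise HTTPError; its code attribute is the status.
        return e.code

# Example usage (substitute a page flagged in your crawl report):
# print(check_status("http://www.example.com/some-page"))
```

If this returns 500 for a URL, the server logs for that exact request timestamp are the place to look next.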
-
I have googled it for you, and I definitely think you should contact your web host. Here's what comes up: https://my.bluehost.com/cgi/help/594
-
Go into the campaign section on SEOmoz and run your site through it. You will then see where the errors are. When you see an error lit up, click it, use the drop-down to select 500 errors, and you will see exactly which link is causing the error.
There is no way I can guess what is causing your website not to work correctly; however, a 500 error is a very serious one, most likely involving a problem with the server.
If you give me your domain I might be able to help more. However, if your site is just giving 500 errors, you might want to call your web host, as it sounds like it's not an SEO problem so much as a hosting issue.