Crawl Diagnostics Report: 500 error
-
How can I find out what is causing my website to return 500 errors, and how do I locate and fix the problem?
-
500 errors can be caused by a multitude of reasons, and for the non-technical they can be very hard to track down and fix.
The first thing I would look at is whether it's a repeating problem in Google Webmaster Tools or a one-time issue. These errors will show up in GWT for a long time, but if it's not a repeating problem it's probably nothing you need to worry about.
Wait, I assumed you found the problems in GWT, but you may have found them using the SEOmoz crawl report instead. Either way, you should check the Crawl Errors report in Google Webmaster Tools and see if Google is experiencing the same problems.
Sometimes 500 errors are caused by over-aggressive robots and/or improperly configured servers that can't handle the load. In this case, a simple crawl delay directive in your robots.txt file may do the trick. It would look something like this:
User-agent: *
Crawl-delay: 5
This would request that robots wait at least 5 seconds between page requests. Note, though, that this doesn't necessarily solve the problem of why your server was returning 500s in the first place.
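To see the trade-off this directive implies, here's a quick back-of-the-envelope calculation (a sketch I'm adding, not part of the original answer) showing how a crawl delay caps the number of pages a polite bot can fetch per day:

```python
# A polite bot honoring "Crawl-delay: 5" makes at most one request every 5 seconds.
SECONDS_PER_DAY = 24 * 60 * 60  # 86400
crawl_delay = 5  # seconds between requests, as set in robots.txt

max_requests_per_day = SECONDS_PER_DAY // crawl_delay
print(max_requests_per_day)  # 17280 pages per day, per bot that honors the delay
```

So even a 5-second delay still permits over 17,000 fetches a day; raising the delay gives the server more breathing room but slows crawling of large sites proportionally. Also worth knowing: Googlebot does not honor Crawl-delay; for Google you adjust the crawl rate in Webmaster Tools instead.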
You may need to consult your hosting provider for advice. For example, Bluehost has this excellent article on dealing with 500 errors from their servers: https://my.bluehost.com/cgi/help/594
Hope this helps! Best of luck with your SEO.
-
Thank you Corey for your advice. I can see which links are affected in Google Webmaster Tools, but I can't reproduce the error and don't know the best way to fix it.
-
Thomas, thank you so much for your advice, and Keri, thanks for offering help.
My problem is that I can't reproduce the 500 error, so the host can't help me figure out how to fix it.
Any help?
-
Hey Keri, how are you? Merry Christmas! I believe that 500 errors are almost always server-related, and unless he tells me about the host, or some other strange, unique problem with the computer's registry, I don't have enough to go on. It would be interesting to find out what it is. All the best, Tom
-
Hi Yoseph,
Did you get this figured out, or would you still like some assistance?
-
HTTP Error 500 is an Internal Server Error. It's a server-side error, which means there's a problem with either your web server or the code it's trying to interpret. It may not happen in 100% of scenarios, so you may not always see it happening yourself, but it prevents the page from loading. Obviously, that's bad for search engines and users.
Your best bet for tracking down this error is to go through your web server's error logs. Or, if you can replicate the error yourself, you could enable error reporting and see what errors pop up. That should tell you how to fix the issue, whatever it may be.
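As a concrete illustration of the log approach, here is a minimal sketch (my example, assuming your host writes the common Apache/Nginx combined log format; field positions vary by server configuration) that tallies which request paths are returning 500s:

```python
import re
from collections import Counter

# Matches the request path and status fields of a combined-format access log line,
# e.g.: 1.2.3.4 - - [10/Dec/2011:13:55:36 -0700] "GET /page HTTP/1.1" 500 2326
LOG_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3})')

def count_500s(lines):
    """Return a Counter mapping request paths to how often they returned a 500."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and m.group(2) == "500":
            hits[m.group(1)] += 1
    return hits
```

Running this over your access log and looking at the most common paths usually narrows the problem down to a handful of URLs you can then investigate in the error log.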
-
I have googled it for you and I definitely think you should contact your web host. Here's what comes up https://my.bluehost.com/cgi/help/594
-
Go into the campaign section on SEOmoz and run your site through it. You will then see where the errors are. When you see the errors lit up, click them and use the drop-down to select 500 errors; you will then see exactly which links are causing the error.
There is literally no way I can guess what is causing your website not to work correctly; however, a 500 error is a very serious one, most likely involving a problem with the server.
If you give me your domain I might be able to help more. However, if your site is just giving 500 errors, you might want to call your web host, as it sounds like it's not an SEO problem so much as a hosting issue.
Related Questions
-
How long after disallowing Googlebot from crawling a domain until those pages drop out of their index?
We recently had Google crawl a version of the site that we thought we had already disallowed. We have corrected the issue of them crawling it, but pages from that version are still appearing in the search results (the version we don't want them to index and serve is our .us domain, which should have been blocked to them). My question is this: how long should I expect that domain (the .us we don't want to appear) to stay in their index after disallowing their bot? Is this a matter of days, weeks, or months?
Technical SEO | TLM0
-
Ajax Optimization in Mobile Site - Ajax Crawling
I'm working on a mobile site that has links embedded in JavaScript/Ajax on the homepage. This functionality is preventing crawlers from accessing the links to mobile-specific URLs. We're using an m. sub-domain. This is just an object on the homepage with an expandable list of links. I was wondering if the following solution provided by Google would be a good way to help with this situation: https://developers.google.com/webmasters/ajax-crawling/ Thanks!
Technical SEO | burnseo0
-
Crawl Diagnostics and Duplicate Page Title
SEOmoz crawled our web site and says we have no duplicate page titles, but Google Webmaster Tools says we have 641 duplicate page titles. Which one is right?
Technical SEO | iskq0
-
Pagination/Crawl Errors
Hi, I've only just joined SEOmoz, and after they crawled my site they came up with 3600 crawl errors, mostly duplicate content and duplicate URLs. After researching this it soon became clear it was due to on-page pagination, and after speaking with Abe from SEOmoz he advised me to take action by getting our developers to implement rel="next" & rel="prev". Soon after our developers implemented this code (I have no understanding of this whatsoever), 90% of the keywords I had been ranking for in the top 10 dropped out of the top 50! Can anyone explain this or help me with it? Thanks, Andy
Technical SEO | beck3980
-
Can I crawl a password protected domain with SEOmoz?
Hi everyone, just wondered if anybody has been able to use the SEOmoz site crawler on password-protected domains? In Screaming Frog you are prompted for the username and password when you set the crawler running, but SEOmoz doesn't prompt you. It seems you can only crawl sites that are live and publicly available; can anyone confirm if this is the case? Cheers, M
Technical SEO | edlondon0
-
Why is the number of crawled pages so low?
Hi, my website is www.theprinterdepo.com and I have been on SEOmoz Pro for 2 months. When it started, it crawled 10000 pages; then I modified robots.txt to disallow some specific parameters in the pages to be crawled. We have about 3500 products, so the number of crawled pages should be close to that number. In the last crawl, it shows only 1700. What should I do?
Technical SEO | levalencia10
-
Google crawl rate almost zero since re-launch, organic search up 50% though!
We're confused as to why Google's crawl of our site has dropped hugely since our new site went live. The URLs of almost all pages changed and were 301-redirected to the new site. About 20% of our pages were blocked by robots.txt for the re-launch. The re-launch has been great for organic search, with hits up about 50%. Yet our new content is taking a lot longer to get indexed than before. Our KB downloaded per day, according to Webmaster Tools, is well down, as is time spent downloading a page. Any ideas as to why this is?
Technical SEO | soulnafein0
-
Internal Link Counts in SEOMoz Report?
Hi, We ran a site diagnostic and it came back with thousands of pages that have more than 100 internal links on a page; however, the actual number of links on those pages seems to be far less than what was reported. Any ideas? Thanks! Phil UPDATE: So we've looked at the source code and realized that for each product we link to the product page in multiple ways - from the product image, product title and price. So we have three internal links to the same page from each product listing, which is being counted by the SEOMoz crawler as hundreds of links on each page. But in terms of the Googlebot, is this as egregious as having hundreds of links to different pages or does it not matter as much?
Technical SEO | beso1
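To illustrate the point in that last update, here is a small sketch (hypothetical product markup of my own, not the poster's actual code) showing how three anchors wrapping a product's image, title, and price all point at the same URL, so a crawler counting raw anchors reports three times the unique-target count:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count total <a href> anchors versus unique link targets."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.targets = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.total += 1
                self.targets.add(href)

# One product listing: image link, title link, and price link to the same page.
html = (
    '<div class="product">'
    '<a href="/p/1"><img src="1.jpg"></a>'
    '<a href="/p/1">Widget</a>'
    '<a href="/p/1">$9.99</a>'
    '</div>'
)
p = LinkCounter()
p.feed(html)
print(p.total, len(p.targets))  # 3 anchors, but only 1 unique target
```

Which of the two numbers a given crawler reports explains why a page of 50 products can show up as having 150+ internal links.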