"5XX (Server Error)" - How can I fix this?
-
Hey Mozers!
Moz Crawl tells me I am having an issue with my WordPress category - it is returning a 5XX error and I'm not sure why. Can anyone help me determine the issue?
Crawl Issues and Notices for:
http://www.refusedcarfinance.com/news/category/news
We found 1 crawler issue(s) for this page.
High Priority Issues
1
5XX (Server Error)
5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
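To make the definition concrete, here is a minimal, purely illustrative sketch: a toy server that receives a perfectly valid request but answers 503 instead of serving the page, which is exactly what a crawler records as a 5XX error. The handler and URL path are hypothetical, not part of any real site.

```python
# Illustration only: a valid GET request answered with 503 Service Unavailable.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request
import urllib.error

class OverloadedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The request itself is valid; the server just cannot complete it.
        self.send_response(503)                 # Service Unavailable
        self.send_header("Retry-After", "120")  # hint: try again in 2 minutes
        self.end_headers()

    def log_message(self, *args):               # silence console noise
        pass

def fetch_status(url):
    """Return the HTTP status code of a GET request, even for error responses."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

server = HTTPServer(("127.0.0.1", 0), OverloadedHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
status = fetch_status(f"http://127.0.0.1:{server.server_port}/news/category/news")
print(status)  # 503
server.shutdown()
```

The client made a well-formed request; the 503 tells it (and search engine crawlers) that the failure is on the server's side.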
-
One more thing I want to add:
Search engines can remove your site from their index after they repeatedly get such a response. To prevent deindexing, use SEO tools to audit your site and detect 5xx errors in time. You can fix errors using three strategies:
- Reload the page to check if the issue was merely momentary.
- Check the error log on the site.
- Consider any changes or upgrades to the system that you have carried out recently and roll them back until the issue is resolved.
-
A 5xx server error occurs when the server fails to fulfill a valid request.
It’s quite difficult to detect and fix each occurrence of these errors, but search engines don’t like them, especially 500 and 503 errors.
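The "reload to check if it was momentary" step above can be sketched as a small retry loop: re-request the URL a few times with a short pause and see whether the 5xx clears. This is a rough sketch, not any tool's actual behavior; the `fetch` parameter exists so the loop can be demonstrated without hitting the network.

```python
# Sketch: retry a URL a few times to see whether a 5xx was merely momentary.
import time
import urllib.request
import urllib.error

def check_transient(url, retries=3, delay=2.0, fetch=None):
    """Return the list of status codes seen across several attempts."""
    def default_fetch(u):
        try:
            with urllib.request.urlopen(u) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code

    fetch = fetch or default_fetch
    seen = []
    for attempt in range(retries):
        seen.append(fetch(url))
        if seen[-1] < 500:          # recovered -> the error was momentary
            break
        if attempt < retries - 1:
            time.sleep(delay)
    return seen

# Simulated momentary outage: 503 on the first attempt, 200 on the reload.
responses = iter([503, 200])
codes = check_transient("http://www.refusedcarfinance.com/news/category/news",
                        delay=0, fetch=lambda u: next(responses))
print(codes)  # [503, 200] -> the error cleared on reload
```

If every attempt still returns a 5xx, move on to the error log and the recent-changes rollback.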
-
@asim21 Hi Asim, hope you are well.
What is 5XX server error?
A 5XX (Server Error) occurs when a valid request was made by the client but the server failed to complete it. The issue is therefore on the server side, and it should be fixed for the performance and SEO of your website.
How to fix:
1. Clear cookies and caches:
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, enter your web browser's History and select Delete. To remove the cookies on some devices you may need to check the box next to Cookies before hitting Delete. Refresh your page and check to see if the error code presents itself again.
2. Contact your host/server and create a support ticket:
If the problem continues, the best step is to contact your hosting provider or create a support ticket so they can fix the issue on their side. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give you an idea of when it may be up and running again.
Thanks.
-
5xx server errors present a big issue because they directly harm your visitors' experience. Imagine having an eCommerce store where your visitors keep receiving these server errors. Do you think they'll keep trying and eventually purchase something?
No, they’ll go next door and buy from the competition.
And what about search engines like Google? If they receive 5xx server errors just occasionally, then you probably don’t have much to fear. But if the 5xx server errors are persistent, then it’s likely you’ll see a steady decline in organic traffic.
Below, we'll cover why 5xx server errors are bad for SEO, what the most common 5xx server errors are, how to find out whether your site's server returns them, what causes them, and how to fix them.
3 Simple Steps to Clearing a 5XX Error
5XX errors are server-side, meaning the problem is unlikely to lie with your internet connection or device; the error will be with your website's server.
In the unlikely event that there is an issue with something on your end, there are a few simple steps you can take before seeking further help and advice.
1. Refresh your browser
The problem may only be temporary, so reloading the page will often prove successful. You can do this by resubmitting the URL from the address bar, or by pressing F5 or Ctrl+R on your keyboard.
2. Remove cookies
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, enter your web browser's History and select Delete. To remove the cookies on some devices you may need to check the box next to Cookies before hitting Delete. Refresh your page and check to see if the error code presents itself again.
3. Contact your host/server
If the problem continues, the best step is to contact your host or server provider directly to find out what the problem is. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give you an idea of when it may be up and running again.
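The three steps above amount to a simple triage: check the URL with a completely clean client (no cookies, no cached browser state) and let the status code tell you which step applies. This is a hypothetical helper for illustration; the function names are not from any real tool.

```python
# Hypothetical triage helper for the three steps above.
import urllib.request
import urllib.error

def fetch_status_clean(url):
    """GET the URL with a bare client: no cookies, no cached state."""
    req = urllib.request.Request(url, headers={"User-Agent": "status-check/1.0"})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def next_step(status):
    """Map a status code seen by a clean client to the step to take next."""
    if status < 400:
        return "page is up: the earlier 5xx was likely momentary (step 1)"
    if status < 500:
        return "client-side 4xx: recheck the URL and clear cookies (step 2)"
    return "persistent 5xx even with a clean client: contact your host (step 3)"

print(next_step(200))
print(next_step(503))
```

If a cookie-free request still returns a 5xx, clearing browser state won't help, which is exactly when step 3 (contacting the host) becomes the right move.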
-
My Search Console account is showing 2k Server Errors for both Desktop and Smartphone. I'm confused why it is showing these errors, because when I open these URLs in a desktop or mobile browser they work fine.
What does that mean? Do I need to work on it, and how? Is there a problem with Google's crawling? Can Google not crawl these pages?
If the URL opens without showing any error, does that mean the URL is fine? Can I ignore the Search Console error?
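A quick way to investigate "works in my browser but errors for Google" is to request the same URL with a browser-like and a Googlebot-like User-Agent and compare the status codes, since some servers and firewalls treat bot UAs differently. This is only a sketch (real Googlebot verification also requires a reverse-DNS check); the `fetch` parameter lets the comparison be demonstrated without network access.

```python
# Sketch: compare status codes for browser-like vs. Googlebot-like User-Agents.
import urllib.request
import urllib.error

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

def status_by_agent(url, fetch=None):
    """Return {label: status_code} for each User-Agent."""
    def default_fetch(u, ua):
        req = urllib.request.Request(u, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code

    fetch = fetch or default_fetch
    return {name: fetch(url, ua) for name, ua in USER_AGENTS.items()}

# Simulated example: a server that blocks bot UAs with a 503.
fake = lambda u, ua: 503 if "Googlebot" in ua else 200
result = status_by_agent("https://example.com/page", fetch=fake)
print(result)  # {'browser': 200, 'googlebot': 503}
```

A split result like the simulated one would explain exactly this symptom: the page looks fine in a browser while Search Console keeps logging server errors.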
-
Good question! I've seen this happen with a few clients.
Here is my process for reviewing 5xx errors:
- Manually check a handful of reported 5xx URLs and note their status codes. Upon manual inspection, you'll often find that these pages return different status codes (200, 4xx).
- Perform a crawl using Screaming Frog and watch the "Status Codes" report. See whether URLs go from being crawled as 200 to 5xx status codes.
- If so, the issue might be that your hosting can't handle the requests being sent to your site. If not, and your URLs are reported as 200, the 5xx might have been a temporary status the crawler found.
- If possible, check your log files to see if Googlebot is receiving a large number of 5xx errors.
Overall, if you only find a few 5xx errors in Search Console and your log files but don't find them when you crawl the site, you're probably OK. However, if a crawl consistently reveals them or you see a large number in your log files, then it's a high-priority item to flag to your developers. You may need to consider upgrading your hosting or rolling back any recent development changes. Definitely lean on your developers' insights here.
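For the log-file step, a short script can count how often Googlebot received a 5xx per path in a standard Apache/Nginx "combined" access log. This is a rough sketch assuming that log format; the sample lines below are made up for illustration.

```python
# Sketch: count 5xx responses served to Googlebot in a combined access log.
import re

LOG_LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def googlebot_5xx(lines):
    """Return {path: count} of 5xx responses served to Googlebot."""
    counts = {}
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LOG_LINE.search(line)
        if m and m.group("status").startswith("5"):
            path = m.group("path")
            counts[path] = counts.get(path, 0) + 1
    return counts

sample = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /news/category/news HTTP/1.1" 503 312 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [10/Oct/2023:13:56:02 +0000] "GET /about HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [10/Oct/2023:13:57:40 +0000] "GET /news/category/news HTTP/1.1" 500 0 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
result = googlebot_5xx(sample)
print(result)  # {'/news/category/news': 1}
```

A handful of hits is usually noise; a large or growing count for the same paths is the "flag it to your developers" signal described above.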
-
Hi Maria,
I'm also facing the same issue on one of my page URLs. Can you guide me on how to resolve it? Your question is old, so I assume you have found a solution by now. I'm waiting for your response, as I also run an Android gaming blog, so maybe your solution will work for me.
-
Hi,
Is there any issue with server-side rendering? Please help me as I want to fix this ASAP.
Awaiting help on this!
-
Hi there! Sorry to hear you're having trouble.
Our crawl error guide is a great place to start to find more information. You might consider running a Crawl Test on the site to validate the error, as it could possibly have been a temporary issue. If you're still seeing the error crop up, it probably warrants additional investigation.
Usually a 5XX error means an issue on your server, so you might wish to inquire with your web developer about the cause. If you're still not sure what to do, the Q&A forum is generally a great place to look for ideas on resolving these issues.