Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
"5XX (Server Error)" - How can I fix this?
-
Hey Mozers!
Moz Crawl tells me I'm having an issue with my WordPress category page: it's returning a 5XX error and I'm not sure why. Can anyone help me determine the issue?
Crawl Issues and Notices for:
http://www.refusedcarfinance.com/news/category/news
We found 1 crawler issue(s) for this page.
High Priority Issues
1
5XX (Server Error)
5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
-
One more thing I want to add:
Search engines can remove your site from their index after they repeatedly get such a response. To prevent deindexing, use SEO tools to audit your site and detect 5xx errors in time. You can fix errors using three strategies:
- Reload the page to check if the issue was merely momentary.
- Check the error log on the site.
- Consider any changes or upgrades to the system that you have carried out recently and roll them back until the issue is resolved.
-
A 5xx server error occurs when the server fails to fulfill a valid request.
It’s quite difficult to detect and fix each occurrence of these errors, but search engines don’t like them, especially 500 and 503 errors.
-
- Reload the page in case the issue was merely momentary.
- Check the error log on the site.
- Consider any changes or upgrades to the system that you have carried out recently and roll them back until the issue is resolved.
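As a rough way to automate that first "reload" check, here's a minimal sketch using only Python's standard library. The retry counts are placeholder assumptions, not anything from this thread, and a persistent result is your cue for steps two and three: check the error log and roll back recent changes.

```python
import time
import urllib.request
import urllib.error

def fetch_codes(url, retries=3, delay=5):
    """Request a URL a few times, collecting the HTTP status codes."""
    codes = []
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                codes.append(resp.status)
        except urllib.error.HTTPError as e:
            codes.append(e.code)  # 4xx/5xx raise, but the status code is on the exception
        if attempt < retries - 1:
            time.sleep(delay)
    return codes

def classify(codes):
    """Decide whether a 5xx was momentary or needs the error log / a rollback."""
    if all(200 <= c < 300 for c in codes):
        return "ok"
    if all(500 <= c < 600 for c in codes):
        return "persistent 5xx"
    if any(500 <= c < 600 for c in codes):
        return "transient 5xx"
    return "mixed"
```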
-
@asim21.. Hi asim.. hope you are well.
What is 5XX server error?
A 5XX (Server Error) occurs when a valid request was made by the client but the server failed to complete the request. Hence the issue is on the server side. These errors should be fixed for the better performance and SEO of your website.
How to Fix..
1. Clear cookies and caches:
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, enter your web browser's History and select Delete. To remove the cookies from some devices you may need to check the box next to Cookies before hitting delete. Refresh your page and check to see if the error reappears.
2. Contact your host/server and create a support ticket.
If the problem continues, the best step is to contact your hosting provider or create a support ticket so they can fix the issue on their side. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give you an idea of when it may be up and running again.
Thanks. -
5xx server errors present a big issue because they directly harm your visitors' experience. Imagine having an eCommerce store where your visitors keep receiving these server errors. Do you think they'll keep trying and eventually purchase something?
No, they’ll go next door and buy from the competition.
And what about search engines like Google? If they receive 5xx server errors just occasionally, then you probably don’t have much to fear. But if the 5xx server errors are persistent, then it’s likely you’ll see a steady decline in organic traffic.
Below we'll cover why 5xx server errors are bad for SEO, what the most common 5xx server errors are, how to find out if your site's server returns them, what causes this, and how to fix them.
3 Simple Steps to Clearing a 5XX Error
5XX errors are server-side, meaning the problem is unlikely to lie with your internet connection or device; the error will be with your website's server.
In the unlikely event that there is an issue with something on your end, there are a few simple steps you can take before seeking further help and advice.
1. Refresh your browser
The problem may only be temporary, so reloading the page will often prove successful. You can do this by resubmitting the URL from the address bar, or by pressing F5 or Ctrl+R on your keyboard.
2. Remove cookies
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, enter your web browser's History and select Delete. To remove the cookies from some devices you may need to check the box next to Cookies before hitting delete. Refresh your page and check to see if the error reappears.
3. Contact your host/server
If the problem continues, the best step is to contact your host or server directly to find out what the problem is. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give an idea of when it may be up and running again.
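Before opening that ticket, it can help to capture what the server actually returned. Here's a small sketch; the header handling is an assumption about typical server behavior (for example, a Retry-After header on a 503 often signals planned maintenance), not something specific to any one host:

```python
def describe_5xx(status, headers):
    """Summarize a 5xx response for a support ticket, using hints from headers."""
    if not 500 <= status < 600:
        return "not a server error"
    note = f"HTTP {status}"
    if status == 503 and headers.get("Retry-After"):
        note += f"; Retry-After={headers['Retry-After']} suggests planned maintenance"
    if headers.get("Server"):
        note += f"; served by {headers['Server']}"
    return note
```

Feed it the status code and headers from whatever HTTP client you use, and paste the result into your support request.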
-
My Search Console account is showing 2k server errors for both Desktop and Smartphone. I'm confused why it's showing these errors, because when I open these URLs in a desktop or mobile browser they work fine.
What does that mean? Do I need to work on it, and how? Is there a problem with Google's crawling? Can Google not crawl these pages?
If a URL opens without showing any error, does that mean the URL is fine? Can I ignore the Search Console error?
-
Good question! I've seen this happen with a few clients.
Here is my process for reviewing 5xx errors:
- Manually check a handful of reported 5xx URLs and note their status codes. Upon manual inspection, you'll often find that these pages return different status codes (200, 4xx).
- Perform a crawl using Screaming Frog and watch the "Status Codes" report. Watch to see if URLs go from being crawled as 200 to 5xx status codes.
- If this is the case, the issue might be that your hosting can't handle the requests being sent to your site. If not, and your URLs are reported as 200, it might have been a temporary status the crawler found.
- If possible, check your log files to see if Googlebot is receiving a large number of 5xx errors.
Overall, if you only find a few 5xx errors in Search Console and your log files but don't find them when you crawl the site, you're probably OK. However, if a crawl consistently reveals them, or you see a large number in your log files, then it's a high-priority item to flag to your developers. You may need to consider upgrading your hosting or rolling back any recent development changes. Definitely lean on your developers' insights here.
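For the log-file step above, here's a rough sketch of the kind of tally that's useful. It assumes the common/combined access log format and the literal token "Googlebot" in the user agent string; adjust both for your server (and remember to verify suspicious IPs, since anyone can spoof a user agent):

```python
import re
from collections import Counter

# Matches the request line and status code in common/combined log format.
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

def googlebot_5xx(log_lines):
    """Count 5xx responses served to Googlebot, grouped by path."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        m = LINE_RE.search(line)
        if m and m.group("status").startswith("5"):
            counts[m.group("path")] += 1
    return counts
```

A handful of hits spread across many paths is probably noise; a large count on the same paths is the "flag it to your developers" case.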
-
Hi Maria,
I'm also facing the same issue. Can you guide me on how to resolve it? Since your question is old, I think you've found a solution by now. I'm also facing this issue on my page URLs. I'm waiting for your response, as I have an Android gaming blog, so maybe your solution will work for me.
-
Hi,
Is there any issue with server-side rendering? Please help me as I want to fix this ASAP.
Awaiting help on this!
-
Hi there! Sorry to hear you're having trouble.
Our crawl error guide is a great place to start to find more information. You might consider running a Crawl Test on the site to validate the error, as it could possibly have been a temporary issue. If you're still seeing the error crop up, it probably warrants additional investigation.
Usually a 5XX error means an issue on your server, so you might wish to inquire with your web dev about the cause. If you're still not sure what to do, the Q&A forum is generally a great place to look for ideas in resolving some of these issues. Here are some articles that might help:
Related Questions
-
Errors In Search Console
Hi All, I am hoping someone might be able to help with this. Last week one of my sites dropped from mid first page to the bottom of page 1. We had not been link building as such, and it only seems to have affected a single search term and the ranking page (which happens to be the home page). When I was going through everything I went to Search Console, and in crawl errors there are 2 errors that showed up as detected 3 days before the drop. These are: wp-admin/admin-ajax.php showing as response code 400, and also xmlrpc.php showing as response code 405. robots.txt is as follows: user-agent: * disallow: /wp-admin/ allow: /wp-admin/admin-ajax.php Any help with what is wrong here and how to fix it would be greatly appreciated. Many Thanks
Technical SEO | | DaleZon0 -
New "Static" Site with 302s
Hey all, Came across a bit of an interesting challenge recently, one that I was hoping some of you might have had experience with! We're currently in the process of a website rebuild, for which I'm really excited. The new site is using Markdown to create an entirely static site. Load times are fantastic, and the code is clean. Life is good, apart from the 302s. One of the weird quirks I've realized with oldschool, non-server-generated page content is that every page of the site is an index.html file in a directory. As a result, www.website.com/page-title will 302 to www.website.com/page-title/. My solution off the bat has been to just be super diligent and try to stay on top of the link profile and send lots of helpful emails to the staff reminding them about how to build links, but I know that even the best laid plans often fail. Has anyone had a similar challenge with a static site and found a way to overcome it?
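On the content side, one way to stay on top of that link profile is to scan your rendered HTML for internal links that omit the trailing slash and will therefore bounce through that 302. A rough, regex-based sketch (it assumes double-quoted root-relative hrefs; a real crawler or HTML parser would be more robust):

```python
import re

def links_missing_slash(html, file_exts=(".html", ".css", ".js", ".png", ".jpg")):
    """Find root-relative hrefs that lack a trailing slash and aren't files."""
    offenders = []
    for m in re.finditer(r'href="(/[^"#?]+)"', html):
        path = m.group(1)
        if not path.endswith("/") and not path.lower().endswith(file_exts):
            offenders.append(path)
    return offenders
```

Run it over each generated page at build time and fail the build (or just log) when offenders turn up; the server-side fix of turning the 302 into a 301 still belongs in your web server config.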
Technical SEO | | danny.wood1 -
How Does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one thing in specific: Google continuously crawls websites and stores each page it finds (let's call it the "page directory") Google's "page directory" is a cache so it isn't the "live" version of the page Google has separate storage called "the index" which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords. When someone searches a keyword, that keyword is accessed in the "index" and returns all relevant pages in the "page directory" These returned pages are given ranks based on the algorithm The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a URL in the "page directory", and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as the live website (and would the keywords in the "index" point to these URLs)? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that URL in Google's cache? The reason I want to discuss this is to understand the effects of changing a page's URL by understanding how the search process works better.
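The mental model described above matches the classic "inverted index" data structure. As a toy illustration only (not a claim about how Google actually stores anything), keywords map to the URLs of cached pages, and the URL is the key into the "page directory":

```python
from collections import defaultdict

def build_index(page_directory):
    """Toy inverted index: keyword -> set of URLs whose cached text contains it."""
    index = defaultdict(set)
    for url, text in page_directory.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

# Hypothetical cached "page directory": URL -> stored page text.
pages = {
    "www.website.com/page1": "fresh coffee beans",
    "www.website.com/page2": "coffee grinder reviews",
}
index = build_index(pages)
```

In this model, changing a page's URL means the old index entries point at a key that no longer exists, which is exactly why redirects matter after a URL change.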
Technical SEO | | reidsteven750 -
Links from the same server has value or not
Hi Guys, Some time ago an SEO expert told me that if I get links from the same IP address, Google doesn't count them as having much value. For example, I am a web developer and I host all my clients' websites on one server and link them back to me. I'm wondering whether those links have any value when it comes to SEO, or should I consider getting different hosting providers? Regards Uds
Technical SEO | | Uds0 -
Why crawl error "title missing or empty" when there is already "title and meta desciption" in place?
I've been getting 73 "title missing or empty" warnings from the SEOmoz crawl diagnostic. This is weird, as I've installed the Yoast WordPress SEO plugin and all posts do have a title and meta description. So why these results? Can anyone explain what's happening? Thanks!! Here are some of the links that are listed with "title missing, empty". Almost all our blog posts were listed there. http://www.gan4hire.com/blog/2011/are-you-here-for-good/ http://www.gan4hire.com/blog/2011/are-you-socially-awkward/
Technical SEO | | JasonDGreat0 -
What is best practice for redirecting "secondary" domain names?
For sites with multiple top-level domains that have been secured for a business or organization, I'm curious as to what is considered best practice for setting up 301 redirects for secondary domains. Is it best to do the 301 redirects at the registrar level, or the hosting level? So that .net, .biz, or other secondary domains funnel visitors to the correct primary/main domain name. I'm looking for the "best practice" answer and want to avoid duplicate content problems, or penalties from the search engines. I'm not trying to game the system with dozens of domain names, simply the handful of domains that are important to the client. I've seen some registrars recommend hosting secondary domains, and doing redirects from the hosting level (and they use meta refresh for "domain forwarding," which I want to avoid). It seems rather wasteful to set up hosting for a secondary domain and then 301 each URL.
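To sanity-check how each secondary domain currently forwards, you can classify the response it returns. A hedged sketch of that triage (the primary domain is a placeholder; fetch the status, Location header, and body with any HTTP client configured not to auto-follow redirects):

```python
import re

META_REFRESH = re.compile(r'<meta[^>]+http-equiv=["\']refresh["\']', re.I)

def audit_forwarding(status, location, body, primary="https://www.example.com/"):
    """Classify how a secondary domain forwards to the primary domain."""
    if status == 301 and location and location.startswith(primary):
        return "ok: permanent redirect to primary"
    if status in (302, 307):
        return "warn: temporary redirect; use a 301 instead"
    if status == 200 and body and META_REFRESH.search(body):
        return "bad: meta refresh forwarding; replace with a 301"
    return "check manually"
```

Anything other than the first outcome is worth fixing: meta refresh "domain forwarding" in particular is the registrar behavior the question warns against.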
Technical SEO | | Scott-Thomas0 -
Meta tag "noindex,nofollow" by accident
Hi, 3 weeks ago I wanted to release a new website (made in WordPress), so I neatly created 301 redirects for all files and folders of my old HTML website and transferred the WordPress site into the index folder. Job well done, I thought, but after a few days my site suddenly disappeared from Google. I read in other Q&As that this could happen, so I waited a little longer, until I finally saw today that a meta robots tag with "noindex, nofollow" had been added to every page. For some reason, the WordPress setting "I want to forbid search engines, but allow normal visitors to my website" was selected, although I never even opened that section, called "Privacy". So my question is, will this have a negative impact on my pagerank afterwards? Thanks, Sven
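For anyone wanting to catch this earlier next time, here's a rough sketch that flags pages carrying a noindex meta robots tag. It's regex-based and assumes the `name` attribute appears before `content`; a proper HTML parser would be safer for production checks:

```python
import re

ROBOTS_META = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    re.I,
)

def is_noindexed(html):
    """Return True if the page's meta robots tag contains 'noindex'."""
    m = ROBOTS_META.search(html)
    return bool(m and "noindex" in m.group(1).lower())
```

Running this against a few key templates right after launch would have surfaced the accidental "Privacy" setting within minutes instead of weeks.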
Technical SEO | | Zitana0 -
Which pages to "noindex"
I have read through the many articles regarding the use of meta noindex, but what I haven't been able to find is a clear explanation of when, why, or what to use it on. I'm thinking that it would be appropriate to use it on: legal pages such as privacy policy and terms of use, search results pages, and blog archive and category pages. Thanks for any insight on this.
Technical SEO | | mmaes0