Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
"5XX (Server Error)" - How can I fix this?
-
Hey Mozers!
Moz Crawl tells me I am having an issue with my WordPress category page - it is returning a 5XX error and I'm not sure why. Can anyone help me determine the issue?
Crawl Issues and Notices for:
http://www.refusedcarfinance.com/news/category/news
We found 1 crawler issue(s) for this page.
High Priority Issues
1
5XX (Server Error)
5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
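If you want to double-check what the crawler saw, you can request the page yourself and look at the status code it returns. Here is a minimal sketch using Python's requests library (an assumption on my part; the URL is simply the one from the report above, so swap in any page you want to test):

```python
import requests

# Page reported by the Moz crawl - replace with any URL you want to verify
url = "http://www.refusedcarfinance.com/news/category/news"

# Use GET rather than HEAD, since some servers answer HEAD requests differently
response = requests.get(url, timeout=10)

print(f"Final URL:   {response.url}")
print(f"Status code: {response.status_code}")

if 500 <= response.status_code < 600:
    print("Server-side (5xx) error - worth checking your hosting and error logs.")
elif response.status_code == 200:
    print("The page responded fine this time - the error may have been temporary.")
```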
-
One more thing I want to add:
Search engines can remove your site from their index after they repeatedly get such a response. To prevent deindexing, use SEO tools to audit your site and detect 5xx errors in time. You can fix errors using three strategies:
- Reload the page to check if the issue was merely momentary.
- Check the error log on the site.
- Consider any changes or upgrades to the system that you have carried out recently and roll them back until the issue is resolved.
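To test the first point quickly, you can re-request the page a few times and watch the status codes. A minimal Python sketch (the URL and the retry count are illustrative, not a prescribed tool):

```python
import time
import requests

url = "http://www.refusedcarfinance.com/news/category/news"  # page from the crawl report

# Re-request the page a few times with a short pause; a one-off 5xx that
# disappears on retry was probably just a momentary hiccup.
for attempt in range(1, 4):
    status = requests.get(url, timeout=10).status_code
    print(f"Attempt {attempt}: HTTP {status}")
    if status < 500:
        print("No server error this time - the issue may have been temporary.")
        break
    time.sleep(5)  # brief pause before the next try
else:
    print("Consistent 5xx responses - check the server error log and recent changes.")
```
-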
A 5xx server error occurs when the server fails to fulfill a valid request.
It’s quite difficult to detect and fix each occurrence of these errors, but search engines don’t like them, especially 500 and 503 errors.
-
- Reload the page in case the issue was merely momentary.
- Check the error log on the site.
- Consider any changes or upgrades to the system that you have carried out recently and roll them back until the issue is resolved.
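For the second point, the server's access log is often the quickest place to see how many 5xx responses are actually being served, and for which URLs. A rough Python sketch; the log path and the Apache-style combined log format are assumptions, so adjust them to your host:

```python
from collections import Counter

# Assumed path - change this to wherever your host keeps the access log
LOG_FILE = "/var/log/apache2/access.log"

status_counts = Counter()
sample_lines = []

with open(LOG_FILE, encoding="utf-8", errors="ignore") as log:
    for line in log:
        parts = line.split('"')
        # In combined log format the status code follows the quoted request
        if len(parts) >= 3:
            status = parts[2].split()[0]
            if status.startswith("5"):
                status_counts[status] += 1
                if len(sample_lines) < 5:
                    sample_lines.append(line.strip())

print("5xx responses by status code:", dict(status_counts))
for line in sample_lines:
    print(" ", line)
```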
-
@asim21 Hi Asim, hope you are well.
What is a 5XX server error?
A 5XX (Server Error) occurs when a valid request was made by the client but the server failed to complete it. The issue therefore lies on the server side, and it should be fixed for the sake of both the performance and the SEO of your website.
How to fix it:
1. Clear cookies and caches:
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, enter your web browser's History and select Delete. To remove the cookies from some devices you may need to check the box next to Cookies before hitting delete. Refresh your page and check to see if the error code presents itself again.
2. Contact your host/server and create a support ticket.
If the problem continues, the best step is to contact your hosting provider or create a support ticket so they can fix the issue on their side. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give an idea of when it may be up and running again.
thanks. -
5xx server errors present a big issue because they directly harm your visitor’s experience. Imagine having an eCommerce store, and your visitors keep receiving these server errors. Do you think they’ll keep trying and eventually purchase something?
No, they’ll go next door and buy from the competition.
And what about search engines like Google? If they receive 5xx server errors just occasionally, then you probably don’t have much to fear. But if the 5xx server errors are persistent, then it’s likely you’ll see a steady decline in organic traffic.
Below we’ll cover why 5xx server errors are bad for SEO, what the most common ones are, how to find out if your site’s server returns them, what causes them, and how to fix them.
In short, we need to fix them.
3 Simple Steps to Clearing a 5XX Error
5XX errors are server-side, meaning the problem is unlikely to lie with your internet connection or device; the error will be with your website's server.
In the unlikely event that there is an issue with something on your end, there are a few simple steps you can take before seeking further help and advice.
1. Refresh your browser
The problem may only be temporary, in which case reloading the page will often prove successful. You can do this by resubmitting the URL from the address bar, or by pressing F5 or Ctrl+R on your keyboard.
2. Remove cookies
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, enter your web browser's History and select Delete. To remove the cookies from some devices you may need to check the box next to Cookies before hitting delete. Refresh your page and check to see if the error code presents itself again.
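If you would rather rule out cookies and cached copies without touching your browser at all, you can make a throwaway request from a clean session. A small illustrative Python sketch (the URL is a placeholder, and the headers simply ask intermediaries not to serve a cached copy):

```python
import requests

url = "https://www.example.com/"  # placeholder - use the page that returns the error

# A brand-new Session carries no stored cookies, and the no-cache headers ask
# any caches along the way to fetch a fresh copy of the page.
with requests.Session() as fresh_session:
    response = fresh_session.get(
        url,
        headers={"Cache-Control": "no-cache", "Pragma": "no-cache"},
        timeout=10,
    )

print(f"Status without any stored cookies: {response.status_code}")
```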
3. Contact your host/server
If the problem continues, the best step is to contact your host or server directly to find out what the problem is. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give an idea of when it may be up and running again.
-
My Search Console account is showing 2k Server Errors for both Desktop & Smartphone. I'm confused about why it is showing these errors, because when I open these URLs in a desktop or mobile browser they work fine.
What does that mean? Do I need to work on it, and how? Is there a problem with Google crawling? Is Google unable to crawl these pages?
If the URL opens without showing any error, does that mean the URL is fine? Can I ignore the Search Console error?
-
Good question! I've seen this happen with a few clients.
Here is my process for reviewing 5xx errors:
- Manually check a handful of reported 5xx URLs and note their status codes (the sketch after this list automates this check). Upon manual inspection, you'll often find that these pages return different status codes (200, 4xx).
- Perform a crawl using Screaming Frog and watch the "Status Codes" report. Watch to see whether URLs go from being crawled as 200 to returning 5xx status codes.
- If this is the case, the issue might be that your hosting can't handle the requests being sent to your site. If not, and your URLs are reported as 200, it might have been a temporary status the crawler found.
- If possible, check your log files to see whether Googlebot is receiving a large number of 5xx errors (see the log-scanning sketch at the end of this answer).
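Here is a small sketch that automates that first manual check: it requests each reported URL twice, once with a browser user agent and once with Googlebot's, since some servers only fail (or block) for bot traffic. The URLs and the browser UA string are placeholders:

```python
import requests

# A few of the URLs reported as 5xx in Search Console or your crawl (examples only)
urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

user_agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for url in urls:
    for label, ua in user_agents.items():
        try:
            status = requests.get(url, headers={"User-Agent": ua}, timeout=10).status_code
        except requests.RequestException as exc:
            status = f"request failed ({exc.__class__.__name__})"
        print(f"{url} [{label}]: {status}")
```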
Overall, if you only find a few 5xx errors in Search Console and your log files, but don't find them when you crawl the site, you're probably OK. However, if a crawl consistently reveals them, or you see a large number in your log files, then it's a high-priority item to flag to your developers. You may need to consider upgrading your hosting or rolling back any recent development changes that were made. Definitely lean on your developer's insights here.
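For the log-file check, here is a rough sketch that measures what share of Googlebot's requests received a 5xx response. The log path and the common/combined log format are assumptions, so adapt it to your server:

```python
import re

LOG_FILE = "/var/log/nginx/access.log"  # assumed path - adjust for your host

googlebot_hits = 0
googlebot_5xx = 0

with open(LOG_FILE, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        googlebot_hits += 1
        # The status code comes right after the quoted request line
        match = re.search(r'"\s(\d{3})\s', line)
        if match and match.group(1).startswith("5"):
            googlebot_5xx += 1

if googlebot_hits:
    share = 100 * googlebot_5xx / googlebot_hits
    print(f"Googlebot requests: {googlebot_hits}, 5xx responses: {googlebot_5xx} ({share:.1f}%)")
else:
    print("No Googlebot requests found in this log file.")
```

A consistently high share here is the signal that it is time to involve your developers or hosting provider.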
-
Hi Maria,
I'm also facing the same issue. Can you guide me on how to resolve it? Since your question is an older one, I assume you have found a solution by now. I'm seeing the same problem on one of my page URLs, and since I also run an Android gaming blog, your solution may well work for me. I'm waiting for your response.
-
Hi,
My Search Console account is showing 2k Server Errors for both Desktop & Smartphone. I'm confused about why it is showing these errors, because when I open these URLs in a desktop or mobile browser they work fine.
What does that mean? Do I need to work on it, and how? Is there a problem with Google crawling? Is Google unable to crawl these pages?
If the URL opens without showing any error, does that mean the URL is fine? Can I ignore the Search Console error?
Is there any issue with server-side rendering? Please help me as I want to fix this ASAP.
Awaiting help on this!
-
Hi there! Sorry to hear you're having trouble.
Our crawl error guide is a great place to start to find more information. You might consider running a Crawl Test on the site to validate the error, as it could possibly have been a temporary issue. If you're still seeing the error crop up, it probably warrants additional investigation.
Usually a 5XX error means an issue on your server, so you might wish to inquire with your web dev about the cause. If you're still not sure what to do, the Q&A forum is generally a great place to look for ideas in resolving some of these issues. Here are some articles that might help: