"5XX (Server Error)" - How can I fix this?
-
Hey Mozers!
Moz Crawl tells me I have an issue with my WordPress category page - it is returning a 5XX error and I'm not sure why. Can anyone help me determine the issue?
Crawl Issues and Notices for:
http://www.refusedcarfinance.com/news/category/news
We found 1 crawler issue(s) for this page.
High Priority Issues: 1
5XX (Server Error)
5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
-
One more thing I want to add:
Search engines can remove your site from their index after they repeatedly get such a response. To prevent deindexing, use SEO tools to audit your site and detect 5xx errors in time. You can fix errors using three strategies:
- Reload the page to check if the issue was merely momentary.
- Check the error log on the site.
- Consider any changes or upgrades to the system that you have carried out recently and roll them back until the issue is resolved.
-
A 5xx server error occurs when the server fails to fulfill a valid request.
It’s quite difficult to detect and fix each occurrence of these errors, but search engines don’t like them, especially 500 and 503 errors.
-
- Reload the page in case the issue was merely momentary.
- Check the error log on the site.
- Consider any changes or upgrades to the system that you have carried out recently and roll them back until the issue is resolved.
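The first step above - checking whether the error is merely momentary - is easy to script. Below is a minimal sketch (Python standard library only; the URL in the comment is a placeholder) that requests a page several times with a short delay and records the status code of each attempt. If a 5xx clears on a retry, the issue was likely transient; a 5xx on every attempt points at a persistent server-side problem:

```python
import time
import urllib.request
import urllib.error

def check_status(url, attempts=3, delay=2):
    """Request the URL several times; return the status code of each attempt."""
    codes = []
    for i in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                codes.append(resp.status)
        except urllib.error.HTTPError as e:
            codes.append(e.code)   # 4xx/5xx responses arrive as HTTPError
        except urllib.error.URLError:
            codes.append(None)     # DNS or connection-level failure
        if i < attempts - 1:
            time.sleep(delay)
    return codes

# e.g. check_status("http://www.refusedcarfinance.com/news/category/news")
```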
-
Hi @asim21, hope you are well.
What is a 5XX server error?
A 5XX (Server Error) occurs when a valid request was made by the client but the server failed to complete the request, so the issue is on the server side. These errors should be fixed for the better performance and SEO of your website.
How to fix it:
1. Clear cookies and caches:
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, open your web browser's history and select Delete. To remove the cookies on some devices you may need to check the box next to Cookies before hitting delete. Refresh your page and check to see if the error code presents itself again.
2. Contact your host/server and create a support ticket.
If the problem continues, the best step is to contact your hosting provider or create a support ticket, and they will fix the issue on their side. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give you an idea of when it may be up and running again.
Thanks.
-
5xx server errors present a big issue because they directly harm your visitors' experience. Imagine having an eCommerce store, and your visitors keep receiving these server errors. Do you think they'll keep trying and eventually purchase something?
No, they’ll go next door and buy from the competition.
And what about search engines like Google? If they receive 5xx server errors just occasionally, then you probably don’t have much to fear. But if the 5xx server errors are persistent, then it’s likely you’ll see a steady decline in organic traffic.
In this article we’ll cover why 5xx server errors are bad for SEO, what the most common 5xx server errors are, how to find out if your site’s server returns them, what causes this, and how to fix them.
So we need to fix it.
3 Simple Steps to Clearing a 5XX Error
5XX errors are server-side, meaning the problem is unlikely to lie with your internet connection or device; the error will be with your website's server.
In the unlikely event that there is an issue with something on your end, there are a few simple steps you can take before seeking further help and advice.
1. Refresh your browser
The problem may only be temporary, so reloading the page will often prove successful. You can do this by resubmitting the URL from the address bar, or by pressing F5 or Ctrl+R on your keyboard.
2. Remove cookies
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, open your web browser's history and select Delete. To remove the cookies on some devices you may need to check the box next to Cookies before hitting delete. Refresh your page and check to see if the error code presents itself again.
3. Contact your host/server
If the problem continues, the best step is to contact your host or server provider directly to find out what the problem is. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give you an idea of when it may be up and running again.
-
My Search Console account is showing 2k Server Errors for both Desktop & Smartphone. I'm confused about why it is showing these errors, as when I open these URLs in a desktop or mobile browser they work fine.
What does that mean? Do I need to work on it, and how? Is there a problem with Google's crawling - can Google not crawl these pages?
If a URL opens without showing any error, does that mean the URL is fine? Can I ignore the Search Console errors?
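One thing worth ruling out when Search Console reports errors you can't reproduce in a browser: a firewall or bot-protection layer that answers Googlebot's user-agent differently from a normal browser. The sketch below (Python standard library; the user-agent strings and URL list are illustrative) fetches a URL with two different User-Agent headers so you can compare status codes. This isn't conclusive - real Googlebot is verified by reverse DNS, and some protections key off IP address rather than user-agent - but a mismatch here is a strong hint:

```python
import urllib.request
import urllib.error

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def status_for_agent(url, user_agent):
    """Return the HTTP status code the server sends for a given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # error statuses (4xx/5xx) are raised as HTTPError

# for url in urls_reported_by_search_console:  # hypothetical list
#     print(url, status_for_agent(url, BROWSER_UA), status_for_agent(url, GOOGLEBOT_UA))
```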
-
Good question! I've seen this happen with a few clients.
Here is my process for reviewing 5xx errors:
- Manually check a handful of reported 5xx URLs and note their status codes. Upon manual inspection, you'll often find that these pages return different status codes (200, 4xx).
- Perform a crawl using Screaming Frog and watch the "Status Codes" report. Watch to see if URLs go from being crawled as 200 to 5xx status codes.
- If this is the case, the issue might be that your hosting can't handle the requests being sent to your site. If not, and your URLs are reported as 200, it might have been a temporary status the crawler found.
- If possible, check your log files to see if Googlebot is returning a large number of 5xx errors.
Overall, if you only find a few 5xx errors in Search Console and your log files but don't find them when you crawl the site, you're probably OK. However, if a crawl consistently reveals them, or you see a large number in your log files, then it's a high-priority item to flag to your developers. You may need to consider upgrading your hosting or rolling back any recent development changes that were made. Definitely lean on your developers' insights here.
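The log-file check in the last step can be scripted too. Here's a rough sketch assuming an Apache/Nginx "combined"-format access log (the log path in the comment is a placeholder); it counts 5xx responses per URL for requests whose user-agent string mentions Googlebot:

```python
import re
from collections import Counter

# Matches the request path and status fields of a combined-format log line, e.g.:
# 66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /news/ HTTP/1.1" 503 512 "-" "Googlebot/2.1"
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) ')

def googlebot_5xx(log_lines):
    """Count 5xx responses per URL for requests claiming a Googlebot user-agent."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        match = LINE_RE.search(line)
        if match and match.group("status").startswith("5"):
            counts[match.group("path")] += 1
    return counts

# with open("/var/log/nginx/access.log") as f:  # placeholder path
#     for path, n in googlebot_5xx(f).most_common(20):
#         print(n, path)
```

Note this only counts requests that *claim* to be Googlebot; for a strict audit, verify the crawler's IPs via reverse DNS.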
-
Hi Maria,
I'm also facing the same issue. Can you guide me on how to resolve it? Since your issue is an old one, I think you must have found a solution by now. I'm facing the same issue on my page URL. I'm waiting for your response, as I also have an Android gaming blog, so maybe your solution will work for me.
-
Hi,
My Search Console account is showing 2k Server Errors for both Desktop & Smartphone. I'm confused about why it is showing these errors, as when I open these URLs in a desktop or mobile browser they work fine.
What does that mean? Do I need to work on it, and how? Is there a problem with Google's crawling - can Google not crawl these pages?
If a URL opens without showing any error, does that mean the URL is fine? Can I ignore the Search Console errors?
Is there any issue with server-side rendering? Please help me, as I want to fix this ASAP.
Awaiting help on this!
-
Hi there! Sorry to hear you're having trouble.
Our crawl error guide is a great place to start to find more information. You might consider running a Crawl Test on the site to validate the error, as it could possibly have been a temporary issue. If you're still seeing the error crop up, it probably warrants additional investigation.
Usually a 5XX error means an issue on your server, so you might wish to inquire with your web dev about the cause. If you're still not sure what to do, the Q&A forum is generally a great place to look for ideas in resolving some of these issues. Here are some articles that might help:
-
Related Questions
-
Page with "random" content
Hi, I'm creating 300+ pages in the near future, on which the content will basically be as unique as it can be. However, upon every refresh, including visits from a search engine referrer, I want the actual content, such as a listing of 12 businesses, to be displayed randomly on every hit. So basically we've got 300+ nearby pages with unique content, and the overview of those "listings", as I might say, is displayed randomly. I've built an extensive script and disabled any caching for the PHP files of these specific pages, and it works. But what about Google? The content of the pages will still be as it is; it is more that the listings are shuffled randomly to give every business listing a fair shot at a click and so on. Anyone have experience with this? I've tried a few things in the past, like a "Last update PHP Month" in the title, which sometimes isn't picked up very well.
Technical SEO | Vanderlindemedia
-
How should I deal with "duplicate" content in an Equipment Database?
The Moz Crawler is identifying hundreds of instances of duplicate content on my site in our equipment database. The database is similar in functionality to a site like autotrader.com. We post equipment with pictures, and our customers can look at the equipment and make purchasing decisions. The problem is that, though each unit is unique, they often have similar or identical specs, which is why Moz (and presumably Google/Bing) is identifying the content as "duplicate". In many cases, the only differences between listings are the pictures and mileage - the specifications and year are the same. Ideally, we wouldn't want to exclude these pages from being indexed because they could have some long-tail search value. But, obviously, we don't want to hurt the overall SEO of the site. Any advice would be appreciated.
Technical SEO | DohenyDrones
-
Both links with ".html" and without are working - is that a problem?
The default format of my URLs ends with ".html", which I know is not a problem. But both links with ".html" and without are working. Is that a critical problem or not, and how do I solve it?
Technical SEO | Mohamed_Samer
-
Missing "Mobile Friendly" Tag in Google
Hi All, I have noticed that Google is not displaying a mobile-friendly tag next to our website (www.wombatwebdesign.com). We made it responsive over a year ago and it is running on Joomla 3.x, as recommended by Google. I have run it through Google's mobile-friendly checking tool and it confirms the site is mobile friendly. So why no mobile-friendly tag? Any ideas gratefully received. Thanks Fraser
Technical SEO | fraserhannah
-
Error in Webmaster Tools
Hi, I just got an error (on 12 pages, specifically) from Webmaster Tools when consulting "indexing problems". It says something like: "The URL doesn't exist, but the server doesn't return a 404 error." What should I do? Many thanks.
Technical SEO | juanmiguelcr
-
Should I do "Article Marketing" for my quotes site?
Hello members, should I do article marketing for my quotes site to build quality backlinks? Will it improve my rankings?
Technical SEO | rimon5693
-
Hosting sitemap on another server
I was looking into XML sitemap generators, and one that seems to be recommended quite a bit on the forums is xml-sitemaps.com. They have a few versions, though. I'll need more than 500 pages indexed, so it is just a case of whether I go for their paid version and install it on our server, or go for their pro-sitemaps.com offering. For pro-sitemaps.com they say: "We host your sitemap files on our server and ping search engines automatically." My question is: will this be less effective from an SEO perspective than installing it on our server, because the sitemap is no longer on our root domain?
Technical SEO | design_man
-
I have a ton of "duplicated content", "duplicated titles" in my website, solutions?
hi and thanks in advance, I have a JomSocial site with 1000 users. It is highly customized, and as a result of the customization some of the pages have 5 or more different types of URLs pointing to the same page. Google has indexed 16,000 links already and the crawl report shows a lot of duplicated content. These links are important for some of the functionality, are dynamically created, and will continue growing. My developers offered to create rules in the robots file so that a big part of these links don't get indexed, but a Google Webmaster Tools post says the following: "Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools." Here is an example of the links: http://anxietysocialnet.com/profile/edit-profile/salocharly http://anxietysocialnet.com/salocharly/profile http://anxietysocialnet.com/profile/preferences/salocharly http://anxietysocialnet.com/profile/salocharly http://anxietysocialnet.com/profile/privacy/salocharly http://anxietysocialnet.com/profile/edit-details/salocharly http://anxietysocialnet.com/profile/change-profile-picture/salocharly So the question is: is this really that bad? What are my options? Is it really a good solution to set rules in robots so that big chunks of the site don't get indexed? Is there any other way I can resolve this? Thanks again! Salo
Technical SEO | Salocharly