"5XX (Server Error)" - How can I fix this?
-
Hey Mozers!
Moz Crawl tells me I am having an issue with my WordPress category page - it is returning a 5XX error and I'm not sure why. Can anyone help me determine the issue?
Crawl Issues and Notices for:
http://www.refusedcarfinance.com/news/category/news
We found 1 crawler issue(s) for this page.
High Priority Issues
1
5XX (Server Error)
5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
-
One more thing I want to add:
Search engines can remove your site from their index after they repeatedly get such a response. To prevent deindexing, use SEO tools to audit your site and detect 5xx errors in time. You can fix errors using three strategies:
- Reload the page to check if the issue was merely momentary (a quick script for this check is sketched below).
- Check the error log on the site.
- Consider any changes or upgrades to the system that you have carried out recently and roll them back until the issue is resolved.
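To make the first check less manual, here is a minimal Python sketch (the URL, retry count, and delay are placeholder assumptions, not part of the original post) that requests the page a few times and reports whether a 5xx response is persistent or only intermittent:

```python
import time
import urllib.error
import urllib.request

def check_status(url, attempts=3, delay=5):
    """Request a URL several times and collect the HTTP status code of each attempt."""
    codes = []
    for i in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                codes.append(resp.status)
        except urllib.error.HTTPError as err:
            codes.append(err.code)   # 4xx / 5xx responses raise HTTPError
        except urllib.error.URLError:
            codes.append(None)       # connection-level failure, no HTTP status
        if i < attempts - 1:
            time.sleep(delay)
    return codes

if __name__ == "__main__":
    # URL taken from the question above; swap in the page your crawl flagged
    url = "http://www.refusedcarfinance.com/news/category/news"
    codes = check_status(url)
    print(url, codes)
    server_errors = [c for c in codes if c is not None and c >= 500]
    if server_errors and len(server_errors) == len(codes):
        print("Persistent 5xx - check the server logs or contact your host.")
    elif server_errors:
        print("Intermittent 5xx - possibly load spikes or a flaky plugin/host.")
    else:
        print("No 5xx seen now - the crawler may have hit a momentary error.")
```

If every attempt comes back 5xx it points to a genuine server-side problem; a mix of 200s and 5xxs suggests the server is struggling under load or during maintenance windows.
-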
A 5xx server error occurs when the server fails to fulfill a request.
It’s quite difficult to detect and fix each occurrence of these errors, but search engines don’t like them, especially 500 and 503 errors.
-
- Reload the page in case the issue was merely momentary.
- Check the error log on the site (a log-parsing sketch follows this list).
- Consider any changes or upgrades to the system that you have carried out recently and roll them back until the issue is resolved.
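If you have shell or FTP access to the server, the "check the error log" step can be made concrete with a small script. Below is a rough Python sketch, assuming a standard combined-format access log at a hypothetical path (adjust it to wherever your host stores logs):

```python
import re
from collections import Counter

# Matches the request path and status code in a combined-format access log line, e.g.
# 1.2.3.4 - - [12/Dec/2024:10:00:00 +0000] "GET /news/category/news HTTP/1.1" 503 1234
LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def count_5xx(logfile_path):
    """Return a Counter of request paths that produced 5xx responses."""
    errors = Counter()
    with open(logfile_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_LINE.search(line)
            if match and match.group("status").startswith("5"):
                errors[match.group("path")] += 1
    return errors

if __name__ == "__main__":
    # Hypothetical log path - adjust to wherever your host stores access logs
    for path, hits in count_5xx("/var/log/apache2/access.log").most_common(20):
        print(f"{hits:5d}  {path}")
```

Running it right after Moz or Search Console reports 5xx errors shows which URLs are actually failing and how often, which is useful evidence to hand to your host or developer.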
-
@asim21 Hi Asim, hope you are well.
What is a 5XX server error?
A 5XX (Server Error) occurs when a valid request was made by the client but the server failed to complete it, so the issue is on the server side. This error should be fixed for the better performance and SEO of your website.
How to fix it:
1. Clear cookies and caches:
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, enter your web browser's History and select Delete. To remove the cookies from some devices you may need to check the box next to Cookies before hitting delete. Refresh your page and check to see if the error code presents itself again.
2. Contact your host/server and create a support ticket.
If the problem continues, the best step is to contact your hosting provider or create a support ticket so they can fix the issue on their side. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give an idea of when the site may be up and running again.
Thanks.
-
5xx server errors present a big issue because they directly harm your visitors' experience. Imagine having an eCommerce store where your visitors keep receiving these server errors. Do you think they'll keep trying and eventually purchase something?
No, they’ll go next door and buy from the competition.
And what about search engines like Google? If they receive 5xx server errors just occasionally, then you probably don’t have much to fear. But if the 5xx server errors are persistent, then it’s likely you’ll see a steady decline in organic traffic.
So you need to find out whether your site's server returns 5xx errors, what causes them, and how to fix them. The steps below cover the basics.
3 Simple Steps to Clearing a 5XX Error
5XX errors are server-side, meaning the problem is unlikely to lie with your internet connection or device; the error is on your website's server.
In the unlikely event that there is an issue with something on your end, there are a few simple steps you can take before seeking further help and advice.
1. Refresh your browser
The problem may only be temporary, so reloading the page will often prove successful. You can do this by resubmitting the URL from the address bar, or by pressing F5 or Ctrl-R on your keyboard.
2. Remove cookies
Sometimes 5xx errors are due to the cookies related to the website, so deleting these and refreshing the browser can often solve the problem. To do this, enter your web browser's History and select Delete. To remove the cookies from some devices you may need to check the box next to Cookies before hitting delete. Refresh your page and check to see if the error code presents itself again.
3. Contact your host/server
If the problem continues, the best step is to contact your host or server directly to find out what the problem is. Chances are they are already on top of it or undergoing maintenance, but this will help put your mind at ease and give an idea of when it may be up and running again.
-
5XX (Server Error) 5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
Usually a 5XX error means an issue on your server, so you might wish to inquire with your web dev about the cause. If you're still not sure what to do, the Q&A forum is generally a great place to look for ideas in resolving some of these issues.
-
My Search Console account is showing 2k Server Errors for both Desktop & Smartphone. I'm confused about why it is showing these errors, because when I open these URLs in a desktop or mobile browser they work fine.
What does that mean? Do I need to work on it, and how? Is there a problem with Google's crawling? Can Google not crawl these pages?
If a URL opens without showing any error, does that mean the URL is fine? Can I ignore the Search Console error?
-
Good question! I've seen this happen with a few clients.
Here is my process for reviewing 5xx errors:
- Manually check a handful of reported 5xx URLs and note their status codes. Upon manual inspection, you'll often find that these pages return different status codes (200, 4xx). A rough script for this check is sketched after this answer.
- Perform a crawl using Screaming Frog and watch the "Status Codes" report. Watch to see if URLs go from being crawled as 200 to 5xx status codes.
- If this is the case, the issue might be that your hosting can't handle the requests being sent to your site. If not, and your URLs are getting reported as 200, it might have been a temporary status the crawler found.
- If possible, check your log files to see if Googlebot is returning a large number of 5xx errors.
Overall, if you only find a few 5xx errors in Search Console and your log files but don't find them when you crawl the site, you're probably OK. However, if a crawl consistently reveals them, or you see a large number in your log files, then it's a high-priority item to flag to your developers. You may need to consider upgrading your hosting or rolling back any recent development changes that were made. Definitely lean on your developer's insights here.
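Here is a rough Python sketch of that first manual check (the URL list and user-agent strings are illustrative assumptions). It fetches each reported URL twice, once with a browser user agent and once with a Googlebot user agent, so you can spot cases where the server treats the crawler differently from a normal visitor:

```python
import urllib.error
import urllib.request

USER_AGENTS = {
    # Illustrative user-agent strings - swap in whatever you want to test
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def fetch_status(url, user_agent):
    """Return the HTTP status code the server sends for this user agent."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(request, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code            # 4xx / 5xx responses land here
    except urllib.error.URLError:
        return None                # connection-level failure, no HTTP status

if __name__ == "__main__":
    # Hypothetical list - paste in the URLs reported as 5xx by Search Console or Moz
    urls = [
        "https://www.example.com/",
        "https://www.example.com/category/news/",
    ]
    for url in urls:
        results = {name: fetch_status(url, ua) for name, ua in USER_AGENTS.items()}
        note = " <-- crawler sees something different" if len(set(results.values())) > 1 else ""
        print(url, results, note)
```

If the two results differ for a URL, that is a strong hint the 5xx errors Search Console reports are real for Googlebot even though the pages look fine in your browser.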
-
Hi Maria,
I'm also facing the same issue on one of my page URLs. Since your question is a bit older, I assume you may have found a solution by now - could you share how you resolved it? Your fix may work for my site as well.
-
Hi,
My Search Console account is showing 2k Server Errors for both Desktop & Smartphone. I'm confused about why it is showing these errors, because when I open these URLs in a desktop or mobile browser they work fine.
What does that mean? Do I need to work on it, and how? Is there a problem with Google's crawling? Can Google not crawl these pages?
If a URL opens without showing any error, does that mean the URL is fine? Can I ignore the Search Console error?
Is there any issue with server-side rendering? Please help me as I want to fix this ASAP.
Awaiting help on this!
-
Hi there! Sorry to hear you're having trouble.
Our crawl error guide is a great place to start to find more information. You might consider running a Crawl Test on the site to validate the error, as it could possibly have been a temporary issue. If you're still seeing the error crop up, it probably warrants additional investigation.
Usually a 5XX error means an issue on your server, so you might wish to inquire with your web dev about the cause. If you're still not sure what to do, the Q&A forum is generally a great place to look for ideas in resolving some of these issues.
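If you'd like to approximate a quick crawl test yourself, here is a very rough Python sketch (the start URL and page cap are placeholders, and a dedicated crawler such as Moz's Crawl Test or Screaming Frog is far more thorough). It follows internal links breadth-first and prints any page that returns a 5xx or fails to respond:

```python
import urllib.error
import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collects href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=50):
    """Breadth-first crawl of internal links, recording each page's status code."""
    domain = urlparse(start_url).netloc
    seen, queue, statuses = {start_url}, deque([start_url]), {}
    while queue and len(statuses) < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                statuses[url] = resp.status
                body = resp.read().decode("utf-8", errors="replace")
        except urllib.error.HTTPError as err:
            statuses[url] = err.code      # includes 5xx responses
            continue
        except urllib.error.URLError:
            statuses[url] = None          # no HTTP response at all
            continue
        parser = LinkParser()
        parser.feed(body)
        for href in parser.links:
            link = urljoin(url, href).split("#")[0]   # drop fragments
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return statuses

if __name__ == "__main__":
    # Placeholder start URL - replace with your own homepage
    for url, status in crawl("https://www.example.com/").items():
        if status is None or status >= 500:
            print(status, url)
```

Re-running a check like this a few times over a day helps distinguish persistent server problems from momentary blips.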