Best practices for fixing 500 errors
-
How should I deal with 500 errors?
-
Hi Josh,
If you have a list of URLs that always return a 500 error, the problem is a bug in the code.
You should send the list of URLs to your developer and ask them to fix it. Often, once a bug is identified and fixed, it will correct all or most of the errors.
If you check the URLs and find that the 500 errors are intermittent (sometimes OK, sometimes returning an error), take a look at this question I answered about random 500 errors.
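A quick way to tell the two cases apart is to request each URL a few times and compare status codes. The sketch below uses only the Python standard library; the retry count, labels, and function names are my own assumptions for illustration, not part of any Moz tool.

```python
# Illustrative checker: request each URL a few times to separate
# consistent 500s (likely a code bug) from intermittent ones.
import urllib.request
import urllib.error

def status_of(url: str) -> int:
    """Return the HTTP status code for a single GET request (0 on network failure)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses raise HTTPError but carry a status code
    except urllib.error.URLError:
        return 0  # DNS/connection failure, no HTTP status at all

def classify(urls, attempts=3, fetch=status_of):
    """Label each URL 'always-500', 'intermittent-500', or 'ok'."""
    report = {}
    for url in urls:
        codes = [fetch(url) for _ in range(attempts)]
        if all(c == 500 for c in codes):
            report[url] = "always-500"
        elif any(c == 500 for c in codes):
            report[url] = "intermittent-500"
        else:
            report[url] = "ok"
    return report
```

The "always-500" list is what you would hand to the developer; the "intermittent-500" list points at server load or hosting issues instead.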
Hope that helps,
Sha
-
The CMS is MODX. The developer has not given me access to the .htaccess file because there is a redirector set up in the CMS (BS!).
I have a list of 135 URLs returning 500 errors, found via Open Site Explorer. What could be malicious about the 500 errors?
-
It depends on what framework or CMS you are using. Do you have any more info? Check your .htaccess file for any malicious code.
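If you can get a copy of the .htaccess file, a rough first pass is to list the directives that injected redirect code typically relies on. This is only a heuristic sketch; the pattern list is my own assumption, and a match only flags a line for manual review, it is not proof of anything malicious.

```shell
# Heuristic sketch: print .htaccess lines containing directives that
# injected redirect code commonly uses. Patterns are assumptions; a
# match flags a line for manual review, nothing more.
scan_htaccess() {
  grep -nEi 'RewriteRule|RewriteCond|Redirect(Match)?|ErrorDocument|base64|eval' "$1"
}
```

Run it as `scan_htaccess /path/to/.htaccess` and review each flagged line by hand; legitimate rewrite rules will match too, so context matters.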
Related Questions
-
Google Algorithm non-manual penalty. How do we fix this (quality?) drop?
Hi, see attached image. We received a non-manual penalty on March 22, 2015, and I don't think we ever came out of it. We have moved up due to the Penguin update, but we should (by DA/PA) be on the first page for tons of queries, and most keywords rank lower than their true strength. What kind of quality errors could be causing this? I assume it was a quality update. I am working on the errors but don't see anything severe enough to be penalized. What errors/quality problems am I looking for? We have tons of unique content, good backlinks, good design, and a good user experience except for some products. Again, what am I looking for? Thanks. (Attachment: non-manual-penalty.png)
White Hat / Black Hat SEO | BobGW
-
Best URL structure for SEO for a Malaysian/Singapore site on a .com.au domain
Hi there, I know ideally I need a .my or .sg domain; however, I don't have time to do this in the interim, so what would be the best way to host Malaysian content on a www.domainname.com.au website?
www.domainname.com.au/en-MY
www.domainname.com.au/MY
domainname.com.au/malaysia
malaysia.domainname.com.au
my.domainname.com.au
I'm assuming this can't make the .com.au site look spammy, but thought I'd ask just to be safe. Thanks in advance! 🙂
White Hat / Black Hat SEO | IsaCleanse
-
Suggest the best link-building plan for a small static website
Hi everyone, can anyone suggest a clear off-page link-building schedule? Ours is a 24-page website and we'd like to plan off-page activity such as bookmarking, classifieds, directories, and so on. How many links should we post, and with how much of a time gap between them? For example: 15 bookmarking links, 10 classified links, one article submission per week, and after one week the same cycle repeats.
White Hat / Black Hat SEO | dineshmap
-
Avoiding the "sorry we have no imagery here" G-maps error
Hi there, we recently did a redesign on a big site and added Google Maps locations to almost every page; since we are in real estate, our listings, detail pages, and search results all have a map embedded. Looking at GWT, I found that the top keywords on our site (which is in Spanish) are the following: have, here, imagery, sorry. After a quick search I found out this is a Google Maps bug: when Googlebot accesses the pages, the map throws out an error with this text repeated several times. If you search for "sorry we have no imagery here" you will see lots of sites with this issue. My question is: is this affecting our overall SEO, since bots are actually crawling and indexing this text (hence it being reported by GWT)? Should I cloak this from robots? Has anyone noticed this or been able to fix it? Thanks in advance!
White Hat / Black Hat SEO | makote
-
Fix Bad Links in Google
I have a client who had some grey-hat SEO done in the past, and some of their backlinks aren't from the best neighborhoods. Google didn't seem to mind until 9/28, when they literally disappeared for all searches except their domain name. Google still has the site indexed; it's just not showing up. There are no messages in Webmaster Tools. I know Bing has a tool where you can disavow bad links and ask that they be discounted. Google doesn't have such a tool, so what is the strategy when you don't have control over the link sources, such as blog comments? Could this have been a delayed ranking change from the latest Penguin update on the 18th? http://www.seomoz.org/google-algorithm-change Any advice would be greatly appreciated. Thanks, Tom
White Hat / Black Hat SEO | TomBristol
-
How best to do location-specific pages for ecommerce post-Panda update
Hi, we have an ecommerce site and currently we have a problem with duplicate content. We created location-specific landing pages for our product categories, which initially did very well until the recent Google Panda update caused a big drop in rankings and traffic. For example:
http://xxx.co.uk/rent/lawn-mower/London/100
http://xxx.co.uk/rent/lawn-mower/Manchester/100
Much of the content on these location pages is the same or very similar, apart from a different H1 tag, title tag, and in some cases slight variations in the on-page content. Given that these items can be hired from 200 locations, it would take years to write unique content for every location in each category. We did this originally in April because we can't compete nationally; we found it easier to compete locally, hence the creation of the location pages, and they did do well for us until now. My question is: since the last Google Panda update, our traffic has dropped 40% and rankings have gone through the floor, and we are stuck with this mess. Should we get rid of (301) all of the location-specific pages for each category, or keep, say, the 10 most popular city locations and noindex/nofollow or 301 the rest? What would people recommend? The only approach I can see others using with multiple locations is a store-finder type page, but you can't rank for the individual product/category that way. If anyone has any advice or good examples of sites that use a good location-specific URL method, please let me know. Thanks, Sarah
White Hat / Black Hat SEO | SarahCollins
-
Best way to handle an expired ad in a classifieds site
I don't think there is a definitive answer to this, but it's worth the discussion: how should you handle an expired ad on a classifieds/auction site? Michael Gray mentioned you should 301 it to its category page, and I'm inclined to agree with him. But some analysts say you should return a "product/ad expired" page with a 404. For the user, I think the 404 approach is best, but from an SEO perspective that means throwing link juice away. What if I 301 them from the ad and show a message explaining why they're seeing the listing page instead of the product page? Thoughts?
White Hat / Black Hat SEO | mirum_agency
-
Which of these elements are good / bad link building practices?
Hi, I need some help. I recently got help with an SEO project from a contractor. He did 50 directory submissions and 50 article submissions. I got good results, going up about 20 places (still a long way from the first page!) on google.co.uk for a tough keyword. Since that project I've learned article marketing is not cool, so I am wondering what I should do next. The contractor has proposed a new, bigger project consisting of the elements listed below. I don't know which of these elements are OK and which aren't. If they are not OK, are they 1) a waste of time or 2) something I could get penalized for? Let me know what you think. Thanks, Andrew
100 article submissions (approved articles): 1 article submitted to 100 article directories
50 press release submissions (approved & screenshots): 1 PR written and submitted to the top 50 PR distribution sites
150 private blog submissions (approved articles): 1 article submitted to 150 private blogs
100 website directory submissions: 1 URL (home page) submitted to 100 top free web directories
50 social bookmarking (confirmed links): 1 URL submitted to the top 50 social bookmarking websites
40 profile backlinks (confirmed links): 1-3 URLs submitted to create 40 profile pages
50 search engines: submission to all the major search engines
20 news websites: ping all links from reports to news websites
White Hat / Black Hat SEO | fleurya