HTTP 500 Internal Server Error, Need help
-
Hi,
For a few days now, Google's crawlers have been getting 500 errors from our dedicated server whenever they try to crawl the site. Using the "Fetch as Google" tool under Health in Webmaster Tools, I get "Unreachable page" every time I fetch the homepage. Here is exactly what the Google crawler is getting:
<code>
HTTP/1.1 500 Internal Server Error
Date: Fri, 21 Jun 2013 19:52:27 GMT
Server: Apache/2.2.15 (CentOS)
X-Powered-By: PHP/5.3.3
X-Pingback: http://www.communityadvocate.com/xmlrpc.php
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=UTF-8
</code>
My URL is http://www.communityadvocate.com
and here's the screenshot from Google Webmaster Tools: http://screencast.com/t/FoWvqRRtmoEQ
How can I fix that?
Thank you
-
Check your .htaccess file to see if something is going wrong there.
Another possibility is that something is wrong in the robots.txt file. It could also be a server issue, but since we users can see the page just fine, I'm thinking it's either the .htaccess or robots.txt file.
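Since regular visitors can load the page but Googlebot gets a 500, it's also worth checking whether the server responds differently based on the user agent. A minimal check from the command line (assumes you have curl available; the Googlebot user-agent string below is just the standard published one):
<code>
# Fetch the headers as a normal client, then again pretending to be Googlebot
curl -I http://www.communityadvocate.com/
curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://www.communityadvocate.com/
</code>
If the second request comes back 500 while the first returns 200, something in .htaccess (or a security module such as mod_security) is keying on the user agent.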
Let me know if you fix it, and if not I'll try to help some more.
-
I crawled the first 1,000 links on your site and did not come across any 500 errors. I did find you have a huge number of unnecessary redirects, as well as a lot of pages with more than one canonical tag in them.
I would suggest that the 500 errors are either on pages I did not crawl or are intermittent; if intermittent, then it is probably the server.
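If you want to repeat that kind of spot check yourself, here is a rough sketch in Python of what it looks like (assumes the requests library is installed; the URL list is just a placeholder you would fill in from your sitemap or a crawl export):
<code>
import re
import requests

# Placeholder list -- fill this in from your sitemap or a crawl export
urls = [
    "http://www.communityadvocate.com/",
]

for url in urls:
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue

    # Count canonical tags in the returned HTML
    canonicals = re.findall(r'<link[^>]+rel=["\']canonical["\']', resp.text, re.IGNORECASE)

    notes = []
    if resp.status_code >= 500:
        notes.append("server error")
    if len(canonicals) > 1:
        notes.append(f"{len(canonicals)} canonical tags")

    print(url, resp.status_code, "; ".join(notes))
</code>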
-
I am not a programmer, but I'll do my best to answer.
1) Check your .htaccess. Your site shouldn't suddenly be giving 500 errors out of nowhere.
2) Maybe it's the settings on the server. Did you just move the site? If so, do a little more digging into what the problem could be. Your programmer can take care of this. If you are with Rackspace, contact them via Chat and they can help you out.
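One quick way to rule .htaccess in or out, if you have shell access (the path below is a placeholder for your document root, and this briefly disables your rewrite rules, so pick a quiet moment):
<code>
cd /path/to/your/document/root   # placeholder -- adjust to where the site lives
mv .htaccess .htaccess.off       # Apache stops reading the file immediately
# ...re-run Fetch as Google, then put it back:
mv .htaccess.off .htaccess
</code>
If the fetch succeeds with the file out of the way, the culprit is one of the rules inside it.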
Related Questions
-
HTTP to HTTPS redirection issue
Hi, I have a website on HTTP, but I have now moved to HTTPS. When I apply a 301 redirect from HTTP to HTTPS and check in SEMrush, it says it is unable to connect to the HTTPS version, and other tools show similar errors. When I remove the redirect, all the other tools work fine, but my HTTPS version doesn't get indexed in Google. Can anybody help with what the issue could be?
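For reference, a minimal sketch of the kind of .htaccess rule usually used for this redirect (assumes Apache with mod_rewrite; your existing rules may differ):
<code>
RewriteEngine On
# Send any plain-HTTP request to the same URL on HTTPS with a permanent redirect
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
</code>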
Technical SEO | | dhananjay.kumar10 -
Homepage 301 and SEO Help
Hi All, Does redirecting alternate versions of my homepage with a 301 only improve reporting, or are there SEO benefits as well? We recently changed over our servers and this wasn't set up as it was before, and I've noticed a drop in our organic search traffic, i.e. there was no 301 sending mywebsite.com traffic to www.mywebsite.com. Thanks in advance for any comments or help.
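For context, a minimal sketch of the non-www to www redirect that was presumably lost in the server move (assumes Apache with mod_rewrite; mywebsite.com stands in for the real domain):
<code>
RewriteEngine On
# Redirect bare-domain requests to the www version with a 301
RewriteCond %{HTTP_HOST} ^mywebsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mywebsite.com/$1 [L,R=301]
</code>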
Technical SEO | | b4cab0 -
What could be the cause of this duplicate content error?
I only have one index.htm and I'm seeing a duplicate content error. What could be causing this? (screenshot attached: IUJvfZE.png)
Technical SEO | | ScottMcPherson1 -
Are 404 Errors a bad thing?
Good morning... I am trying to clean up my e-commerce site, and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use, or if for some reason one of them is still being spidered by Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you think I should be 404'ing them or adding them to robots.txt. Thanks
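If the old category pages are gone for good, one option worth sketching is answering for them explicitly rather than leaving a generic 404 — a minimal Apache example, where /old-category/ is a hypothetical placeholder for the retired category path:
<code>
# mod_alias: return 410 Gone for anything under the retired category path
Redirect gone /old-category/
</code>
A 410 signals the removal is deliberate, whereas a plain 404 isn't harmful either; it just tends to get re-crawled for longer.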
Technical SEO | | Prime850 -
Ranking Internationally
What's the best URL strategy to rank for one term in different countries? We currently rank well in Australia, but are tackling many countries over the next few months. The page I am ranking is http://www.spendbitcoins.com/buy for the term 'buy bitcoins' or 'buy bitcoins in Australia' (replace Australia with each specific country, i.e. New Zealand, Japan, etc.). These are the strategies I have come up with; is one of these good, or is there a different, better way?
1) Use country-specific TLDs, pulling the content from the same site with translations
2) Use country-specific TLDs and create separate sites with completely original content
3) Use country-specific TLDs, forwarding to the primary domain
4) Use a subdomain from the same site
5) Modify the page link to be something like spendbitcoins.com/buy/new-zealand
6) Don't worry about any of this and just get links from sites with the proper TLD
Technical SEO | | jaychristopher0 -
Do links from the same server have value or not?
Hi Guys, Some time ago an SEO expert told me that if I get links from the same IP address, Google doesn't count them as having much value. For example, I am a web developer and I host all my clients' websites on one server and link them back to me. I'm wondering whether those links have any value when it comes to SEO, or should I consider getting different hosting providers? Regards, Uds
Technical SEO | | Uds0 -
Too Many Internal Links?
Hi Guys, I'm completing an overhaul of our website at the moment, after a certain Penguin update killed our site for our main keyword. I'm currently working on our internal linking, as most of our blog posts have a link back to our home page with the main money keyword. At present we have 3,331 internal links and our site has only 1,000 pages. Can you get penalised for having too many internal links with exact-match anchors? Thanks, Scott
Technical SEO | | ScottBaxterWW0 -
Help needed with robots.txt regarding WordPress!
Here is my robots.txt from Google Webmaster Tools. These are the pages that are being blocked, and I am not sure which of these rules to get rid of in order to unblock blog posts from being searched: http://ensoplastics.com/theblog/?cat=743 and http://ensoplastics.com/theblog/?p=240. These category pages and blog posts are blocked, so do I delete the /? ... I am new to SEO and web development, so I am not sure why the developer of this robots.txt file would block pages and posts in WordPress. It seems to me that the whole reason someone has a blog is so it can be searched and get more exposure for SEO purposes. Is there a reason I should block any pages contained in WordPress?
<code>
Sitemap: http://www.ensobottles.com/blog/sitemap.xml

User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
Disallow: /page/

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /trackback
Disallow: /comments
Disallow: /feed
</code>
Technical SEO | | ENSO0
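For what it's worth, the two rules that match URLs like http://ensoplastics.com/theblog/?p=240 and ?cat=743 are "Disallow: /?" and "Disallow: /*?". A minimal sketch of the Googlebot section with just those two lines removed, assuming you want the query-string permalinks crawlable:
<code>
User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /page/
# "Disallow: /?" and "Disallow: /*?" removed -- those are what block the ?p= and ?cat= URLs
</code>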