Can an incorrect 301 redirect or .htaccess code cause 500 errors?
-
Google Webmaster Tools is showing the following message:
_Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request._
Before I contact the person who manages the server and hosting (essentially asking if the error is on his end), is there a chance I could have created an issue with an incorrect 301 redirect or other code added incorrectly to .htaccess?
Here is the 301 redirect code I am using in .htaccess:
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index\.html|default\.asp)\ HTTP/
RewriteRule ^(([^/.]+/)*)(index|default) http://www.example.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} !^(www\.example\.com)?$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
Could adding the following code after that in the .htaccess potentially cause any issues?
# BEGIN EXPIRES
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 10 days"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/plain "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 week"
ExpiresByType image/x-icon "access plus 1 year"
</IfModule>
# END EXPIRES
(Edit) I'd like to add that there is a WordPress blog on the site too, at www.example.com/blog, with the following code in its .htaccess:
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
# END WordPress
Thanks
-
Just to follow up on your last question about 404s, Kim...
No, having a bunch of 404s like that is no more work for the server than if those requests were landing on actual blog pages. In fact it's somewhat less work, since a 404 page generally has less content and far fewer database calls.
Also, a page timing out due to server load (the server working too hard) doesn't generally result in a 500 error; it just returns a timeout error. 500 errors are delivered when something actually breaks the server's ability to deliver the correct page content.
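To see that distinction in practice, here's a minimal Python sketch (the URL is a placeholder, and it assumes the requests library is installed) that reports whether a request comes back with a 5xx status or simply times out:

import requests

url = "http://www.example.com/"  # placeholder URL

try:
    # Use a short timeout so an overloaded server fails fast
    response = requests.get(url, timeout=10)
    if response.status_code >= 500:
        print(f"Server error: HTTP {response.status_code}")
    else:
        print(f"OK: HTTP {response.status_code}")
except requests.exceptions.Timeout:
    print("Request timed out (server overloaded or unresponsive)")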
Paul
-
Wow, you are very quickly and easily making me much better at what I do. :) Thanks for that.
I actually just updated the code a couple of days ago by adding the Expires code and fixing the redirect. Maybe the previous double 301 redirect could be the culprit? Or, something I mentioned in another question: there were a ton of 404s because of a blog that wasn't redirected to the /blog subdirectory correctly, which I fixed recently. Could something like that cause the server to work too hard and return a 500 server error?
I'll definitely check out the logs and Pingdom.
Great information and advice.
-
Sorry - and to be clear about your .htaccess testing question - no, there's no "tool" I've ever heard of. You test it by doing exactly what you've done: ensuring that pages respond correctly and with correct headers. Then you implement a monitoring system to ensure you know every time that correct behaviour fails. That way you can get the site back up quickly, and have a record of when and how often it happened so you can properly troubleshoot if there's an issue.
Three troubleshooting steps:
- become aware as soon as there is a problem
- fix the problem asap to minimize impact on users
- investigate and fix the root cause so it doesn't happen again.
All of these steps depend on a monitoring/alerting system; otherwise you'll always be behind the curve and/or working in the dark.
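As a rough illustration of what such monitoring does, here is a bare-bones Python sketch. The URLs and expected text snippets are placeholders, and a real setup would run on a schedule and send alerts rather than print:

import requests

# Placeholder pages, each paired with a text snippet the page should
# contain (the same idea as a Pingdom-style content check)
PAGES = {
    "http://www.example.com/": "Welcome",
    "http://www.example.com/blog/": "Recent Posts",
}

def check_pages():
    for url, snippet in PAGES.items():
        try:
            response = requests.get(url, timeout=10)
            if response.status_code != 200:
                print(f"ALERT: {url} returned HTTP {response.status_code}")
            elif snippet not in response.text:
                print(f"ALERT: {url} is missing expected text '{snippet}'")
            else:
                print(f"OK: {url}")
        except requests.RequestException as exc:
            print(f"ALERT: {url} failed: {exc}")

if __name__ == "__main__":
    check_pages()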
Hope that helps?
Paul
-
Great answer, Paul.
-
As far as I understand, Kimberly, you've only changed the .htaccess in the last day or two, in which case the server error would have been from before your updates.
As far as monitoring - you can check the server error logs (via FTP or in cPanel if that's what the hosting account uses) to check for frequent 500-level server errors.
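Since the access log records the status code of every response, a quick sketch like the following will tally the 5xx responses. The log path and format here are assumptions; hosts vary:

import re
from collections import Counter

# Assumes a standard Apache common/combined log format; adjust the
# path to wherever your host exposes the access log
LOG_PATH = "access_log"

# The status code is the first 3-digit number after the quoted request,
# e.g. '... "GET /page HTTP/1.1" 500 1234 ...'
pattern = re.compile(r'"\s(\d{3})\s')

status_counts = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = pattern.search(line)
        if match:
            status_counts[match.group(1)] += 1

for status, count in sorted(status_counts.items()):
    if status.startswith("5"):
        print(f"HTTP {status}: {count} responses")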
In addition, I strongly recommend that all commercial sites have uptime monitoring in place. I like to use Pingdom's paid basic plan, which allows monitoring of up to 10 pages. I then select a number of relevant pages and set the tool to test each page, and to check for an actual text snippet on each page (using their custom settings). I monitor things like the home page, the blog home page, a blog post, a blog category page, and critical call-to-action pages. Basically, different types of page templates that might respond differently to server issues, plus critical money-making pages.
This way, Pingdom will alert you immediately any time those pages don't respond normally (like when the server gives back a 500 error or goes unresponsive due to overload). Monitoring these pages every minute is the ONLY way to really know whether your server and website software are performing properly and consistently. This is a critical component of any professionally run website, in my opinion.
Often Pingdom confirms that things are running fine, but I literally can't count the number of times I've instituted uptime monitoring for new clients, only to find the site has huge downtime no one was really aware of, because they just aren't on their own site often enough to know when it's down. (And you certainly shouldn't be relying on customers to inform you the site has issues. By then it's FAR too late.)
Paul
P.S. There are certainly other uptime monitoring systems out there; some are even free. I recommend Pingdom because I've used it for years and it's been consistently excellent. Also, it allows for per-minute checks instead of every 5 minutes, and can check for actual page content, not just the server response. In addition, when it finds an outage, it runs a root cause analysis, so it would actually tell you that a 500 error caused the check failure (as opposed to the server timing out, which is a different problem). No other affiliation.
-
Paul - Thanks for a new way to check and understand all this.
So, if I was able to visit the page just fine both normally and after setting the user agent to Googlebot, then I should be good? I never saw a 500 server error while visiting the page, just in Webmaster Tools. It was dated 2 days ago, but there have been other server error warnings over the past month or two in GWT, so maybe it is a resolved issue.
Can you suggest a method to confirm the overall proper functioning of the .htaccess code? Is there a tool you use to validate your .htaccess code? I checked response headers in Firebug and found all 200 OKs, plus 304s for images (from the Expires headers, I assume), so to my amateur viewpoint it looks good. I just don't want to tank the site unwittingly, obviously.
-
To note, Kimberly - Webmaster Tools keeps a historical record of issues. It may be showing you a server error that occurred in the past but is no longer a problem. The easiest way to check is to test the URL it's reporting as having problems.
First test by visiting the URL using a regular browser. Then revisit using a regular browser, but with the user-agent set to imitate the Googlebot crawler since it's Googlebot that's reporting the error. (You can do this using the Set User Agent tool built into the Moz Firefox toolbar, or others. It's a critical capability to have for many purposes.) It's possible for the Googlebot to have issues even if a regular visitor sees none, so you want to test for both.
Assuming these tests return the 500 server error, temporarily rename the pertinent .htaccess file, then go back and rerun the tests. If the error goes away with the .htaccess disabled, you know the source of the problem lies in the .htaccess rules. If the problem persists, you can be pretty certain the .htaccess isn't causing it.
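If you'd rather script that comparison, here's a minimal Python sketch. The URL is a placeholder for the page being flagged, and the Googlebot string below is one of Google's published user-agents:

import requests

URL = "http://www.example.com/"  # placeholder: use the URL GWT is flagging

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, agent in USER_AGENTS.items():
    # allow_redirects=False shows the raw status (301, 500, etc.)
    # instead of following the redirect chain
    response = requests.get(URL, headers={"User-Agent": agent},
                            allow_redirects=False, timeout=10)
    print(f"{name}: HTTP {response.status_code}")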
Make sense?
Paul
-
Kimberly,
It can, but without knowing which 5xx error it is, it's harder to diagnose. (Is it an endless redirect loop, or something else?)
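One way to check for a redirect loop from the outside is to follow the chain by hand; here's a rough Python sketch with a placeholder starting URL. Note that a purely internal rewrite loop won't show up this way: Apache just returns a 500 and logs a "Request exceeded the limit of 10 internal redirects" message in the error log.

from urllib.parse import urljoin

import requests

url = "http://example.com/"  # placeholder starting URL
seen = set()

# Follow redirects manually so a loop or an overly long chain is visible
for hop in range(10):
    if url in seen:
        print(f"Redirect loop detected at {url}")
        break
    seen.add(url)
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(f"{response.status_code} {url}")
    if response.status_code in (301, 302, 303, 307, 308):
        # The Location header may be relative, so resolve it
        url = urljoin(url, response.headers["Location"])
    else:
        break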
I would suggest (based on you trying to redirect what appears to be the homepage, whether the request is for .asp or .html) this help page from Apache. It is a bit deep, but you appear to want to do it yourself, and this is the resource I would suggest.
If you look about a third of the way down the page, there is a content box that covers tons of variables.
Best,