HTTP status codes: custom 404 page, but should we also return a 400 Bad Request status in certain circumstances?
-
We currently have a custom 404 page set up for our clients, but the developer has it returning an HTTP 200 status code. A big no-no; I'm having that fixed right now.
My question is that, currently, the custom 404 page is only returned for URLs with the .aspx extension:
- For example: ilovepizza.com/pepperni.aspx returns the 404 page because the correct page is ilovepizza.com/pepperoni.aspx
Any other URL format without the extension (for example, ilovepizza.com/thumbtack) does not trigger the custom 404 page we've created; instead it triggers a generic server error page with a 404 HTTP status. I want to change this so that this type of error also triggers the custom 404 page, because it's more user-friendly and would lead visitors back into the website.
My question: is there any benefit to making the /thumbtack errors return the custom 404 page but with a 400 Bad Request HTTP status?
I'm something of a novice in these aspects, but does the 400 Bad Request status indicate that it was a user mistake rather than a mistake created on the website?
Other suggestions?
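The pattern under discussion — serving a friendly custom error page while still sending the correct 404 status code — can be sketched with Python's standard library. This is only an illustrative sketch (the page names are made up, and the site in question actually runs on ASP.NET/IIS, where the developer would do the equivalent):

```python
# Minimal sketch: serve a friendly custom 404 page WITH a 404 status code,
# instead of the 200 the thread describes. Page names are hypothetical.
from http.server import BaseHTTPRequestHandler

CUSTOM_404 = "<html><body>Sorry, we couldn't find that page.</body></html>"
KNOWN_PAGES = {"/pepperoni.aspx": "<html><body>Pepperoni!</body></html>"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in KNOWN_PAGES:
            self.send_response(200)   # real page: 200 OK
            body = KNOWN_PAGES[self.path]
        else:
            self.send_response(404)   # friendly page, but the honest status code
            body = CUSTOM_404
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())
```

The point of the sketch: the user sees the helpful page either way, but crawlers see 404 for /pepperni.aspx rather than 200, so the mistyped URL never gets indexed as a real page.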
-
Unfortunately I don't have experience with that.
-
Hi Scott,
Thanks for the reply. The developers will take care of that part. Have you ever heard of anyone using 400 instead of 404 when it's more of a user-generated error?
-EEE3
-
Have you tried adding this 404 page to IIS -> Website properties -> Custom Errors tab, then under 404, point to the web page that you would like to show up when a file is not found?
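For reference, on IIS 7+ the same thing can be done in web.config with the <httpErrors> section (which is what the Custom Errors tab writes). A minimal sketch — the error-page path here is an assumption, and with ExecuteURL the error page itself must not reset the status back to 200:

```xml
<configuration>
  <system.webServer>
    <!-- Serve a friendly page for 404s while keeping the 404 status code -->
    <httpErrors errorMode="Custom" existingResponse="Replace">
      <remove statusCode="404" />
      <error statusCode="404" path="/errors/404.aspx" responseMode="ExecuteURL" />
    </httpErrors>
  </system.webServer>
</configuration>
```

Because this is configured at the server level rather than per-extension, it would also catch extensionless URLs like /thumbtack, which is the behavior the original poster is after.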
Related Questions
-
404 vs 410 Across Search Engines
We are removing a large number of URLs permanently. We care about rankings in search engines other than Google, such as Yahoo/Bing, whose documentation doesn't even list an HTTP 410 status code option: https://docs.microsoft.com/en-us/bingmaps/spatial-data-services/status-codes-and-error-handling Does anyone know how search engines other than Google handle 410 vs. 404 statuses? For pages being permanently removed, John Mueller at Google has stated: "From our point of view, in the mid term/long term, a 404 is the same as a 410 for us. So in both of these cases, we drop those URLs from our index. We generally reduce crawling a little bit of those URLs so that we don’t spend too much time crawling things that we know don’t exist. The subtle difference here is that a 410 will sometimes fall out a little bit faster than a 404. But usually, we’re talking on the order of a couple days or so. So if you’re just removing content naturally, then that’s perfectly fine to use either one." Any information or thoughts? Thanks
Intermediate & Advanced SEO | | sb10300 -
All URLs seem to exist (no 404 errors), but they don't.
Hello, I am doing an SEO audit for a website which only has a few pages. I have no cPanel credentials, no FTP, no WordPress admin account; I'm just looking at it from the outside. The site works, the Moz crawler didn't report any problems, and I can reach every page from the menu. The problem is that, except for the few actual pages, no matter what you type after the domain name, you always reach the home page and never get a 404 error. E.g. Http://domain.com/oiuxyxyzbpoyob/ (there is no such page, but I don't get a 404 error; the home page is displayed and the URL in the browser remains Http://domain.com/oiubpoyob/, so it's not a 301 redirect). Http://domain.com/WhatEverYouType/ (same) Could this be an important SEO issue (i.e., resulting in an infinite number of duplicate content pages)? Do you think I should ask the owner to prevent this from happening? Should I look into the .htaccess file to fix it? Thank you, Mozzers!
Intermediate & Advanced SEO | | DoMiSoL0 -
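If it turns out the server is unconditionally rewriting every request to the home page, the usual Apache-side fix is to make the catch-all conditional. A hedged .htaccess sketch, assuming the site's few real pages are actual files or directories (which may not hold for a CMS with a front controller):

```apache
RewriteEngine On
# Requests for paths that are NOT a real file or directory should
# produce a genuine 404 instead of silently serving the home page.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ - [R=404,L]
```

With mod_rewrite, a status code in the 400-599 range on the R flag ends rewriting and returns that status, so /oiuxyxyzbpoyob/ would get a real 404 rather than a soft 404.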
Can I add external links to my sitemap?
Hi, I'm integrating with a service that adds third-party images/videos (owned by them, hosted on their server) to my site. For instance, the service might have tons of pictures/videos of cars, and when I integrate, I can show my users these pictures/videos of cars I might be selling. But I'm wondering how to build out the sitemap: I would like to include references to these images/videos so Google knows I'm using lots of multimedia. What's the most white-hat way to do that? Can I add external links in my sitemap pointing to these images/videos hosted on a different server, or is that frowned upon? Thanks in advance.
Intermediate & Advanced SEO | | SEOdub0 -
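As far as I know, Google's image sitemap extension does allow the image files themselves to live on another domain (e.g., a CDN or partner server), as long as they aren't blocked by robots.txt. A sketch with entirely hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page URL must be on your own site... -->
    <loc>https://www.example.com/cars/blue-sedan</loc>
    <!-- ...but the image it references may be hosted by the partner service -->
    <image:image>
      <image:loc>https://cdn.partner-service.example/images/blue-sedan.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

The <loc> page entries still have to be on the submitting domain; it's only the nested image/video locations that can point off-site.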
HTTP to HTTPS Question
Hello, I have a question regarding SSL certificates that I think I know the answer to, but I wanted to make sure. One of our clients' sites uses http for its pages, but when they started creating registration forms they created a full duplicate site on https (so now there are two versions of all of the pages). I know that, given duplicate content concerns, this could be an issue and needs to be resolved (along with the pros and cons of both), but since they are already set up with https, does it make sense to just move everything there, or in some instances would it pay to keep some pages on http (using canonical tags, redirects, .htaccess, etc.)? Most of the information I found related to making the decision before having both, or described the migration process, but I couldn't find anything that specifically addressed the case where both are already present. I thought the best approach, since everything's already set up, would be to just move everything over to the more secure one, but I was curious whether anybody had any insight? Thank you in advance.
Intermediate & Advanced SEO | | Ben-R0 -
HTTP to HTTPS conversion - what were your experiences?
Background: Our devs have been talking about changing some of our websites so that all pages are https, vs. just those that are part of our logins and shopping carts. From what I have read, the things that need to be done as part of this are:
- make sure that https pages will allow caching
- set up the new site in GWT
- put 301 redirects in place
- update all internal links, social profiles, etc. everywhere you can to https URLs
- confirm the server can handle the extra load so there's no impact on site speed
There is an old Matt Cutts video that says, essentially, "works for PayPal" and "you can try it, but test it on a smaller site first": http://www.youtube.com/watch?v=xeFo4ytOk8M The comments below the video are all stories about loss of rank and traffic, with some coming back. Not sure if these folks did the move correctly, but still, you never know. Question: Have any of you done a technically "correct" move of an entire site from http to https using the suggestions above? What was your experience? Any gotchas? Just to be clear, I am not talking about setting up a site from scratch; I want to know the impact on an established site. Thx
Intermediate & Advanced SEO | | CleverPhD1 -
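The 301-redirect step in a checklist like the one above is usually a single site-wide rule on Apache. A minimal sketch (server setup varies; this assumes mod_rewrite and that the certificate is already in place):

```apache
RewriteEngine On
# If the request did not arrive over TLS, 301 it to the https equivalent
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Using %{HTTP_HOST} and %{REQUEST_URI} preserves the hostname, path, and query string, so every http URL maps one-to-one to its https twin.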
Killing 404 errors on our site in Google's index
Having moved a site across to Magento, redirects were obviously a large part of that, ensuring all the old products and categories linked up correctly with the new site structure. However, we came up against an issue where we needed to add, delete, then re-add products. This, coupled with a misunderstanding of the CSV upload processing, meant that although the old URLs redirected, some of the new Magento URLs changed and then didn't redirect. For example: mysite/product would get deleted, re-added, and become mysite/product-1324. We now know what we did wrong, so it won't continue to happen if we were to delete and re-add a product, but Google contains all these old URLs in its index, which has caused people to search for products on Google, click through, then land on the 404 page - far from ideal. We kind of assumed that, with continual updating of sitemaps and time, Google would realise and update the URLs accordingly. But this hasn't happened - we are still getting plenty of 404 errors on certain product searches. (These aren't appearing in SEOmoz; there are no links to the old URLs on the site, only in Google, as the index contains the old URLs.) Aside from going through and finding the products affected (no easy task) and setting up redirects for each one, is there any way we can tell Google "These URLs are no longer a thing, forget them and move on, let's make a fresh start and Happy New Year"?
Intermediate & Advanced SEO | | seanmccauley0 -
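For the per-product redirects, once the affected pairs are known, the Apache side is straightforward (on IIS a rewrite map plays the same role). A sketch with hypothetical paths matching the pattern described above:

```apache
# Map each orphaned indexed URL to its re-created, suffixed Magento URL
Redirect 301 /product /product-1324
Redirect 301 /widget /widget-9876
```

Finding the pairs is the hard part; the rules themselves are one line per URL and can be generated from a spreadsheet of old/new paths.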
Redirect 404 pages to homepage
Hello, I'm putting a new website on an existing domain. In order not to lose the links that point to the various old URLs, I would like to redirect them to the homepage. The old website was a mess: there was no SEO and the pages didn't target any keywords. That's why I would like to redirect all links to home. What do you think is the best way to do this? I tried to add this in the .htaccess but it's not working: ErrorDocument 404 /index.php Can you tell me how exactly it should look? Right now the whole file is like this:
# @package Joomla
# @copyright Copyright (C) 2005 - 2012 Open Source Matters. All rights reserved.
# @license GNU General Public License version 2 or later; see LICENSE.txt
# READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE!
# The line just below this section: 'Options +FollowSymLinks' may cause problems with some server configurations. It is required for use of mod_rewrite, but may already be set by your server administrator in a way that disallows changing it in your .htaccess file. If using it causes your server to error out, comment it out (add # to beginning of line), reload your site in your browser and test your sef url's. If they work, it has been set by your server administrator and you do not need it set here.
# Can be commented out if causes errors, see notes above.
Options +FollowSymLinks
# Mod_rewrite in use.
RewriteEngine On
# Begin - Rewrite rules to block out some common exploits. If you experience problems on your site block out the operations listed below. This attempts to block the most common type of exploit attempts to Joomla!
# Block out any script trying to base64_encode data within the URL.
RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
# Block out any script that includes a
Intermediate & Advanced SEO | | igrizo0 -
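Worth noting: ErrorDocument 404 /index.php serves the home page content while still sending a 404 status, so it doesn't pass link equity the way the poster wants; an explicit 301 does. A hedged mod_rewrite sketch, assuming the new site's real pages resolve to actual files or directories (a Joomla front controller would need the conditions adjusted):

```apache
RewriteEngine On
# Send any request for a path that no longer exists to the home page with a 301
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ / [R=301,L]
```

That said, blanket-redirecting all dead URLs to the home page is often treated as a soft 404 by search engines, so redirecting only the old URLs that actually have links, and letting the rest 404, may serve the goal better.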
Duplicate Content http://www.website.com and http://website.com
I'm getting duplicate content warnings for my site because the same pages are getting crawled twice: once with http://www.website.com and once with http://website.com. I'm assuming this is a .htaccess problem, so I'll post what mine looks like. I think installing WordPress in the root domain changed some of the settings I had before. My main site is primarily in HTML with a blog at http://www.website.com/blog/post-name
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
Intermediate & Advanced SEO | | thirdseo0 -
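The standard WordPress block above doesn't canonicalize the hostname; that takes a separate rule, usually placed before the WordPress section. A sketch using the poster's placeholder domain (swap in the real one, and the non-www direction if preferred):

```apache
# Force the www host so every page has exactly one canonical URL
RewriteEngine On
RewriteCond %{HTTP_HOST} ^website\.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]
```

Setting the preferred domain consistently (in .htaccess, in WordPress's Site URL setting, and via canonical tags) keeps the two hostnames from being crawled as separate sites.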