Htaccess file
-
I need web pages that do not exist to return a 404 error.
The task needs to be done in the .htaccess file.
I am using a Linux server.
The web pages I want to redirect are my domain name followed by a question mark (i.e. URLs with a query string).
I am using the following snippet in my .htaccess file; so far it redirects to bing.com.
Please tell me how to change the snippet so that it redirects to a 404 error page instead.
==========================
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.bing.com? [L,R]
-
Use:
ErrorDocument 404 /404.html
where 404.html is your error document page. When a requested page does not exist, the server returns a 404 status code and serves this page as the body of the error response (use a local path like /404.html rather than a full URL, otherwise Apache issues a redirect and the 404 status is lost).
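To tie this to the query-string case in the question, a minimal sketch (assuming Apache 2.4 with mod_rewrite enabled and a /404.html file in the document root) would answer any request carrying a query string with a 404 status and let ErrorDocument supply the page:
ErrorDocument 404 /404.html
RewriteEngine On
# Any request with a non-empty query string gets a 404 response instead of a redirect
RewriteCond %{QUERY_STRING} .
RewriteRule .* - [R=404,L]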
-
The reason your URL is redirecting to bing.com is that you haven't changed the URL in the snippet you found to the URL you actually want to use as your 404 error page.
You can create a 404 error page and place it on your server. If you have already done this, you need to change the URL (http://www.bing.com) in the snippet below to the URL you want to use (e.g. http://www.[yourdomainname].com/404.html).
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.bing.com? [L,R]
That should look more like this:
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.[yourdomainname].com/404.html [L,R=301]
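One detail worth keeping from the original snippet is the trailing ? after the target URL: it strips the incoming query string from the redirect. If you want the same behaviour with your own error page, the rule would look like this (a sketch; substitute your real domain):
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.[yourdomainname].com/404.html? [L,R=301]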
-
I am not completely sure I get the question -
there are two ways I interpreted it. The easiest would just be to redirect /? to the 404 page,
which would be
redirect 301 /? http://www.404pagehere.com
If you are trying to do a wildcard redirect, that is a little harder:
RedirectMatch 301 /?/(.*)$ http://www.404pagehere.com
I think the easiest might be
RewriteEngine On
RewriteRule ^foldername/* http://www.pagehere.com/ [R=301,L]
Make sure to back up .htaccess before making any changes, as I am not sure what I gave you will fix your specific issue.
Shane
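A caveat on the mod_alias approach above: Redirect and RedirectMatch only match the URL path, never the query string, so redirect 301 /? will not catch "domain name followed by a question mark". For that specific case a mod_rewrite condition is needed; a sketch (assuming Apache 2.4 with mod_rewrite enabled) that returns a 404 only when the home page is requested with a query string:
RewriteEngine On
# Empty path pattern matches the site root in a root-level .htaccess
RewriteCond %{QUERY_STRING} .
RewriteRule ^$ - [R=404,L]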
-
This should help you:
http://webdesignandsuch.com/create-a-htaccess-file-to-redirect-to-a-404-html-page/
Related Questions
-
How to no index / no follow CAD files .dxf .dwg
Hi, I have a new WordPress site with a number of CAD files (.dxf & .dwg) downloadable straight from the site. These have been flagged in MOZ as warnings, with everything from No Title/Description to duplicate content. Does anybody know how I would noindex these types of files? Many thanks.
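For reference, one common way to keep file types like these out of the index is an X-Robots-Tag response header set in .htaccess; a sketch, assuming Apache with mod_headers enabled:
<FilesMatch "\.(dxf|dwg)$">
Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>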
Technical SEO | | Jon_Pearce0 -
.htaccess Question
Hi, I have a website www.contractor-accounts.co.uk that has an .htaccess file that strips .php and forces a trailing slash /. The site is now over 6 months old and still has a very low ranking, with MOZ also rating the site as DA/PA = 1, which seems to indicate some sort of issue with the website. Can anyone offer any suggestions as to why this site is ranking poorly? Much of the on-page SEO has been completed to a level of 90%+ for specific keyterms, so I'm probably looking at either the routing of the framework or some other technical SEO issue. Any help much appreciated...
<IfModule mod_rewrite.c>
<IfModule mod_negotiation.c>
Options -MultiViews
</IfModule>
RewriteEngine On
# Redirect Trailing Slashes...
# RewriteRule ^(.*)/$ /$1 [L,R=301]
RewriteCond %{REQUEST_URI} /+[^.]+$
RewriteRule ^(.+[^/])$ %{REQUEST_URI}/ [R=301,L]
# Redirect non-WWW to WWW...
RewriteCond %{HTTP_HOST} ^contractor-accounts.co.uk [NC]
RewriteRule ^(.*)$ http://www.contractor-accounts.co.uk/$1 [L,R=301]
# Handle Front Controller...
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]
</IfModule>
Technical SEO | | ecrmeuro0 -
Robots file set up
The robots file looks like it has been set up in a very messy way.
I understand the # will comment out a line, does this mean the sitemap would not be picked up?
Disallow: /js/ should this be allowed like /*.js$
Disallow: /media/wysiwyg/ - this seems to be causing alerts in webmaster tools as it can not access the images within.
Can anyone help me clean this up please
#Sitemap: https://examplesite.com/sitemap.xml
# Crawlers Setup
User-agent: *
Crawl-delay: 10
# Allowable Index
# Mind that Allow is not an official standard
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
Allow: /media/catalog/
# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /errors/
Disallow: /includes/
Disallow: /js/
Disallow: /lib/
Disallow: /magento/
Disallow: /media/
Disallow: /media/captcha/
Disallow: /media/catalog/
#Disallow: /media/css/
#Disallow: /media/css_secure/
Disallow: /media/customer/
Disallow: /media/dhl/
Disallow: /media/downloadable/
Disallow: /media/import/
#Disallow: /media/js/
Disallow: /media/pdf/
Disallow: /media/sales/
Disallow: /media/tmp/
Disallow: /media/wysiwyg/
Disallow: /media/xmlconnect/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /scripts/
Disallow: /shell/
#Disallow: /skin/
Disallow: /stats/
Disallow: /var/
# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalog/product/gallery/
Disallow: */catalog/product/upload/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt
Disallow: /get.php # Magento 1.5+
# Paths (no clean URLs)
#Disallow: /*.js$
#Disallow: /*.css$
Disallow: /*.php$
Disallow: /*?SID=
Disallow: /rss*
Disallow: /*PHPSESSID
Disallow: /:
Disallow: /😘
User-agent: Fatbot
Disallow: /
User-agent: TwengaBot-2.0
Disallow: /
Technical SEO | | mcwork0 -
How to add specific Tumblr blogs into a disavow file?
Hi guys, I am about to send a reconsideration request and am still finalizing my disavow file. The format of the disavow file is domain:badlink.com (stripping back to the root domain), but what about toxic links that are located on Tumblr, such as badlink.tumblr.com? The issue is that we also have good Tumblr links, so I don't want to add just tumblr.com. Do you think I will have issues submitting badlink.tumblr.com and not tumblr.com? Thank you!
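For reference, Google's disavow file format can target a single subdomain without touching the rest of tumblr.com; a sketch (the subdomain and URL below are placeholders):
# disavow one specific tumblr subdomain only
domain:badlink.tumblr.com
# or disavow individual URLs instead of the whole subdomain
http://badlink.tumblr.com/spammy-post/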
Technical SEO | | Ideas-Money-Art0 -
My .htaccess has changed, what do I do to avoid it again...?
Hello, today I noticed that our site no longer automatically redirected from non-www to www. When I checked the .htaccess file I noticed a # in front of each line, and I know we did not insert them; after I removed them it worked fine. The only change we made recently was adding a mobile version of the site, but the call to auto-redirect is in a JS file and not in the .htaccess. Could it be the server..? Is there any way that anything else might cause this...? The site is HTML and WP, could it be because of that...? Thanks, Simo
Technical SEO | | Yonnir0 -
Limits to 301 in htaccess?
I'm about to launch a redesign of my company's main website, and we've updated most of the URLs to be more user-friendly and SEO-optimized. I've just finished editing my spreadsheet and see that I'll need to implement 244 redirects. My question is: are there performance issues with loading your .htaccess file up with almost 250 301 redirect commands? I've heard a bloated .htaccess file can really slow down Apache; should I be approaching this a different way, maybe with PHP?
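For reference, when a redirect list grows this large, one common alternative to hundreds of individual Redirect or RewriteRule lines is a RewriteMap lookup; a sketch, assuming access to the server/vhost configuration (RewriteMap itself cannot be declared inside .htaccess, although the rule that uses it can be) and a hypothetical map file at /etc/apache2/redirect-map.txt:
# Contents of /etc/apache2/redirect-map.txt, one "old-path new-path" pair per line:
#   /old-page        /new-friendly-page
#   /another-old     /another-new
# In the vhost configuration:
RewriteMap redirects txt:/etc/apache2/redirect-map.txt
# In the vhost or in .htaccess:
RewriteEngine On
RewriteCond ${redirects:%{REQUEST_URI}|NONE} !=NONE
RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]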
Technical SEO | | AdoptionHelp0 -
.htaccess file in wordpress blog
I want to redirect non-www to www in a blog hosted on WordPress. Where can I find the .htaccess file? Will I have to create a new one? If yes, where should I upload it? Thanks
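For reference, on a self-hosted WordPress install the .htaccess file normally lives in the site's root directory (the same folder as wp-config.php) and can be created there as a plain text file if it does not already exist. A typical non-www to www block looks like this (a sketch, assuming Apache with mod_rewrite; example.com stands in for the real domain):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]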
Technical SEO | | seoug_20050 -
Robots.txt file getting a 500 error - is this a problem?
Hello all! While doing some routine health checks on a few of our client sites, I spotted that a new client of ours - whose website was not designed or built by us - is returning a 500 internal server error when I try to look at the robots.txt file. As we don't host or maintain their site, I would have to go through their head office to get this changed, which isn't a problem, but I just wanted to check whether this error will actually have a negative effect on their site and whether there's a benefit to getting it changed? Thanks in advance!
Technical SEO | | themegroup0