Htaccess file
-
I need to redirect web pages that do not exist to a 404 error page.
The task needs to be done in the .htaccess file.
I am using a Linux server.
The web pages I want to redirect are my domain name followed by a question mark (i.e. URLs with a query string).
I am using the following snippet in my .htaccess file; so far it redirects to bing.com.
Please tell me how to change the snippet so that it redirects to a 404 error page instead.
==========================
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.bing.com? [L,R]
-
Use:
ErrorDocument 404 /404.html
where 404.html is your custom error page. When a request is made for a page that does not exist, the server returns a 404 status code and automatically serves this page.
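For the specific case in the question (any URL with a non-empty query string), mod_rewrite can return the 404 response directly instead of redirecting. A minimal sketch, assuming a reasonably recent Apache where the R flag accepts non-redirect status codes, and that an ErrorDocument 404 directive points at your error page:

```apache
RewriteEngine On
# Match any request that carries a non-empty query string
RewriteCond %{QUERY_STRING} .
# "-" means no substitution; R=404 stops processing and returns a 404 response,
# which Apache then renders using the ErrorDocument 404 page
RewriteRule .* - [R=404,L]
```

Unlike a redirect to /404.html, this keeps the requested URL in the address bar and sends a real 404 status, so search engines drop the bad URLs rather than indexing the error page.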
-
The reason your URLs are redirecting to bing.com is that you haven't changed the target URL in the snippet you found to the URL of your 404 error page.
Create a 404 error page and place it on your server. If you have already done this, change the URL in the snippet below (http://www.bing.com) to the URL you want to use (e.g. http://www.[yourdomainname].com/404.html).
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.bing.com? [L,R]
That should look more like this:
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.[yourdomainname].com/404.html [L,R=301]
Note that this sends a 301 redirect, so visitors end up on /404.html with a 200 status rather than a true 404 response; if you need search engines to see a real 404, serve the error page via ErrorDocument instead of redirecting to it.
-
I am not completely sure I understand the question.
I interpreted it two ways. The easiest would be to simply redirect /? to the 404 page, which would be:
redirect 301 /? http://www.404pagehere.com
If you are trying to do a wildcard redirect, that is a little harder:
RedirectMatch 301 ^/(.*)$ http://www.404pagehere.com
I think the easiest might be:
RewriteEngine On
RewriteRule ^foldername/* http://www.pagehere.com/ [R=301,L]
Make sure to back up .htaccess before making any changes, as I am not sure what I gave you will fix your specific issue.
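One subtlety in the last rule: in a regex, `^foldername/*` matches "foldername" followed by zero or more slashes, not everything inside the folder. A sketch that catches the folder itself and everything under it (foldername and pagehere.com are placeholders carried over from the example above):

```apache
RewriteEngine On
# (/.*)? optionally matches a slash plus anything after it,
# so /foldername, /foldername/, and /foldername/page.html all redirect
RewriteRule ^foldername(/.*)?$ http://www.pagehere.com/ [R=301,L]
```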
Shane
-
This should help you:
http://webdesignandsuch.com/create-a-htaccess-file-to-redirect-to-a-404-html-page/