Remove a directory using htaccess
-
Hi,
Can someone tell me if there's a way, using .htaccess, to say that everything in a particular directory (let's call it "A") is gone (HTTP 410 code), i.e. that all of its links should be de-indexed?
Right now I'm using the robots.txt file to deny access. I'm not sure that's the right thing to do, since Google Webmaster Tools still shows the links as indexed, with a 403 error code.
Thanks.
-
Hello webtarget,
First of all, a word about the .htaccess file: it is a configuration file that controls the Apache web server. It is very useful and lets you do a lot of things.
The link below should solve your whole problem regarding the .htaccess file. Please check and review it:
http://www.catswhocode.com/blog/10-awesome-htaccess-hacks-for-wordpress
That's great thanks.
-
Hi
This should solve your problem
<code>RewriteEngine on
RewriteRule ^folder/ - [R=410,L]</code>
Replace "folder" with the name of your folder. Note that when returning a 410 status, Apache ignores any redirect target, so the substitution must be "-". To show visitors a page informing them that the requested resource is gone, add
<code>ErrorDocument 410 /gone.php</code>
and point it at that page.
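As a side note, the same effect can be had with mod_alias instead of mod_rewrite, a minimal sketch assuming these directives are allowed in your .htaccess and that `folder` and `/gone.php` are placeholders for your own paths:

```apache
# Return "410 Gone" for every URL under /folder/
RedirectMatch gone ^/folder/

# Serve a custom page along with the 410 response
ErrorDocument 410 /gone.php
```

`RedirectMatch gone` takes no target URL, so nothing else is needed; the `ErrorDocument` line controls what visitors actually see.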
Kind regards
Bojan
Related Questions
-
Using one robots.txt for two websites
I have two websites that are hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has created one which lists the sitemaps for both websites, like this:
<code>User-agent: *
Disallow:

Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap</code>
Is this ok? I thought you needed one robots.txt per website which provides the URL for the sitemap. Will having both sitemap URLs listed in one robots.txt confuse the search engines?
Technical SEO | ciehmoz
-
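For what it's worth, Python's standard-library `urllib.robotparser` reads both `Sitemap` lines from a single file without complaint, which at least shows parsers tolerate this layout. A sketch using the example from the question (requires Python 3.8+ for `site_maps()`):

```python
from urllib import robotparser

# One robots.txt listing sitemaps for two different domains.
robots_txt = """User-agent: *
Disallow:

Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Both sitemap URLs are picked up...
print(parser.site_maps())
# ...and the empty Disallow leaves everything crawlable.
print(parser.can_fetch("*", "https://www.siteA.org/any-page"))
```

So the file is at least syntactically fine; whether Google associates each sitemap with the right host is a separate question.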
Using / at the end of anchor text link
Hello. I am building backlinks to my home page. My question is: should I use a trailing "/" (without quotes) after my domain name or not? An example of both cases: in one the anchor points to http://abc.com (no trailing slash), in the other to http://abc.com/ (with trailing slash). Which is correct?
Technical SEO | tanveerayakhan
-
Is it ok to use H1 tags in breadcrumbs?
A client has an e-commerce site and she doesn't want a page title on the products page. She has breadcrumbs, though, and her website developer suggests putting the H1 on the breadcrumbs. So: Products > Gifts > Picture Frame, with H1 tags around the words "picture frame". Is this ok to do, or is it a bad thing for SEO purposes? Thanks
Technical SEO | AL123al
-
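For reference, the markup the developer is proposing would look roughly like this (a sketch; the class name and paths are hypothetical):

```html
<nav class="breadcrumbs">
  <a href="/products">Products</a> &gt;
  <a href="/products/gifts">Gifts</a> &gt;
  <h1>Picture Frame</h1>
</nav>
```

An `<h1>` inside `<nav>` is technically valid HTML, but it mixes navigation with the page's main heading, which is part of why the practice is debatable.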
Effective use of hReview
Hi fellow Mozzers! I am just in the process of adding various reviews to our site (a design agency), but I wanted to use the ratings in different ways depending on the page. So for the home page and the services (branding, POS, direct mail etc) I wanted to aggregate relevant reviews (giving us an average of all reviews for the home page, an average of ratings from all brand projects and so on). Then, I wanted to put specific reviews on our portfolio pages, so the review relates specifically to that project. This is the easiest to do as the hReview generator is geared up for reviews that come from one source, but I can't find a way of aggregating the star ratings to make an average rating rich snippet. Anyone know where I can get the coding for this? Thanks in advance! Nick.
Technical SEO | themegroup
-
Extra Sub Directory
Anything wrong with a URL structure like www.mysite.com/process/widgets/red-widgets, where the directory /process/ is completely empty, i.e. you get a 404 if you go to www.mysite.com/process/ because it has no content? This URL structure was set up before they knew what SEO was... wondering if it's worth the pain to 301 and restructure to new URLs, or is it ok to leave as is?
Technical SEO | SoulSurfer8
-
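If the restructure route were chosen, the redirect itself is short in .htaccess; a sketch assuming the empty `/process/` segment is simply dropped from every URL:

```apache
RewriteEngine on
# 301 any /process/... URL to the same path without the empty segment,
# e.g. /process/widgets/red-widgets -> /widgets/red-widgets
RewriteRule ^process/(.+)$ /$1 [R=301,L]
```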
Removing Out of Stock Items from an E-Commerce website
I have a dilemma. We have over 500 out of stock items that are still listed on our ecommerce website. I'm thinking it would be a good idea to leave them up because they are all considered content by google, and the keywords might drive traffic. On the other hand, the customers might be disappointed if the items are out of stock (we don't restock our sold out items), and many times, they will not lead to a conversion if the customer is looking for something very specific. Considering all these factors (and some unmentioned ones), my main question is: If I remove content, does that make all of the other content on our website stronger by having more pagerank and link juice flow to them, or do I hurt our rankings?
Technical SEO | 13375auc3
-
How to handle URLs from removed products?
Hi All, I have a question about a fashion-related webshop. Every month about 100 articles are removed, and about the same amount is added to the site. Most of the products are indexed on brand name and type (e.g. MyBrand t-shirt blue). My question is what to do with the URL / page after the product is removed. I'm thinking about a couple of solutions: 301 the page to the brand category page; build a script which shows related articles on the old URL (and try to keep it indexed); or a 404 page optimized for the search term with links to the brand category. Any other suggestions? Thanks in advance, Sam
Technical SEO | U-Digital
-
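The first option (a 301 to the brand category) can be sketched per product in .htaccess; the paths here are hypothetical:

```apache
# Permanently redirect a removed product URL to its brand category page
Redirect 301 /products/mybrand-tshirt-blue /brands/mybrand
```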
How to handle sitemap with pages using query strings?
Hi, I'm working to optimize a site that currently has about 5K pages listed in the sitemap. There are not in fact this many pages. Part of the problem is that one of the pages is a tool where each sort and filter button produces a query-string URL. It seems to me inefficient to have so many items listed that are all really the same page, not to mention wanting to avoid any duplicate-content or low-quality issues. How have you found it best to handle this? Should I just noindex each of the links? Canonical links? Should I manually remove the pages from the sitemap? Should I continue as is? Thanks a ton for any input you have!
Technical SEO | 5225Marketing
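On the canonical-link option: a tag in the tool page's `<head>` points every sort/filter variant back at one URL (a sketch; the path and domain are hypothetical):

```html
<!-- Emitted on /tools/finder regardless of the query string -->
<link rel="canonical" href="https://www.example.com/tools/finder" />
```

That keeps the variants out of the index without having to enumerate or remove them in the sitemap.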