How to Disallow Specific Folders and Subfolders from Crawling?
-
Today I checked the indexing of my website in Google and found a surprising result; you can see it for yourself in Google's results for my site.
I am aware that a robots.txt file can disallow the images folder to solve this issue.
But that may also block my images from appearing in Google Image Search.
So how can I fix this?
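That worry can be checked mechanically. A minimal sketch using Python's standard-library robots.txt parser (the rules and file names here are hypothetical) shows that a blanket Disallow on /images/ blocks the image crawler along with everything else:

```python
# Minimal sketch (hypothetical rules and paths): Python's standard-library
# robots.txt parser shows that a blanket Disallow on /images/ blocks the
# image crawler along with everything else.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /images/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The open folder listing is blocked, which is what we want...
print(parser.can_fetch("Googlebot", "/images/"))                 # False
# ...but the image files are blocked too, so they drop out of Image Search.
print(parser.can_fetch("Googlebot-Image", "/images/photo.jpg"))  # False
```

This is why blocking the directory listing at the server level (rather than in robots.txt) is the safer route for this particular goal.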
-
You can, but then the content will be removed from Google's index for 90 days. I am not sure what effect this would have on pages with the images. It shouldn't have any effect, but I would hate for you to have rankings in any way affected for 90 days.
I have no experience in having images indexed in this manner. Perhaps someone else has more knowledge to share on this topic.
-
Can I use the Remove URLs tool in Google Webmaster Tools?
-
I checked your URL: http://www.lampslightingandmore.com/images/. The folder is now properly restricted and the images can no longer be seen using this method. Going forward, Google will not be able to index new images in the same manner your other images were indexed.
With respect to the images which have been indexed, I am not certain how Google will respond. The image links are still valid so they may keep them. On the other hand, the links are gone so they may remove them. If it were my site, I would wait 30 days to see if Google removed the results.
Another way you can resolve the issue is to change the file path to your images from /images to /image. This will immediately break all the links. You would need to ensure all the links on your site are updated properly. It still may take Google a month to de-index those results but it would certainly happen in that case.
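If you prefer not to break existing links while renaming, a hedged .htaccess sketch (assuming an Apache server with mod_rewrite available, and the /image folder name suggested above) could 301-redirect the old paths instead:

```apache
# In the .htaccess at the site root: permanently redirect /images/* to
# /image/*, so existing links and Google's indexed URLs resolve rather
# than breaking. Requires mod_rewrite.
RewriteEngine On
RewriteRule ^images/(.*)$ /image/$1 [R=301,L]
```

Note this keeps the image files reachable (preserving Image Search), which is the opposite trade-off from deliberately breaking the links to force de-indexing.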
-
I have added Options -Indexes for the images folder in my htaccess file.
But I am still able to find the images folder in Google's index.
How can I check whether it is working properly? I don't want the images folder indexed or displayed in web search any more.
-
I am going to add the following code to my htaccess file.
Options -Indexes
Will it work for me or not?
-
If you have a development team, they should instantly understand the problem.
A simple e-mail to any developer
E-mail title: Please fix
http://www.lampslightingandmore.com/images/
That's it. No other text should be needed. A developer should be able to look at the page and understand the index was left open and how to fix it. If you wish to be nicer then a simple "my index is open for the world to see, please don't allow public access to my server folders" should suffice.
-
Yes, I have a similar problem with my code structure. Yesterday I set relative paths for all URLs, but I am not sure about updating the image names in the code after changing the folder.
So I don't want to go that route. I also discussed it with my development team, and they recommended going with the htaccess method.
But they cautioned me to follow the method carefully, otherwise it may create a big crawling or indexing issue. Right?
-
The link you shared is perfect. Near the top there is a link for OPTIONS. Click on it and you will be on this page: http://httpd.apache.org/docs/1.3/mod/core.html#options
I want to very clearly state you should not make changes to your .htaccess file unless you are comfortable working with code. The slightest mistake and your entire site becomes unavailable. You can also damage the security of your site.
With that said, if you decide to proceed anyway you can add the text I shared to the top of your .htaccess file. You definitely should BACK UP the file before making any changes.
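A hedged sketch of that backup-then-edit step. A real site would point at the live document root (for example /var/www/html/.htaccess, which is an assumption about your server); a throwaway copy is used here so the steps can be run safely:

```python
# Hedged sketch of "back up .htaccess, then prepend Options -Indexes".
# A real site would use the live document root; a temporary stand-in
# folder is used here so the steps are safe to run.
import shutil
import tempfile
from pathlib import Path

docroot = Path(tempfile.mkdtemp())            # stand-in for the document root
htaccess = docroot / ".htaccess"
htaccess.write_text("RewriteEngine On\n")     # pretend these are existing rules

backup = htaccess.with_suffix(".bak")         # -> .htaccess.bak
shutil.copy(htaccess, backup)                 # ALWAYS back up first

# Prepend the directive at the top, keeping the existing rules intact.
htaccess.write_text("Options -Indexes\n" + backup.read_text())

print(htaccess.read_text().splitlines()[0])   # Options -Indexes
```

If anything goes wrong after the edit, restoring is just copying the .bak file back over .htaccess.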
The suggestion vishalkialani made was to rename your /images folder to something else, perhaps /image. The problem is that if your site was not dynamically coded, you would break your image links.
-
In addition to what Ryan mentioned, I would rename that folder on your server. That will make Google's index outdated, and those visitors won't reach your server.
-
I don't follow you; could you explain?
-
Also, you can rename it so that when Google's index shows the old results, those URLs won't get any hits.
If that's what you want.
-
Yes, I read that article to learn more about it:
http://httpd.apache.org/docs/1.3/howto/htaccess.html
But I was not able to find my solution there. Can you suggest a specific article that would help me further in the same direction?
-
Hello.
You have left your site open in a manner which is not recommended. Please take a look at the following URL: http://www.lampslightingandmore.com/images/. On a properly secured server, you should receive a 404 Page Not Found or Access Denied type of error. Since the folder is left open, a Google crawler found it and you are seeing the results.
The means to secure your site varies based on your software configuration. If you are on an Apache web server (the most common setup) then these settings are controlled by your htaccess file. I am not an htaccess expert but I believe adding the following code to your .htaccess file at the top will fix the issue:
Options -Indexes
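What an "open index" looks like can be reproduced locally. The sketch below uses Python's built-in web server, which, like an Apache server without Options -Indexes, auto-generates a listing page for any folder lacking an index file (the folder and file names are made up for the demonstration):

```python
# Local illustration of an "open directory index": Python's built-in web
# server, like Apache without "Options -Indexes", generates a listing page
# for any folder that has no index file.
import http.server
import os
import socketserver
import tempfile
import threading
import urllib.request
from functools import partial

# A throwaway folder containing one "image".
root = tempfile.mkdtemp()
open(os.path.join(root, "lamp-photo.jpg"), "w").close()

handler = partial(http.server.SimpleHTTPRequestHandler, directory=root)
with socketserver.TCPServer(("127.0.0.1", 0), handler) as httpd:
    port = httpd.server_address[1]
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    listing = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read().decode()
    httpd.shutdown()

# The generated page exposes every file name -- exactly what Google crawled.
print("lamp-photo.jpg" in listing)  # True
```

With directory listings disabled, the same request would return an error page instead of this file inventory.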