Needs clarification: How does "Disallow: /" work?
-
Hi all,
I need clarification on this. I have noticed we have given "Disallow: /" in one of our sub-directory beside homepage. So, how it going to work now? Will this "Disallow: /" at sub-directory level going to disallow only that directory or entire website?
If it is going to work for entire website; we have already given one more Disallow: / at homepage level blocking few folders. How it is going to handle with two Disallow: / commands?
Thanks
-
Hi vtmoz,
You've received some great responses! Did any of them help answer your question? If so, please mark one or more as a "good answer." And if not, please let us know how we can help. Thanks!
Christy
-
If you have concerns, I strongly recommend using Google Search Console to test URL use cases against your existing robots.txt file before making any potential edits.
-
A directive that is literally "Disallow: /" will prevent crawling of all pages on your site, since technically all page paths begin with a slash. Robots.txt files can only live at the root folder of a site (not in a subdirectory), so if you want to disallow a single folder, you'll need to specify that with a directive like "Disallow: /folder-name/".
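To make the difference concrete, here's a quick sketch using Python's standard-library `urllib.robotparser` to model how a crawler that honors robots.txt would interpret each directive (the domain and folder name are placeholders):

```python
from urllib import robotparser

# Case 1: a bare "Disallow: /" blocks every URL, since all paths start with "/".
block_all = robotparser.RobotFileParser()
block_all.parse([
    "User-agent: *",
    "Disallow: /",
])
print(block_all.can_fetch("*", "https://example.com/"))           # False
print(block_all.can_fetch("*", "https://example.com/blog/post"))  # False

# Case 2: "Disallow: /folder-name/" blocks only that folder.
block_folder = robotparser.RobotFileParser()
block_folder.parse([
    "User-agent: *",
    "Disallow: /folder-name/",
])
print(block_folder.can_fetch("*", "https://example.com/"))                  # True
print(block_folder.can_fetch("*", "https://example.com/folder-name/page"))  # False
```

Note that this only models directive matching; a robots.txt file placed in a subdirectory is simply ignored, because crawlers only fetch /robots.txt from the root.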
Related Questions
-
Do we need to interlink the homepage the most?
Hi, our homepage is not even among the top 5 most internally linked pages on our site. I noticed a ranking drop once other pages overtook the homepage in internal linking. The homepage is the highest-priority page, the one we expect to rank for our primary keyword. So does linking to the homepage less than to other pages confuse Google and cause those pages to outrank the homepage? Thanks
Web Design | | vtmoz0 -
My news site not showing in "In the news" list on Google Web Search
I have a news website (www.tapscape.com) that is 6 years old and has been on Google News since 2012. However, whenever I publish a news article, it never shows up in the "In the news" list on Google Web Search. I have already added schema.org/NewsArticle markup to the website and verified it with the Google structured data testing tool; everything shows up correctly there. The site already has a news sitemap (http://www.tapscape.com/news-sitemap.xml), which has been added to Google Webmaster Tools. News articles show perfectly fine in the News tab, but why aren't the articles being shown in the "In the news" list on Google web search? The site already has a strong backlink profile, so I don't think I need to work on backlinks. Please let me know what I'm doing wrong, and how I can get the news articles into the "In the news" list. I have attached a screenshot to this question to help explain what I mean.
Web Design | | hakhan2010 -
Is it still necessary to have a "home" page button/link in the top nav?
Or is it not necessary to have a "home" tab/link because everybody by this time knows you can get to the home page by clicking on the logo?
Web Design | | FindLaw0 -
Need An Honest Opinion Of My Design
Just looking to get an honest opinion on my website design for my scuba diving client. Trying to decrease bounce rate and have seen some results from tweaking design. Honest opinions appreciated. Recommendations appreciated even more 😉
Web Design | | InfinityTechnologySolutions0 -
Links not visible in "Google cache text version" but visible in "Fetch as Google" in Webmaster tool
Hi Guys, there seems to be an issue with the coding due to which Google is not indexing half of our menu-bar links. The cached text version of http://www.99acres.com/ is not showing the links present in the "All India" dropdown, the "Advice" dropdown, and the "Hot Projects" tab in the blue bar of the top menu, whereas these links are visible in "Fetch as Google" in Google Webmaster Tools. Any clue as to why there is a difference between the links shown in Google Webmaster Tools and the Google cache text version? Thanks in advance 🙂
Web Design | | vivekrathore0 -
WordPress not man enough...has anybody got experience working with Pyro CMS?
Hey folks, I'm working with a small team on putting together a new niche accommodation / holiday search portal here in the UK. We are most likely using PHP / MySQL technology for the site. I am a huge fan of WordPress but not sure it's quite man enough for the task (a multi-option search over 10,000-plus properties). We can't afford to pay for a bespoke development, so an off-the-shelf CMS is the most likely route for release 1, and from what I've been reading Pyro CMS seems a good open-source choice: https://www.pyrocms.com/ Has anybody come across this, or know how good it is with regards to on-site SEO? Or maybe WordPress is up to the task? If not, what are other good open-source options for sites focused around a search function? Cheers Simon
Web Design | | SCL-SEO0 -
XML Sitemap that updates daily/weekly?
Hi, I have a sitemap on my site that updates, but it isn't an XML sitemap. See here: http://www.designerboutique-online.com/sitemap/ I have used some free software to crawl the site and create a sitemap of pages; however, the sitemap would be out of date as soon as I listed new products on the site, so I would need to rerun it. Does anyone know how I can get this to refresh daily or weekly? Or any software that can do it? I have a web firm that is willing to do one, but our relationship is at an all-time low and I don't want to hand over £200 for them to do one. Anyone with any ideas or advice? Thanks Will
Web Design | | WillBlackburn0 -
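On the sitemap-refresh question above, one common approach is to generate the XML sitemap yourself with a small script and schedule it. A minimal sketch in Python, assuming you can pull a list of page URLs from your product database (the URLs and filename here are placeholders):

```python
import datetime
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a sitemaps.org-compliant XML document from a list of page URLs."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    lastmod = datetime.date.today().isoformat()
    for loc in urls:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = loc
        ET.SubElement(url, f"{{{SITEMAP_NS}}}lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

if __name__ == "__main__":
    # In practice, pull this list from your product database or CMS.
    pages = [
        "https://www.example.com/",
        "https://www.example.com/products/some-product",
    ]
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(pages))
```

A crontab entry such as `0 3 * * * python /path/to/generate_sitemap.py` would regenerate the file nightly, so newly listed products are picked up without manual reruns.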
Should /dev folder be blocked?
I have been experiencing a ranking drop every two months, so I came upon a new theory this morning... Does Google do a deep crawl of your site, say, every 60-90 days, and would it penalize a site if it crawled into your /dev area, which would contain pretty much the exact same URLs and content as your production environment, and therefore penalize you for duplicate content? The only issue I see with this theory is that I have been penalized only for specific keywords on specific pages, not across the board. Thoughts? What would be the best way to block out your /dev area?
Web Design | | BoulderJoe0
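On the last question above: blocking a /dev area with robots.txt is straightforward, but keep in mind that robots.txt only stops crawling; URLs linked from elsewhere can still end up indexed, so password-protecting the dev environment is more reliable. A quick sketch with Python's `urllib.robotparser` showing the crawl-blocking part (domain and paths are placeholders):

```python
from urllib import robotparser

# Simulate a robots.txt that blocks only the /dev/ folder.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /dev/",
])

print(rp.can_fetch("*", "https://example.com/dev/index.html"))  # False
print(rp.can_fetch("*", "https://example.com/products/"))       # True
```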