    The Moz Q&A Forum




    Moz Q&A is closed.

    After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.

    Robots.txt: How to block a specific file type in several subdirectories?

    Technical SEO
    • LabeliumUSA

      Hello everyone!

      I need help setting up a robots.txt file.

      I'm trying to block all PDF files in particular directories, so I'm starting from this directive. In the example below, the line blocks all .gif files across the entire site.

      Block files of a specific file type (for example, .gif) | Disallow: /*.gif$

      Two questions:

      • Can I use this directive to target one particular directory in which I want to block PDF files? Will this line be recognized by Googlebot?

      Disallow: /fileadmin/xxxxxxx/xxx/xxxxxxx/*.pdf$

      • Then I realized that I would have to write as many lines as there are directories in which I want to block PDF files.

      Let's say I want to block PDF files in all three of these directories:

      /fileadmin/directory1

      /fileadmin/directory1/sub1

      /fileadmin/directory1/sub1/pdf

      Is there a pattern-matching rule I could use to block access to PDF files in all subdirectories, instead of repeating the line above for each subdirectory? For example:

      Disallow: /fileadmin/directory1*/

      Many thanks in advance for any insight you may have.
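
      For illustration, here is a minimal robots.txt sketch of the single-rule approach being asked about, assuming Googlebot's documented support for the * and $ wildcards (the directory name follows the examples above; the User-agent: Googlebot line is added only for concreteness):

      User-agent: Googlebot
      # '*' also matches '/', so this one rule covers /fileadmin/directory1/,
      # /fileadmin/directory1/sub1/ and /fileadmin/directory1/sub1/pdf/
      Disallow: /fileadmin/directory1/*.pdf$

      Because the wildcard matches across path segments, one such line per top-level directory is enough; a separate Disallow for each subdirectory is not required.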

      • LabeliumUSA @Rajesh.Prajapati

        Hey, thank you for your answer, I really appreciate it.

        • Rajesh.Prajapati

          Use this code to block PDF files across the whole site:
          Disallow: /*.pdf$
          If you want to block PDF files in only one folder, then use this:
          Disallow: /folder1/*.pdf$
          The same pattern works for other file types as well, for example .gif (Disallow: /*.gif$).
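
          Since these rules rely on Googlebot's * and $ extensions, a quick way to sanity-check them is to translate a Disallow pattern into a regular expression and test candidate paths against it. Below is a minimal Python sketch; the disallow_matches helper is hypothetical and not part of any Moz tool or the original answer:

          import re

          def disallow_matches(rule: str, path: str) -> bool:
              # Translate a Googlebot-style Disallow rule into a regex:
              # '*' matches any run of characters (including '/'),
              # and a trailing '$' anchors the rule to the end of the path.
              pattern = re.escape(rule).replace(r"\*", ".*")
              if pattern.endswith(r"\$"):
                  pattern = pattern[:-2] + "$"
              return re.match(pattern, path) is not None

          # A single rule covers PDFs in the directory and in every subdirectory below it.
          print(disallow_matches("/fileadmin/directory1/*.pdf$",
                                 "/fileadmin/directory1/sub1/pdf/report.pdf"))  # True
          print(disallow_matches("/fileadmin/directory1/*.pdf$",
                                 "/fileadmin/other/report.pdf"))                # False

          Note that re.match anchors only at the start of the path, which mirrors robots.txt's prefix-matching behaviour for rules without a trailing $.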


