    The Moz Q&A Forum

    Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.

    Is there a limit to how many URLs you can put in a robots.txt file?

    Technical SEO
• kcb8178 (Subscriber):

We have a site that has way too many URLs caused by our crawlable faceted navigation. We are trying to purge 90% of our URLs from the indexes. We put noindex tags on the URL combinations that we do not want indexed anymore, but it is taking Google way too long to find the noindex tags. Meanwhile we are getting hit with excessive-URL warnings and have been hit by Panda.

Would it help speed the process of purging URLs if we added the URLs to the robots.txt file? Could this cause any issues for us? Could it have the opposite effect and block the crawler from finding the URLs but not purge them from the index? The list could be in excess of 100MM URLs.
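
As a sketch of what the robots.txt route could look like: rather than listing every URL, Google's robots.txt syntax supports * wildcards, so a handful of patterns can cover entire facet combinations. The parameter names below (color, size) are hypothetical stand-ins for the site's real facet parameters:

    User-agent: *
    Disallow: /*?color=
    Disallow: /*&color=
    Disallow: /*?size=
    Disallow: /*&size=

Note the caveat raised in the replies below: a URL disallowed in robots.txt can no longer be crawled, so Googlebot will not see a noindex tag on that page.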

• CraigBradford:

Hi Kristen,

I did this recently and it worked. The important part is that you need to block the pages in robots.txt or add a noindex tag to the pages to stop them from being indexed again.

I hope this helps.
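
For reference, the noindex tag discussed in this thread is the standard robots meta tag placed in each page's <head>:

    <meta name="robots" content="noindex">

It only takes effect if crawlers can still fetch the page, which is why pairing it with a robots.txt block on the same URLs can backfire.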

• CraigBradford:

Hi all, Google Webmaster Tools has a great tool for this: if you go into WMT and select "Google index", then "remove URLs", you can use regex to remove a large batch of URLs, then block them in robots.txt to make sure they stay out of the index.

I hope this helps.

• kcb8178 (Subscriber), replying to @DirkC:

Great, thanks for the input. Per Kristen's post, I am worried that it could just block the URLs altogether and they will never get purged from the index.

• kcb8178 (Subscriber):

Yes, we have done that and are seeing traction on those URLs, but we can't get rid of these old URLs as fast as we would like.

Thanks for your input.

• kcb8178 (Subscriber):

Thanks Kristen, that's what I was afraid I would do. Other than Fetch, is there a way to send Google these URLs en masse? There are over 100 million URLs, so Fetch is not scalable. They are picking them up slowly, but at the current pace it will take a few months, and I would like to find a way to make the purge happen faster.

• DirkC:

You could add them to the robots.txt, but you have to remember that Google will only read the first 500 KB (source) - as far as I understand, with the number of URLs you want to block you'll pass this limit.

As Googlebot understands basic wildcard patterns (* and $) in robots.txt, it's probably better to use those - you will probably be able to block all of these URLs with a few lines.
More info here & on Moz: https://moz.com/blog/interactive-guide-to-robots-txt

Dirk
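
To make Dirk's two points concrete, here is a small Python sketch (an illustration, not from the thread; the facet rules are hypothetical) that emulates Google-style wildcard matching and checks a candidate robots.txt against the 500 KB read limit:

    import re

    # Hypothetical disallow rules for crawlable facet parameters;
    # substitute the site's real parameter names.
    DISALLOW_RULES = ["/*?color=", "/*&color=", "/*?size=", "/*&size="]

    def rule_to_regex(rule):
        """Compile a Google-style robots.txt path rule into a regex.
        Rules match from the start of the URL path; '*' matches any
        run of characters and a trailing '$' anchors the end."""
        anchored = rule.endswith("$")
        body = rule[:-1] if anchored else rule
        pattern = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
        return re.compile("^" + pattern + ("$" if anchored else ""))

    COMPILED = [rule_to_regex(r) for r in DISALLOW_RULES]

    def is_blocked(path):
        """True if any disallow rule matches the path plus query string."""
        return any(rx.search(path) for rx in COMPILED)

    # Four short rules stand in for millions of faceted URL combinations...
    assert is_blocked("/shoes?color=red&size=9")
    assert is_blocked("/shoes?size=9")
    assert not is_blocked("/shoes")

    # ...and the resulting robots.txt stays far below the ~500 KB
    # that Google reads, whereas listing 100MM URLs would not.
    robots_txt = "User-agent: *\n" + "\n".join("Disallow: " + r for r in DISALLOW_RULES)
    assert len(robots_txt.encode("utf-8")) < 500 * 1024

Google's production matcher (since open-sourced as google/robotstxt) handles more edge cases, so treat this as an approximation of the matching behavior.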
