
    Partial Match or RegEx in Search Console's URL Parameters Tool?

    Intermediate & Advanced SEO
    • Ria_

      So I currently have approximately 1000 of these URLs indexed, when I only want roughly 100 of them.

      Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789=

      All the indexed URLs follow that same kind of format, but I only want to index the URLs whose par1 value starts with ABC (that could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, using regex maybe?

      Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?
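
      For illustration, the kind of partial match I'm after, written as a regex (hypothetical - as far as I can tell the tool itself only takes exact values):

      par1=ABC[^&]*

      i.e. any URL whose par1 value starts with ABC, whatever comes after it.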

      • Andy.Drinkwater @Ria_

        No problem 🙂

        Hope you get it sorted!

        -Andy

        • Ria_ @DirkC

          Thank you! 😄

          • Ria_ @Andy.Drinkwater

            Haha, I think the train passed the station on that one. I would have realised eventually... XD

            Thanks for your help!

            • DirkC

              Don't forget that . and ? have a specific meaning within regex - if you want to use them for pattern matching, you will have to escape them. Also be aware that not all bots are capable of interpreting regex-style patterns in robots.txt - you might want to be more explicit about the user agent, only using the pattern matching for Googlebot.

              User-agent: Googlebot
              # disallowing page.php and any parameters after it
              Disallow: /page.php
              # but leaving anything that starts with par1=ABC
              Allow: /page.php?par1=ABC

              Dirk

              • Andy.Drinkwater @Ria_

                Ah sorry I missed that bit!

                -Andy

                • Andy.Drinkwater @Ria_

                  "Disallowing them would be my first priority really, before removing from index."

                  The trouble with this is that if you disallow first, Google won't be able to crawl the page to act on the noindex. If you add a noindex flag, Google won't index them the next time it comes-a-crawling and then you will be good to disallow 🙂

                  I'm not actually sure of the best way for you to get the noindex into the page header of those pages though.
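
                  For reference, the tag itself is just standard HTML, whatever mechanism ends up putting it there:

                  <meta name="robots" content="noindex">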

                  -Andy

                  • Ria_ @Andy.Drinkwater

                    Yep, have done. (Briefly mentioned in my previous response.) Doesn't pass 😞

                    • Ria_ @Martijn_Scheijbeler

                      I thought so too, but according to Google the trailing wildcard is completely unnecessary, and only needs to be used mid-URL.

                      • Ria_ @Andy.Drinkwater

                        Hi Andy,

                        Disallowing them would be my first priority really, before removing from index. I didn't want to remove them before I've blocked Google from crawling them, in case they get added back again next time Google comes a-crawling, as has happened before when I've simply removed a URL here and there. Does that make sense or am I getting myself mixed up here?

                        My other hack of a solution would be to check the URL in page.php, and if the URL doesn't include par1=ABC, insert a noindex meta tag. (Not sure if that would work well or not...)
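
                        Roughly something like this, I suppose - just a sketch, assuming par1 is the query parameter as above and that page.php can echo into its own head section:

                        <?php
                        // Sketch: add noindex unless par1 starts with "ABC"
                        // (a prefix check, so ABC123, ABC456 etc. all stay indexable)
                        $par1 = isset($_GET['par1']) ? $_GET['par1'] : '';
                        if (strpos($par1, 'ABC') !== 0) {
                            echo '<meta name="robots" content="noindex">';
                        }
                        ?>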

                        • Martijn_Scheijbeler @Ria_

                          My guess would be that this line needs an * at the end:
                          Allow: /page.php?par1=ABC*

                          • Andy.Drinkwater @Ria_

                            Sorry Martijn, just to jump in here for a second - Ria, you can test this via the robots.txt testing tool in Search Console before going live, to make sure it works.

                            -Andy

                            • Ria_ @Martijn_Scheijbeler

                              Hi Martijn, thanks for your response!

                              I'm currently looking at something like this...

                              User-agent: *
                              # disallowing page.php and any parameters after it
                              Disallow: /page.php
                              # but leaving anything that starts with par1=ABC
                              Allow: /page.php?par1=ABC

                              I would have thought that you could disallow things broadly like that and give an exception, as you can with files in disallowed folders. But it's not passing Google's robots.txt Tester.

                              One thing that's probably worth mentioning is that there are only two values of the par1 parameter that I want to allow. For example's sake, ABC123 and ABC456. So it would need to be either a partial match or a "this or that" kind of deal, disallowing everything else - something like the sketch below.
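
                              Hypothetically, with just the two exact values spelled out as separate Allow lines (assuming par1 is always the first parameter, as in the URLs above):

                              User-agent: *
                              Disallow: /page.php
                              Allow: /page.php?par1=ABC123
                              Allow: /page.php?par1=ABC456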

                              • Andy.Drinkwater

                                Hi Ria,

                                I have never tried regular expressions in this way, so I can't tell you if this would work or not.

                                However, if all 1000 of these URLs are already indexed, just disallowing access won't then remove them from Google. You would ideally place a noindex tag on those pages and let Google act on them; then you will be good to disallow. I am pretty sure there is no option to noindex in the URL Parameters tool.

                                I hope that makes sense?

                                -Andy

                                • Martijn_Scheijbeler

                                  Hi Ria,

                                  What you could do, though it also depends on the rest of your structure, is disallow these URLs based on their parameters. In a worst-case scenario you could disallow all of these URLs and then add an Allow exception as well, to make sure the right URLs can still be crawled and indexed.

                                  Martijn.
