
    The Moz Q&A Forum


    Moz Q&A is closed.

    After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

    Creating 100,000's of pages, good or bad idea

    Intermediate & Advanced SEO
    • PottyScotty

      Hi Folks,

      Over the last 10 months we have focused on quality pages, but we have been frustrated by competitor websites outranking us because they have bigger sites. Should we focus on the long tail again?

      One option for us is to take every town across the UK and create pages using our activities.  e.g.

      Stirling
      Stirling paintball
      Stirling Go Karting
      Stirling Clay shooting

      We are not going to link to these pages directly from our main menus but from the site map.

      These pages would then show activities that were in a 50-mile radius of the towns. At the moment we have focused our efforts on regions, e.g. Paintball Scotland, Paintball Yorkshire, funnelling all the internal link juice to these regional pages, but we don't rank high for towns that the activity sites are close to.

      With 45,000 towns and 250 activities we could create over 11 million pages, which seems very excessive! Would creating 500,000 of these types of pages damage our site? That is my main worry. Or would it make our site rank even higher for the tougher keywords and also get lots of traffic from the long tail, like we used to?

      Is there a limit to how big a site should be?
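      As a quick sanity check on the numbers in the question (assuming, hypothetically, one page per town/activity pair):

      ```python
      # Back-of-the-envelope scale check using the figures from the question.
      # One page per (town, activity) pair is an assumption for illustration.
      towns = 45_000
      activities = 250
      pages = towns * activities
      print(f"{pages:,}")  # 11,250,000 potential pages
      ```

      Even the 500,000-page subset mentioned above is under 5% of that full combination space.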

      • MiriamEllis (Subject Expert)

        Hi Mark!

        Thanks for asking this good question. While there is no limit to how big a website can be, I think you can see from the general response here that most members would encourage you to stick to manually developing quality pages rather than automating hundreds of thousands of pages, solely for ranking purposes. I second this advice.

        Now, I would like to clarify your business model. Are you a physical, actual business that customers come to, either to buy paintball equipment or to play paintball in a gallery? Or is your business virtual, with no in-person transactions? I'm not quite understanding this from your description.

        If the former, I would certainly encourage you to develop a very strong, unique page for each of your physical locations. If you have 10 locations (with unique street addresses and phone numbers), then that would be 10 pages. If you've got 20 locations, that would be 20 pages, etc. But don't approach these with a 'just switch out the city name in the title tags' mindset. Make these pages as exceptional as possible. Tell stories, show off testimonials, share pictures and  videos, entertain, educate, inspire. These city landing pages will be intimately linked into your whole Local SEM campaign, provided they each represent a business location with a unique dedicated street address and unique local area code phone number.

        But, if you are considering simply building a page for every city in the UK, I just can't see justification for doing so. Ask yourself - what is the value?

        There are business models (such as carpet cleaners, chimney sweeps, general contractors, etc.) that go to their clients' locations to serve and for which I would be advising that they create city landing pages for each of their service cities, but this would be extremely regional...not statewide or national or International. A carpet cleaner might serve 15 different towns and cities in his region, and I would encourage him to start gathering project notes and testimonials, videos and photos to begin developing a body of content important enough for him to start creating strong, interesting and unique pages for each of these cities. But I've also had local business owners tell me they want to cover every city in California, for instance, because they think it will help them to do so, and I discourage this.

        Even if the business is virtual and doesn't have any in-person transactions with clients or physical locations, I would still discourage this blanketing-the-whole-nation-with-pages approach. A national retailer needs to build up its brand so that it becomes known and visible organically for its products, rather than targeting every city in the nation. In short, the mindset behind that approach just doesn't make good horse sense.

        And, as others have stated, adding thousands of thin, potentially duplicate pages to any site could definitely have a very negative effect on rankings.

        My advice is to make the time to start developing a content strategy for cities in which you have a legitimate presence. If budget means you can't hire a copywriter to help you with this and to speed up the work, accept that this project deserves all the time you can give it and that a slow development of exceptional pages is better than a fast automation of poor quality pages.

        Hope this helps!

        • TommyTan (@PottyScotty)

          Hi Mark,

          If the page with sites A, C, and E is similar to the page with sites B, D, and F, it is still considered duplicate content. Based on Google's Webmaster definition:

          "Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar."

          Each of your pages should be unique and different from other pages.

          I suggest you continue providing quality content and target the long-tail keywords. That alone will help you generate more traffic. Furthermore, being outranked is not the whole problem. You should focus on getting to the first page (providing quality content with long-tail or regular keywords), and once you are on the first page, try to get searchers to click on your link using your title tags and meta descriptions.

          Being outranked just means they are ranked 4th and you are ranked 5th or 6th; as long as you have a better title tag and meta description, I believe searchers will click on the more attractive result.

          • EGOL

            Cookie cutter pages like these stopped working in Google about ten years ago.

            If you toss them up I think that your entire site will tank.

            I would go back to focusing on quality pages.

            • dvduval (@PottyScotty)

              If the user experience is awesome, and people are staying on your site and looking around, great. If you think the 100,000 pages will make search engines love you, machines can never provide the love users can give you. 🙂

              • PottyScotty (@TommyTan)

                Can you mix content up from your website, e.g. paintball sites A, C, and E on one page and B, D, and F on another, if the towns are close together? What I'm not sure about is how different, in percentage terms, the content actually has to be.

                If we have less written content, then the amount of words we would actually have to change would be much smaller.

                The challenge is that we built the site this time with filtering in mind: rather than making customers navigate, we let them search, which is much better for getting the activities they want. The downside is that our site no longer shows for the long tail, as we reduced the pages massively.
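                On the question of how different the content has to be in percentage terms: Google publishes no exact threshold, but one rough, do-it-yourself way to compare two pages is word-shingle overlap (Jaccard similarity). This is an illustrative sketch with made-up sample text, not a method Google endorses:

                ```python
                # Rough duplicate-content check: word-shingle Jaccard similarity.
                # The threshold for "too similar" is unknown -- this only measures overlap.

                def shingles(text: str, k: int = 3) -> set:
                    """Return the set of k-word shingles in the text."""
                    words = text.lower().split()
                    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

                def similarity(a: str, b: str) -> float:
                    """Jaccard similarity of the two pages' shingle sets (0..1)."""
                    sa, sb = shingles(a), shingles(b)
                    if not sa or not sb:
                        return 0.0
                    return len(sa & sb) / len(sa | sb)

                # Hypothetical page texts for two nearby towns
                page_a = "paintball and go karting near Stirling with sites A C and E"
                page_b = "paintball and go karting near Falkirk with sites B D and F"
                print(round(similarity(page_a, page_b), 2))  # 0.18
                ```

                The closer the score gets to 1.0, the more the pages match "appreciably similar" territory; boilerplate templates with only the town name swapped score very high on this kind of measure.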

                • PottyScotty (@CMC-SD)

                  We don't have the resources to do it manually, but the content would be different on each page, as we would only show activity sites within a 50-mile radius. We would also make certain text, the H1, etc., different and related to the town.

                  Below are some examples of sites I see doing well (i.e. ranking number 1) using this method.

                  Our content would be much better than say http://www.justpaintballDOTcoDOTuk/site_guide/Aberfeldy.htm or http://www.goballisticDOTcoDOTuk/paintball_in_/ABERFELDY.asp

                  But as you say getting this wrong is my worry.
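                  For what it's worth, the 50-mile radius filter described above is straightforward to compute with the haversine formula; the site names and coordinates below are made up for illustration:

                  ```python
                  # Sketch of the 50-mile radius filter: great-circle distance via haversine.
                  # Site data and coordinates are hypothetical.
                  from math import radians, sin, cos, asin, sqrt

                  EARTH_RADIUS_MILES = 3958.8

                  def haversine_miles(lat1, lon1, lat2, lon2):
                      """Great-circle distance between two lat/lon points, in miles."""
                      lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
                      a = (sin((lat2 - lat1) / 2) ** 2
                           + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
                      return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

                  def sites_within(town_lat, town_lon, sites, radius=50.0):
                      """Return the sites within `radius` miles of the town."""
                      return [s for s in sites
                              if haversine_miles(town_lat, town_lon, s["lat"], s["lon"]) <= radius]

                  sites = [
                      {"name": "Paintball A", "lat": 56.0, "lon": -3.9},    # ~8 mi from Stirling
                      {"name": "Go Karting B", "lat": 54.97, "lon": -1.61}, # near Newcastle, ~100 mi away
                  ]
                  print([s["name"] for s in sites_within(56.12, -3.94, sites)])  # ['Paintball A']
                  ```

                  The hard part isn't the filter itself; it's making the surrounding page copy genuinely different for each town, as the other replies stress.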

                  • TommyTan

                    Hi Mark,

                    Creating 100,000 pages is definitely good for search engines in one sense: you have a lot more content for them to crawl and more chances that your pages might show up for related keywords. However, the problem is whether you have enough unique content to put on all those 100,000 pages. If you use similar content, I am afraid it will be duplicate content. You may think changing up the town names will be enough, but it is risky.

                    If you can create 100,000 pieces of unique content, sure, go ahead. If not, don't take the risk of duplicate content.

                    • CMC-SD

                      Do you have the resources to create unique content for all those pages? Because adding 500,000 pages of duplicate content will absolutely damage your site.



