    The Moz Q&A Forum


    Moz Q&A is closed.

    After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.

    What's the best way to noindex pages but still keep backlinks equity?

    Intermediate & Advanced SEO
    • fablau

      Hello everyone,

      Maybe it is a stupid question, but I'll ask the experts anyway: what's the best way to noindex pages but still keep the backlink equity from those noindexed pages?

      For example, let's say I have many pages that look similar to a "main" page, which is the only one I want to appear on Google. I want to noindex all pages except that "main" page... but what if I also want to transfer any link equity present on the noindexed pages to the main page?

      The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or will it wreak havoc in some way?
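      For concreteness, the combination being proposed would look something like this in the head of each secondary page (the URLs here are hypothetical):

```html
<!-- Head of a secondary page, e.g. https://www.example.com/category/page-2.html -->

<!-- Keep this page out of the index, but leave its links eligible to be followed -->
<meta name="robots" content="noindex, follow">

<!-- Point the canonical at the "main" page, hoping to consolidate link equity there -->
<link rel="canonical" href="https://www.example.com/category.html">
```

      One caveat worth flagging: noindex and a cross-page canonical send somewhat conflicting signals (the canonical says "treat these as one page," while noindex says "drop this one"), which is part of what the replies in this thread dig into.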

      • fablau @ChrisAshton

        Thank you Chris for your in-depth answer, you just confirmed what I suspected.

        To clarify though, what I am trying to save here by noindexing those subsequent pages is "indexing budget," not "crawl budget" (you know the famous "indexing cap"?), and also to tackle possible "duplicate" or "thin" content issues with such similar-but-different pages. The fact is, our website has been hit by Panda several times; we recovered several times as well, but we were hit again by the latest quality update last June, and we are trying to find a way out of it once and for all. Hence my attempt to reduce the number of similar indexed pages as much as we can.

        I have just opened a discussion on this "Panda-non-sense" issue, and I'd like to know your opinion about it:

        https://moz.com/community/q/panda-rankings-and-other-non-sense-issues

        Thank you again.

        • ChrisAshton @fablau

          Hi Fabrizio,

          That's a tricky one given the sheer volume of pages/music on the site. Typically the cleanest way to handle all of this is to offer up a View All page and canonical back to that, but in your case a View All page would scroll on forever!

          Canonical is not the answer here. It's made for handling duplicate pages like this:

          www.website.com/product1.html
          www.website.com/product1.html?sid=12432

          In this instance, both pages are 100% identical, so the canonical tag tells Google that any variation of product1.html is actually just that page and should be counted as such. What you've got here is pagination, so while the pages are mostly the same, they're not identical.

          Instead, this is exactly what rel=prev/next is for, which you've already looked into. It's very hard to find recent information on this topic, but the traditional advice from Google has been to implement prev/next, and they will infer the most important page (typically page one) from the fact that it's the only page that has a rel=next but no rel=prev (because there is no previous page). Apologies if you already knew all of this; just making sure I didn't skim over anything here. Google also says these pages will essentially be seen as a single unit from that point on, so all link equity will be consolidated toward that block of pages.

          Canonical and rel=next/prev do act separately, so by all means, if you have search filters or anything else that may alter the URL, a canonical tag can be used as well; but each page here would just point back to itself, not back to page 1.
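          As a sketch, that setup on a hypothetical paginated category would look like this (self-referencing canonical on each page, with prev/next pointing along the series):

```html
<!-- Page 1: the only page in the series with rel="next" but no rel="prev" -->
<link rel="canonical" href="https://www.example.com/category.html">
<link rel="next" href="https://www.example.com/category.html?page=2">

<!-- Page 2: canonical points to itself, not back to page 1 -->
<link rel="canonical" href="https://www.example.com/category.html?page=2">
<link rel="prev" href="https://www.example.com/category.html">
<link rel="next" href="https://www.example.com/category.html?page=3">
```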

          This clip from Google's Maile Ohye is quite old but the advice in here clears a few things up and is still very relevant today.

          With that said, the other point you raised is very valid: what to do about crawl budget. Google also suggests just leaving them as-is, since you're only linking to the first 5 pages, and any links beyond that are buried so deep in the hierarchy that they're seen as a low priority and will barely be looked at.

          From my understanding (though I'm a little hesitant on this one), noindexed pages do retain their link equity. Noindex doesn't say 'don't crawl me' (also meaning it won't help your crawl budget; that would have to be done through robots.txt), it says 'don't include me in your index'. So by this logic it would make sense that links pointing to a noindexed page would still be counted.
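          To illustrate the distinction (a minimal sketch): a noindex directive lives in the page itself, so Google has to crawl the page to see it at all:

```html
<!-- Still crawled (so it doesn't save crawl budget), just kept out of the index;
     "follow" leaves the links on the page eligible to be followed -->
<meta name="robots" content="noindex, follow">
```

          A robots.txt Disallow rule, by contrast, blocks crawling outright, which means a noindex tag on a blocked page can never be read and the links on that page are never seen.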

          • fablau @ChrisAshton

            You are right, it's hard to give advice without the specific context.

            Well, here is the problem that I am facing: we have an e-commerce website, and each category has several hundred if not thousands of pages. Now, I want just the first page of each category to appear in the index, in order not to waste the index cap and to avoid possible duplicate issues. Therefore I want to noindex all subsequent pages and index just the first page (which is also the richest).

            Here is an example from our website, our piano sheet music category page:

            http://www.virtualsheetmusic.com/downloads/Indici/Piano.html

            I want that first page to be in the index, but not the subsequent ones:

            http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=2

            http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=3

            etc...

            After playing with canonicals and rel=next, I have realized that Google still keeps those useless pages in the index, whereas removing them could help with both index cap issues and possible Panda penalties (too many similar, not very useful pages). But is there any way to keep the possible link equity of those subsequent pages while noindexing them? Or is the link equity preserved anyway, on those pages and on the overall domain as well? And, better, is there a way to move all that possible link equity to the first page in some way?

            I hope this makes sense. Thank you for your help!
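            Concretely, the setup described above would leave the first page untouched and add a robots directive only to the ?cp= pages (assuming noindex, follow, so the links on those pages remain crawlable):

```html
<!-- http://www.virtualsheetmusic.com/downloads/Indici/Piano.html -->
<!-- First page: no robots directive; indexable as normal -->

<!-- http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=2, ?cp=3, ... -->
<meta name="robots" content="noindex, follow">
```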

            • ChrisAshton

              Apologies for the indirect answer but I would have to ask "why"?

              If these pages are almost identical and you only want one of them to be indexed, in most situations the users would probably benefit from there only being that one main page. Cutting down on redundant pages is great for UX, crawl budget and general site quality.

              Maybe there is a genuine reason for it but without knowing the context it's hard to give accurate info on the best way to handle it 🙂


