    Using the Google Remove URL Tool to remove https pages

    Technical SEO
sparrowdog:

I have found a way to get a list of 'some' of my 180,000+ garbage URLs now, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that, my robots.txt file, and the URL Parameters tool, I'm hoping to see some change each week.

I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL to the front.

For example, I add this to the removal tool:

https://www.mydomain.com/blah.html?search_garbage_url_addition

On the confirmation page, the URL actually shows as:

http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition

I don't want to accidentally remove my main URL or cause problems. Is this how it is supposed to look?

AND PART 2 OF MY QUESTION

If the Google search description for a page I want removed already says the following in the SERP results, should I still go to the trouble of submitting a removal request?

www.domain.com/url.html?xsearch_...

A description for this result is not available because of this site's robots.txt – learn more.

sparrowdog (replying to TomRayner's answer below):

        Thanks so much for taking the time to respond.

I think I will add the https property to WMT and remove them that way.

I will take a look through the .htaccess file and the creation of the ssl robots file. A while back it seemed that Google was indexing a lot of my site as https, and then it dropped those pages and went mainly back to http. I will get that sorted out to make it clear.

TomRayner:

          Hi there

I'll start with question 2 first as it's a bit easier to answer. Robots.txt blocks the crawling of a page, but not necessarily its indexing. Of course, if the page cannot be crawled it will be deindexed eventually anyway, and if you're getting that description for one of your URLs, Google has not been able to access the page and will stop trying to crawl it. So that is usually enough, although you can by all means submit a removal request as well if you want to.
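
(For reference, a robots.txt disallow along these lines is what produces that "description is not available" snippet. This is only a sketch - the xsearch_ pattern is borrowed from the example URL in the question, so it would need adjusting to match the real parameter:)

User-agent: *
Disallow: /*?xsearch_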

For question 1 - GWT is a bit awkward in the sense that it treats the http and https versions of your site as different webmaster properties. Furthermore, when you remove a URL it always prefixes whatever you enter with that property's own address, no matter how you type it in - which is why you're seeing the doubled-up URL on the confirmation page.

If you add another WMT property for https://www.yourdomain.com, you'll be able to manage that version of the site as well, and remove any URLs under that prefix.

Incidentally, if you want to block all HTTPS pages from being crawled, you can do that with a special instruction in your htaccess file and a separate robots file. You can instruct Googlebot and other bots to read a different robots.txt when they request it over HTTPS. To do that, you would first add this to your htaccess file:

RewriteEngine On
RewriteCond %{HTTPS} ^on$
RewriteCond %{REQUEST_URI} ^/robots\.txt$
RewriteRule ^(.*)$ /robots_ssl.txt [L]

This rule basically says "if robots.txt is requested over HTTPS, serve the robots_ssl.txt file instead" (the RewriteEngine On line can be dropped if rewriting is already enabled elsewhere in the file). You then upload a file called robots_ssl.txt to your document root. In that txt file you just add:

          User-agent: *
          Disallow: /

So now, when a bot requests robots.txt on an https URL, it is served the robots_ssl.txt file instead, and that file denies it access to everything. That would prevent all of your https URLs from being indexed.

That might be useful to you, but if you go ahead and use it, please take care to back up all your files in case anything goes wrong - your htaccess file is very important!
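
(A quick way to sanity-check a setup like this - just a suggested check, using the example domain from the question - is to request robots.txt over both protocols and confirm that only the HTTPS request returns the disallow-everything file:)

curl http://www.mydomain.com/robots.txt
curl https://www.mydomain.com/robots.txt

The first request should return your normal robots.txt; the second should return the contents of robots_ssl.txt.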
