What is the fastest way to deindex content from Google?
-
Yesterday a client discovered that our staging URLs were being indexed by Google. This was due to a technical oversight by our development team (they forgot to add the meta robots noindex tags).
We are trying to remove this content as quickly as possible. Are there any methods in Google Search Console to expedite the process?
Thanks
-
Excellent answer. Thank you very much.
-
Rosemary, to remove the content quickly you need to combine several approaches. Google's crawling and index-removal processes don't all happen at once, so it's best to attack the problem from several angles at the same time:
-
Remove the content. When visitors or bots request the URL, return a "410 Gone" status code rather than just a 404, so Google knows the page is permanently gone.
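For instance, if the staging site happens to run on Apache, a few lines of mod_rewrite in the staging site's .htaccess can answer every request with a 410. This is only a sketch and assumes Apache with mod_rewrite enabled (on nginx the equivalent is a return 410; directive in the staging server block); the robots.txt exception is there so crawlers can still fetch that file.

    # .htaccess on the staging host: serve "410 Gone" for every URL
    # except robots.txt, which crawlers still need to be able to fetch
    RewriteEngine On
    RewriteCond %{REQUEST_URI} !^/robots\.txt$
    RewriteRule ^ - [G,L]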
-
If the content must stay live but still needs to come out of Google's index, consider password-protecting it, putting it behind a paywall or a login, and/or adding a meta robots noindex tag to each page.
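For reference, the meta robots noindex tag is just one line in the page's head; the snippet below is a minimal example (for non-HTML files such as PDFs, the same signal can be sent with an X-Robots-Tag: noindex HTTP response header). Keep in mind that Googlebot has to be able to recrawl the page to see the tag, so it only works on URLs that aren't blocked from crawling.

    <!-- Inside the <head> of every page that must stay online but drop out of Google's index -->
    <meta name="robots" content="noindex">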
-
Add a robots.txt file on the staging subdomain that tells bots to stop crawling it. If you use something like dev.yourdomain.com for the dev version of the site, make sure a robots.txt file is served at dev.yourdomain.com/robots.txt.
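The staging robots.txt can be as simple as the two lines below (dev.yourdomain.com is just the example host from above; use whatever your dev subdomain actually is):

    # https://dev.yourdomain.com/robots.txt - tell all compliant bots to stop crawling the staging subdomain
    User-agent: *
    Disallow: /

One caveat worth knowing: once crawling is blocked, Googlebot can no longer see the 410s or noindex tags described above, so many people hold the Disallow back until the URLs have dropped out of the index and lean on the Search Console removal tool (next point) in the meantime.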
-
Use Google Search Console to remove the content. Once logged in, use the removal tool: https://www.google.com/webmasters/tools/removals?pli=1
Using several of these approaches together is the fastest way to get the content removed.
-
Related Questions
-
Ecommerce Google Bias?
Does Google bias the type of content which ranks? Hi guys, say I wanted to create a nice blog post around a topic like black dresses or yoga pants. If you view the google.com or google.com.au results, all the top-ranking URLs are e-commerce pages which list the products. There is very rarely blog content, e.g. "top black dresses to wear..." or "7 of the hottest yoga pants on the market". The search intent is about the same, i.e. someone looking for black dresses would be interested in that blog post. So my conclusion is that Google has some form of bias in delivering ecommerce sites above blog/skyscraper forms of content. Thoughts? Cheers.
-
Did Google Ignore My Links?
Hello, I'm a little new to SEO, but I was featured (around 2 years ago) on some MAJOR tech blogs. For some reason, however, my links haven't been picked up in over 2 years - not even in Moz or other link-checker services. By now I should have had an amazing boost from this natural link building, but I'm not sure what happened. These were completely white-hat, natural links. The links were added after the article was created, though - would that affect things? Please let me know if you have any advice! Maybe I need to ping these somehow or something? Are these worthless? Thanks so much for your help! Here are some samples of the links that were naturally given to http://VaultFeed.com:
http://thenextweb.com/microsoft/2013/09/13/microsoft-posts-cringe-worthy-windows-phone-video-ads-mocking-apple/
http://www.theverge.com/2013/9/15/4733176/microsoft-says-pulled-iphone-parody-ads-were-off-the-mark
http://www.theregister.co.uk/2013/09/16/microsoft_mocks_apple_in_vids_it_quickly_pulls/
http://www.dailymail.co.uk/sciencetech/article-2420710/Microsoft-forced-delete-cringe-worthy-spoof-videos-mocking-new-range-iPhones.html
And a LOT more... Not sure if these links will never be valid, or maybe I'm doing something completely wrong? Is there any way for Google to recognize these now, so they'll then be seen by Moz and other sites too? I've done a LOT of searching and there's no definitive advice I've seen for links that were added after the URL was first indexed by Google.
-
HTML language deprecated by Google?
Hi mates, currently we are using two tags for language on our site (we are targeting English) .... and these are defined in the head section. My question is: are they required by Google in order to rank well, or are they deprecated? Thank you, Claudio
-
Best way to remove full demo (staging server) website from Google index
I've recently taken over an in-house role at a property auction company. They have a main site on the top-level domain (TLD) and 400+ agency subdomains:
company.com
agency1.company.com
agency2.company.com...
I recently found that the web development team have a demo domain per site, found on a subdomain of the original domain and mirroring the site. The problem is that they have all been found and indexed by Google:
demo.company.com
demo.agency1.company.com
demo.agency2.company.com...
Obviously this is a problem as it is duplicate content and so on, so my question is... what is the best way to remove the demo domains / subdomains from Google's index? We are taking action to add a noindex tag into the header (of all pages) on the individual domains, but this isn't going to get them removed any time soon! Or is it? I was also going to add a robots.txt file into the root of each domain, just as a precaution; within this file I had intended to disallow all. The final course of action (which I'm holding off on in the hope someone comes up with a better solution) is to add each demo domain / subdomain into Google Webmaster and remove the URLs individually. Or would it be better to go down the canonical route?
-
Homepage not ranking in Google AU, but ranking in Google UK?
Hey everyone, my homepage has not been ranking for its primary keyword in Google Australia for many months now. Yesterday, when I was using a UK proxy and searching via Google UK, I found my homepage/primary keyword ranked on page 8 in the UK. In Australia my website now ranks on page 6, but for other pages on my website (and it always changes from page to page). Previously my page was popping up at the bottom of page 1 and page 2. I've been trying many things and waiting weeks to see if they had any impact for over 4 months, but I'm pretty lost for ideas now, especially after what I saw yesterday in Google UK. I'd be very grateful if someone has had the same experience or has suggestions on what I should try doing. I did a small audit on my page and, because the site is focused on one product and features the primary keyword, I took steps to try and fix the issue. I did the following:
- I noticed the developer had added H1 tags in many places on the homepage, so I removed them all to make sure I wasn't getting an over-optimization penalty.
- Cleaned up some of my links because I was not sure if this was the issue (I've never had a warning within Google Webmaster Tools).
- Changed the title tags/H tags on secondary pages not to feature the primary keyword as much.
- Made some pages 'noindex' to see if this would take away the emphasis on the secondary pages.
- Resubmitted my XML sitemaps to Google.
- Just recently claimed a local listing in Google (still need to verify) and fixed up citations of my address/phone numbers etc. (however, it's not a local business - it sells Australia-wide).
- Added some new backlinks from AU sites (only a handful though).
The only other option I can think of is to replace the name of the product on secondary pages with a different abbreviation to make sure that the keyword isn't featured there. Some other notes on the site:
- When I do a 'site:url' search my homepage comes up at the top.
- The site sometimes ranked for a secondary keyword on the front page in specific locations in Australia (but it goes to a localised city page). I've noindexed these as a test to see if something with localisation is messing it around.
- I have links from AU sites, but I also have links from .com and elsewhere.
Any tips or advice would be fantastic. Thanks
-
Are links that are disavowed with Google Webmaster Tools removed from the Google Webmaster Profile for the domain?
Hi, two-part question. First, are links that you disavow using Google Webmaster Tools ever removed from the Webmaster Tools account profile? Second, when you upload a file to disavow links, they ask if you'd like to replace the previously uploaded file. Does that mean that if you don't replace the file with a new file containing the previously uploaded URLs, those URLs are no longer considered disavowed? So, should we download the previous disavow file first and then append the new disavow URLs to it before uploading, or should we just upload a new file that contains only the new disavow URLs? Thanks
-
Moving some content to a new domain - best practices to avoid duplicate content?
Hi, we are setting up a new domain to focus on a specific product and want to use some of the content from the original domain on the new site and remove it from the original. The content is appropriate for the new domain, will be irrelevant to the original domain, and we want to avoid creating completely new content. There will be a link between the two domains. What is the best practice for this to avoid duplicate content and a potential Panda penalty?
-
How to manage duplicate content?
I have a real estate site that contains a large amount of duplicate content. The site contains listings that appear both on my client's website and on my competitors' websites (which have better domain authority). It is critical that the content is there, because buyers need to be able to find these listings to make enquiries. The result is that I have a large number of pages that contain duplicate content in some way, shape or form. My search results pages are really the most important ones, because these are the ones targeting my keywords. I can differentiate these to some degree, but the actual listings themselves are duplicated. What strategies exist to ensure that I'm not suffering as a result of this content? Should I:
- Make the duplicate content noindex? Yes, my results pages will have some degree of duplicate content, but each result only displays a 200-character summary of the advert text, so I'm not sure if that counts. Would reducing the amount of visible duplicate content improve my rankings as a whole?
- Link back to the client's site to indicate that they are the original source?
Any suggestions?