What is the fastest way to deindex content from Google?
-
Yesterday a client discovered that our staging URLs were being indexed by Google. This was due to a technical oversight by our development team (we forgot to deploy the meta robots noindex tags).
We are trying to remove this content as quickly as possible. Are there any methods in Google Search Console to expedite the process?
Thanks
-
Excellent answer. Thank you very much.
-
Rosemary, to remove the content quickly, you need to do several things. Google's processes for crawling and for removing content from the index don't always happen all at once, so it's best to attack this from several angles:
-
Remove the content. When visitors or bots request the URL, return a "410 Gone" HTTP status code rather than a plain 404; 410 tells Google the page has been removed permanently, not that it's temporarily missing.
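As a minimal sketch of the 410 approach (the paths and port here are hypothetical, and this uses only Python's standard library rather than any particular server stack):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical list of retired staging paths; adjust to your own site.
RETIRED_PATHS = {"/staging/old-page", "/staging/demo"}

def status_for(path: str) -> int:
    """Return 410 Gone for deliberately removed pages, 200 otherwise."""
    return 410 if path in RETIRED_PATHS else 200

class GoneHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        code = status_for(self.path)
        self.send_response(code)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"410 Gone" if code == 410 else b"OK")

# To serve: HTTPServer(("", 8000), GoneHandler).serve_forever()
```

In practice you'd configure this in your web server (Apache, nginx, etc.) rather than in application code, but the logic is the same: retired URLs get 410, everything else is untouched.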
-
If the content must stay live but still needs to come out of Google's index, consider password-protecting it, putting it behind a paywall, requiring users to log in to see it, and/or adding a meta robots noindex tag to the page.
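To verify the noindex tag actually made it into your rendered markup, a small standard-library checker can scan the HTML (this is a sketch for spot-checking, not a crawler-accurate parser):

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Flags pages carrying <meta name="robots" content="...noindex...">."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = {k.lower(): (v or "") for k, v in attrs}
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    checker = RobotsMetaChecker()
    checker.feed(html)
    return checker.noindex
```

For example, `has_noindex('<meta name="robots" content="noindex, nofollow">')` returns True, while a page with no robots meta tag returns False.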
-
Add a robots.txt file on the subdomain that tells bots to stop crawling it. If you use something like dev.yourdomain.com for a dev section of the site, make sure you have a robots.txt file at dev.yourdomain.com/robots.txt.
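A disallow-all robots.txt for a dev subdomain is only two lines, and Python's standard library can sanity-check that the rules behave as intended (dev.yourdomain.com is the example host from above):

```python
from urllib.robotparser import RobotFileParser

# Contents you would serve at dev.yourdomain.com/robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# No URL on the subdomain should be crawlable under these rules.
print(parser.can_fetch("Googlebot", "https://dev.yourdomain.com/any-page"))  # False
```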
-
Use Google Search Console to remove the content. Once logged in, use the removal tool: https://www.google.com/webmasters/tools/removals?pli=1
Combining these approaches is the fastest way to get the content removed.