How to remove an entire site from Google?
-
Hi people,
I have a site with around 2,000 URLs indexed in Google, plus 10 subdomains that are indexed too, all of which I want to remove entirely so I can set up a new site.
What is the best way to do it?
Regards!
-
Hmm, I don't quite understand. If you bought the website, wouldn't you want to take its authority and redirect it to your primary site?
Re: subdomains > You can use the same WMT account to add and verify the subdomains.
Re: Noindex tag > This all depends on how often Google crawls the site; there is no minimum or maximum amount of time it will take. I'd say that one year is an edge case, and that other factors were at play (e.g. the noindex pages were orphaned).
-
Hi DAVID,
Thanks for your fast and complete response!
In reply to your question: this website was about a different topic; we bought the domain, but all the information is different.
To remove the subdomains, do I have to create a Webmaster Tools account for each subdomain?
I've been researching and found that with the "noindex" tag it can sometimes take more than a year for Google to remove a URL from the results. Do you know if that is true?
Thanks again!
Cheers,
Exequiel
-
The quickest way to remove an entire site is:

**A:** Block everything in robots.txt by adding these lines to your file:

```
User-agent: *
Disallow: /
```

**B:** Set up and verify Webmaster Tools, then go to your dashboard and select Optimization (left menu) > Remove URLs > Create new removal request. When it asks you to enter a URL, just specify "/" (without the quotes) to signal the root.

**Note:** You'll have to repeat the verification/removal process for each of the subdomains as well.

That *should* knock out your site within a few days.
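If you want to sanity-check the rules from step A before deploying them, Python's standard-library robots.txt parser can confirm the directives deny crawlers everything. This is just a quick local check, not part of the removal process itself:

```python
from urllib import robotparser

# The site-wide block from step A, parsed from a string here
# (rather than fetched from a live site) so the check is self-contained.
rules = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Every URL on the host should now be off-limits to every crawler.
print(rp.can_fetch("Googlebot", "https://example.com/"))          # False
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))  # False
```

The same parser can be pointed at a live file with `rp.set_url(...)` and `rp.read()` once the block is deployed.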
In the unlikely event that doesn't work, do this:

**A:** Remove the robots.txt block.
**B:** Add the meta robots NOINDEX tag to each page.
**C:** Once the pages are completely gone (use site:example.com to check), put the robots.txt block back on.
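The tag in step B is a single `<meta name="robots" content="noindex">` line in each page's `<head>`. As a rough way to verify the tag actually made it into your rendered markup, you could scan the HTML with Python's standard library. This is a hedged sketch; the class name is mine, not an established tool:

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Flags pages whose <meta name="robots"> tag includes noindex."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs, names lowercased.
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

page = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'
checker = NoindexChecker()
checker.feed(page)
print(checker.noindex)  # True
```

Run it against the HTML your server actually serves, since a template change can silently drop the tag from some page types.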
**Question for you:** Why exactly are you doing this? There might be a better
solution for added SEO benefit if you explain why...
Cheers,
Dave
Related Questions
-
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. The site has lots of junk, but all of it dates from the initial backup, i.e. before 1st June 2012. So, by removing all the mixed content prior to that date, we can have pure articles starting 1st June 2012. Therefore:

- My dynamic sitemap now contains only articles with a release date between 1st June 2012 and now.
- Any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag, instead of the actual content of the article.

The question is how I can remove all this junk from the Google index as fast as possible, since it is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal at https://www.google.com/webmasters/tools/removals, but the problem is doing this in bulk, as there are tens of thousands of URLs I want to remove.

- Should I put the articles back into the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong: as far as I know, search engines will try to access non-existent content that the sitemap declares as existent, and will report errors in Webmaster Tools.
- Should I submit a deleted-items sitemap using the `<expires>` tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think this is for custom search engines only, not for the generic Google search engine.

The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but rather the ugly GET params, so a folder-based removal pattern is impossible: all articles (removed junk and real articles alike) have the form http://www.example.com/docid=123456. So, how can I bulk remove all the junk from the Google index relatively fast?

Intermediate & Advanced SEO | ioannisa
-
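As an aside on the question above: the pre-cutoff rule it describes (old docids get a 404 plus noindex, newer ones are served normally) boils down to a single date comparison. Here is a minimal sketch of that routing decision, with hypothetical function and variable names, using the `X-Robots-Tag` response header as the noindex signal since a custom 404 page's meta tag serves the same purpose:

```python
from datetime import date

CUTOFF = date(2012, 6, 1)  # junk/real boundary from the question

def response_for(release_date):
    """Return (status, extra_headers) for an article, mirroring the
    cutoff rule: pre-cutoff junk gets a noindex'd 404, the rest a 200."""
    if release_date < CUTOFF:
        return 404, {"X-Robots-Tag": "noindex"}
    return 200, {}

print(response_for(date(2011, 12, 31)))  # (404, {'X-Robots-Tag': 'noindex'})
print(response_for(date(2013, 1, 1)))    # (200, {})
```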
Link Removal Request Sent to Google, Bad Pages Gone from Index But Still Appear in Webmaster Tools
On June 14th the number of indexed pages for our website in Google Webmaster Tools increased from 676 to 851 pages, and our ranking and traffic have taken a big hit since then. The increase in indexed pages is linked to a design upgrade of our website, made June 6th. No new URLs were added; a few forms were changed, the sidebar and header were redesigned, and Google Tag Manager was added to the site.

My SEO provider, a reputable firm endorsed by Moz, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline. My developer submitted a page removal request to Google via Webmaster Tools around June 20th. Now when a Google search is done for site:www.nyc-officespace-leader.com, 851 results display. Would these extra pages cause a drop in ranking?

After the removal request, the number in the Google search results appeared to drop to 451 for a few days, but now it is back up to 851, and Google Webmaster Tools still lists 851 pages. My ranking drops more and more every day. At the end of the displayed Google search results for site:www.nyc-officespace-leader.com, very strange URLs are displaying, like: www.nyc-officespace-leader.com/wp-content/plugins/...

If we can get rid of these issues, should ranking return to what it was before? I suspect this is an issue with sitemaps and robots.txt. Are there any firms or coders who specialize in this? My developer has really dropped the ball. Thanks everyone!! Alan

Intermediate & Advanced SEO | Kingalan1
-
How to know if your site has been penalized by Google
Hello, one of my clients' rankings dropped dramatically. We believe it was due to an upgrade to his site: while the live site was at www.clientdomain.com, work was being done for a month on the new site at www.clientdomain.com/new. I think Google crawled the /new link and took it as content duplication, since both sites had the same content. Is there a Moz tool, or any other online tool, to see if a site has been penalized? Thanks

Intermediate & Advanced SEO | ogdcorp
-
Spam Links? 115 Domains Sharing the Same IP Address: to Remove or Not Remove Links
Out of 250 domains that link to my site, about 115 are from low-quality directories that are published by the same company and hosted on the same IP address. Examples of these directories are:

- www.keydirectory.net
- www.linkwind.com
- www.sitepassage.com
- www.ubdaily.com
- www.linkyard.org

A recent site audit from a reputable SEO firm identified 125 toxic links; I assume these are those toxic links. They also identified about another 80 suspicious domains linking to my site. The audit concluded that my site is suffering a partial Penguin penalty due to low-quality links.

My question is whether it is safe to remove these 125 links from the low-quality directories. I am concerned that removing this quantity of links all at once will cause a drop in ranking, because the link profile will be thin, with only about 125 domains remaining that point to the site. Granted, those 125 domains should be of somewhat better quality, but I worry I am playing with fire by having these removed. I URGENTLY NEED ADVICE AS THE WEBMASTER HAS INITIATED STEPS TO REMOVE THE 125 LINKS. Thanks everyone!!! Alan

Intermediate & Advanced SEO | Kingalan1
-
What's the best way to manage content that is shared on two sites and keep both sites in search results?
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
Intermediate & Advanced SEO | BostonWright
-
Migrating a site from a standalone site to a subdivision of large .gov.uk site
The scenario: We've been asked by a client, a non-governmental organisation being absorbed by a larger government ministry, for help with the SEO of their site. They will be going from a reasonably large standalone site to a small sub-directory on a high-authority government site, and they want some input on how best to maintain their rankings. They will go from the number 1 ranked site in their niche (current site domainRank 59) to a sub-directory on a domainRank 100 site. The current site will remain, but as a members-only resource behind a paywall. I've been checking the impact this had on a related site, but that one has put a catch-all 302 redirect on its pages, so it is losing the benefit of its historical authority.

My thoughts:

- Robust 301 redirect set-up, to pass as much benefit as possible to the new pages.
- Focus on rewriting content to promote the most effective keywords. I would suggest testing of titles, meta descriptions etc., but I am not sure how often they will be able to edit the new site.
- "We have moved" messaging going out to webmasters of existing linking sites, to encourage as much revision of linking as possible.
- Development of link-bait to try to get the new pages seen.

Am I going about this the right way? Thanks in advance. Phil

Intermediate & Advanced SEO | smrs-digital
-
Is it possible to Spoof Analytics to give false Unique Visitor Data for Site A to Site B
Hi, we are working as a middle man between our client (Website A) and another website (Website B), where Website B is going to host a section around Website A's products etc. The deal is that Website A (our client) will pay Website B based on the number of unique visitors they send them. As the middle man, we are in charge of monitoring the number of unique visitors sent through, and are going to do this by monitoring Website A's Analytics account and checking the number of unique visitors sent. The deal is worth quite a lot of money, and as the middle man we are responsible for making sure that no funny business goes on (i.e. false visitors etc.). So, to make sure we have things covered, what I would like to know is:

1. Is it actually possible to fool Analytics into reporting falsely high unique visitors from Website A to Website B? (And if so, how could they do it?)
2. What could we do to spot any potential abuse? (I.e. is there an easy way to spot that these are spoofed visitors?)

Many thanks in advance

Intermediate & Advanced SEO | James77
-
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404s). Is there anything else I need to do to remove these?

Intermediate & Advanced SEO | nicole.healthline