Site (Subdomain) Removal from Webmaster Tools
-
We have two subdomains that were verified in Google Webmaster Tools. These subdomains were used by third parties with which we no longer have an affiliation (the subdomains no longer serve a purpose).
We have been receiving an error message from Google: "Googlebot can't access your site. Over the last 24 hours, Googlebot encountered 1 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 100.00%". I originally investigated using Webmaster Tools' URL Removal Tool to remove the subdomain, but there are no indexed pages. Is this a case of simply 'deleting' the site from the Manage Site tab in the Webmaster Tools interface?
-
If the site has already been removed by Google then you're fine, but I would still put a robots.txt block on it, since you don't know who turned it off. They probably pointed the DNS away from your subdomain in their domain registrar account, which means they could repoint it at any time, and the site would resolve and get indexed again.
-
Hi Cary
From what you say, it sounds like the subdomains are gone/dead/removed.
Therefore you only need to remove them from WMT and ensure only the correct people have access to your WMT/GA etc., as I outlined above.
All the best
Nigel
-
Hi nigel555,
I was thinking along these lines, but I was concerned it was too simplistic an approach. I was sure I was missing something.
How do you feel about the approach outlined above by Irving?
-
Thanks for the help Irving.
So you think we should block the subdomains with robots.txt files and then use the URL removal tool to remove the "sites" from Google's index.
I have done a site: search looking for indexed URLs from the subdomains, but the searches were empty. Do you think I should still use the URL removal tool?
-
Hi Cary, you are getting that message because the subdomain cannot be reached by Google.
On the Google Webmaster 'Home page', click the dropdown on the right - there should be 3 options.
First click 'Add or remove users' to deal with any users you no longer want.
Then click 'Delete site' to remove the subdomain (make sure you delete the right one!)
Other things you may want to consider:
Check your main site has the right users accessing it.
Did you have Google Analytics on the subdomains? That data will still be there, so you may want to consider who has access to it in future.
Check whether the affiliates had any access to email, documents, regular reports or other tools you may want to lock down.
All the best
Nigel
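Before deleting anything, it can be worth confirming that the subdomains' DNS records really are gone, which is what the 100% DNS error rate in that message suggests. A minimal sketch in Python (the subdomain names below are placeholders; substitute your own):

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if the hostname still has a DNS record, False on a DNS error."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        # This is the same failure Googlebot hits when it reports a DNS error.
        return False

# Hypothetical subdomain names -- replace with the two verified subdomains.
for sub in ("old-partner.example.com", "legacy-app.example.com"):
    status = "still resolves" if resolves(sub) else "does not resolve (matches the GWT DNS error)"
    print(f"{sub}: {status}")
```

If both report DNS errors, the third party has indeed pointed the records away, and deleting the sites from WMT is safe.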
-
It sounds like they are no longer resolving. Since they are subdomains of your site, I would block them with robots.txt (making sure you don't accidentally block your main site) and request removal in WMT to get them both completely deindexed.
I would keep them verified in WMT so that you can see nothing is being done with them by a third party, because you are responsible for your subdomain content since it's technically under your control.
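For anyone following the robots.txt route: the file must be served from the subdomain's own root (e.g. http://old-partner.example.com/robots.txt, a hypothetical name), not from the main domain, which is how you avoid accidentally blocking the main site. A minimal block-everything file looks like:

```
User-agent: *
Disallow: /
```

The main site's robots.txt at its own root is a separate file and is unaffected by this one.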