Time to deindexing: WMT Request vs. Server not found
-
Google indexed some subdomains (13!) that were never supposed to exist but apparently returned a 200 status code when Google somehow crawled them. I can get these subdomains to return a "server not found" error by turning off wildcard subdomains at my DNS provider. I've been told that these subdomains will be deindexed just from this "server not found" error.
I was going to use Webmaster Tools and verify each subdomain, but I'm on an economy GoDaddy server, and apparently subdomains just get forwarded to a directory, so subdomain.domain.com gets redirected to domain.com/subdomain. Given that, I'm not even sure I can get WMT to recognize and remove these subdomains at all.
Should I fret about this, or will the "server not found" message get Google to remove these soon enough?
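Once the wildcard DNS record is off, you can spot-check that each stray subdomain really fails to resolve before waiting on Google. A minimal Python sketch (the hostnames below are placeholders, not the asker's real subdomains; the reserved .invalid TLD is guaranteed never to resolve):

```python
import socket

def resolves(hostname):
    """Return True if the hostname still has a DNS record, False otherwise."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# "no-such-sub.invalid" stands in for one of the 13 stray subdomains.
print(resolves("localhost"))            # a host that does resolve
print(resolves("no-such-sub.invalid"))  # should report "server not found"
```

Any subdomain that still resolves after the DNS change is still serving a 200 to Googlebot and will not drop out on its own.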
-
Unfortunately, Google may keep those pages in its index for months, even if they return a 404. The two best options in these cases are usually:
- Claim the profile in GWT. This would probably be possible, but it requires a lot of work with GoDaddy to configure the subdomains just so you can claim the profile and request de-indexing.
- I haven't tried it, but Google introduced a URL removal tool for URLs you don't control, which might be a good fit here. Here's some info: http://googlewebmastercentral.blogspot.com/2013/12/improving-url-removals-on-third-party.html
-
I've seen this a couple of times.
It does go away eventually.
-
No, they were not duplicates. They all just showed a soft 404 provided by GoDaddy. We had wildcards turned on, but even so I don't understand how Google found these; they were just never used for anything, e.g. vww.example.com.
People have pointed to them as something wonky, so I'm trying to get rid of them in case they are hurting our site's overall performance in the SERPs.
-
Yes, this will eventually stop the pages being indexed. It may take several days in some cases, but they will go.
Were these subdomains duplicates of your main domain? If so, you could try 301 redirecting them, as this could speed the process up.
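If the subdomains had mirrored the main site, the 301 suggestion above amounts to mapping every stray-host URL onto the main host while keeping path and query intact. A rough Python sketch of that mapping (example.com is a placeholder; in practice the redirect would live in the server or DNS-host configuration, not application code):

```python
from urllib.parse import urlsplit, urlunsplit

MAIN_HOST = "example.com"  # placeholder for the real primary domain

def redirect_target(url):
    """Return the main-domain URL a stray-subdomain request should 301 to,
    or None if the URL is already on the main host."""
    parts = urlsplit(url)
    if parts.netloc.lower() in (MAIN_HOST, "www." + MAIN_HOST):
        return None
    # Keep path and query so each duplicate page redirects to its twin.
    return urlunsplit((parts.scheme, MAIN_HOST, parts.path, parts.query, ""))

print(redirect_target("http://vww.example.com/page?x=1"))
# -> http://example.com/page?x=1
```

Redirecting page-to-page (rather than everything to the home page) is what lets the duplicates pass their signals on and drop out faster.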
Related Questions
-
Robots.txt on http vs. https
We recently changed our domain from http to https. When a user enters any URL on http, there is a global 301 redirect to the same page on https. I cannot find instructions about what to do with robots.txt. Now that https is the canonical version, should I block the http version with robots.txt? Strangely, I cannot find a single resource about this...
Technical SEO | zeepartner
-
Log files vs. GWT: major discrepancy in number of pages crawled
Following up on this post, I did a pretty deep dive on our log files using Web Log Explorer. Several things have come to light, but one of the issues I've spotted is the vast difference between the number of pages crawled by Googlebot according to our log files versus the number of pages indexed in GWT. Consider:
- Number of pages crawled per log files: 2,993
- Crawl frequency (i.e. number of times those pages were crawled): 61,438
- Number of pages indexed by GWT: 17,182,818 (yes, that's right: more than 17 million pages)
We have a bunch of XML sitemaps (around 350) that are linked on the main sitemap.xml page; these pages have been crawled fairly frequently, and I think this is where a lot of links have been indexed. Even so, would that explain why we have relatively few pages crawled according to the logs but so many more indexed by Google?
Technical SEO | ufmedia
-
Hi can anyone let me know which is the better server
Hi, I am trying to find out which is the better dedicated server and would like your opinion. The first one is a Dell PowerEdge:
- Intel Xeon E3-1220L, 2.2GHz Dual-Core
- 4GB DDR3 RAM
- 2 x 500GB SATA HDD
- Linux/Windows
- 10000GB Monthly Transfer
- Up to 2 IP Addresses
- LSI RAID Card
And the second one is:
- Intel Atom 330, 1MB L2 Cache, 1.6GHz
- 500GB Storage
- 4GB RAM
- 10TB Bandwidth
If you can, please let me know the difference and which one is better for speed and memory for a large site. Many thanks.
Technical SEO | ClaireH-184886
-
Home page deindexed by Google
When I search for my website on Google with site:www.mydomain.com, I find that my domain with www has been de-indexed. But when I search site:mydomain.com, my home page (mydomain.com) shows up in the search results without www. Put simply, Google only indexes my domain without www. I wonder how to get my domain with www indexed, and how to prevent this problem from occurring again.
Technical SEO | semer
-
Hosting sitemap on another server
I was looking into XML sitemap generators, and one that seems to be recommended quite a bit on the forums is xml-sitemaps.com. They have a few versions, though. I'll need more than 500 pages indexed, so it is just a case of whether I go for their paid version and install it on our server, or go for their pro-sitemaps.com offering. For pro-sitemaps.com they say: "We host your sitemap files on our server and ping search engines automatically." My question is: will this be less effective from an SEO perspective than installing it on our server, because the sitemap is no longer on our root domain?
Technical SEO | design_man
-
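For context, the sitemaps.org protocol does allow a sitemap to live on a different host, provided the site whose URLs it lists points to it from its own robots.txt. A hypothetical sketch (both hostnames below are invented for illustration):

```text
# robots.txt served from www.example.com
User-agent: *
Disallow:

# Declaring the remote location here authorizes the cross-host sitemap.
Sitemap: http://hosting.example.net/example-com/sitemap.xml
```

So a remotely hosted sitemap is not inherently less effective, as long as this robots.txt reference (or an equivalent verified submission in Webmaster Tools) is in place.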
Are these 'not found' errors a concern?
Our webmaster report is showing thousands of 'not found' errors for links that show up in JavaScript code. Is this something we should be concerned about, especially since there are so many?
Technical SEO | nicole.healthline
-
Every time Google caches our site it shows no website.
Our site www.skaino.co.uk seems to be having real issues with being picked up by Google. The site has been around for a long time but no longer even ranks on Google if you search for the word 'Skaino'. This is odd, as it's hardly a competitive keyword. If I do a site:www.skaino.co.uk search, it shows all the pages, proving the site has been indexed. But if I do cache:www.skaino.co.uk, it shows a blank cache. I'm starting to worry that Google isn't able to crawl our site properly. To clarify, we have a Flash site with an HTML site running underneath for those who can't view Flash. I'm wondering if I've missed something glaringly obvious. Is it normal to have a blank Google cache? Thanks, AJ
Technical SEO | handygammon
-
301 Redirect vs Domain Alias
We have hundreds of domains which are either alternate spellings of our primary domain or close keyword names we didn't want our competitors to get before us. The primary domain is running on a dedicated Windows server running IIS6 and set to a static IP. Since it is a static IP and not using host headers, any domain pointed to the static IP will immediately show the contents of the site; however, the domain in the address bar will be whatever was typed, which could be the primary domain or an alias. Two concerns. First, is it possible that Google would penalize us for the alias domains or dilute our primary domain "juice"? Second, we need to properly track traffic from the alias domains. We could make unique content for those performing well and sell or let expire those that are sending no traffic. It's not my goal to use the alias domains to artificially pump up our primary domain. We have them for spelling errors and direct traffic. What is the best practice for handling one or both of these issues?
Technical SEO | briankb
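The 301 option can be sketched as a simple host check: every request whose Host header is an alias gets one permanent redirect to the primary domain, so only one hostname accumulates signals. A hypothetical Python illustration (on IIS6 this would be done with a rewrite filter or a second site bound to the alias host headers, not application code; all domain names below are invented):

```python
CANONICAL_HOST = "example.com"  # placeholder primary domain
ALIAS_HOSTS = {"exampel.com", "examplle.com", "www.example.com"}  # invented aliases

def handle_request(host, path="/"):
    """Return (status, location): 301 aliases to the canonical host,
    serve the canonical host directly, reject unknown hosts."""
    host = host.lower()
    if host == CANONICAL_HOST:
        return 200, None
    if host in ALIAS_HOSTS:
        # A permanent redirect consolidates ranking signals and keeps the
        # alias visible in server logs, so its traffic can still be tracked.
        return 301, "http://" + CANONICAL_HOST + path
    return 404, None

print(handle_request("exampel.com", "/products"))
# -> (301, 'http://example.com/products')
```

Compared with serving every alias directly (which risks hundreds of indexable duplicates of the site), this keeps the aliases useful for typo and direct traffic without creating duplicate content.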