Google couldn't access your site because of a DNS error
-
Hello,
We've been doing SEO work for a company for about 8 months and it's been working really well; we've had lots of top-three and first-page rankings.
Or rather we did.
Unfortunately the web host the client uses (and whom we recommended against) has had severe DNS problems. For the last three weeks Google has been unable to access and index the website. I was hoping this would be quickly resolved and everything would return to normal. However, this week their listings have dropped entirely: 25 page-one rankings have become none, and Google Webmaster Tools says 'Google couldn't access your site because of a DNS error'. Even searching for their own domain no longer works!
Does anyone know how this will affect the site in the long term? Once the hosts sort it out, will the rankings bounce back? Is there any sort of strategy for handling this problem? Ideally we'd move hosts, but I'm not sure that's possible, so any other options, or advice on the long-term ranking impact that I can report to my client, would be appreciated.
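As a first diagnostic, it's worth confirming whether the domain resolves at all from a machine outside the host's network. A minimal sketch in Python (the commented domain is a stand-in for the client's real one):

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if the hostname resolves to at least one IP address."""
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror:
        return False

# Example (hypothetical client domain):
# resolves("client-domain.com")
```

If this fails from several networks but the site loads for regular visitors, the problem is more likely selective blocking (firewall, rate limiting) than DNS itself.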
Many thanks
Ric
-
Hi BWRic,
Sorry for getting back to you so late. The problem seemed to be resolved, but the website is having trouble again. Anyway, thanks a lot for your help and advice.
Best regards,
Daniel.
-
Hi Daniel,
I'm afraid I don't know the specifics, as the hosting company were very secretive and awkward. What I do know is that their firewall was incorrectly flagging Google as trying to perform a DDoS attack on the server; by this I presume they meant Google's spiders were being blocked. I don't know any more details than that, but I hope it gives you a starting point to work from.
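For anyone debugging a similar false positive: Google's documented way to verify that a crawler IP really is Googlebot is a reverse DNS lookup followed by a matching forward lookup. A rough sketch (the sample hostname in the test is illustrative, not a live crawler):

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_google_host(hostname: str) -> bool:
    """True if a reverse-DNS hostname ends in a Google-owned crawl domain."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def is_real_googlebot(ip: str) -> bool:
    """Reverse DNS must name a Google domain, and forward DNS of that
    name must map back to the same IP (guards against spoofed names)."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
        if not looks_like_google_host(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        return False
```

A firewall whitelist built on a check like this avoids blocking real crawlers while still dropping impostors that merely set a Googlebot user-agent.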
Regards
Ric
-
Hi BWRic,
Could you please tell me how you resolved the issue? I'm having this very same problem with a website I've been working on, and I would really appreciate your advice.
Thanks in advance
Daniel.
-
A little update for everyone. The problem has now been resolved for nearly two weeks (it seems the firewall thought Google was a DDoS attack!), so I've been able to monitor the early response. It looks like the site is bouncing back well to where it previously was.
-
Adam is right: downtime like this is unacceptable, and it should be the card you play to convince the client to change hosts. You have the data you need (25 page-one rankings dropped) to support your argument. The costs involved with moving to a new host will be worth it; oftentimes you can even see hosting cost savings by switching to a better host.
If you can't move, yes, your rankings should come back after the site is re-indexed. The only strategy I'm aware of for handling this issue is to use a host with more redundancy built in. It doesn't sound like the local provider is able to provide this in-house, and in the future they may themselves need to use an off-site location for hosting their servers.
-
The website seems accessible to everyone but Google; if it were fully down we'd get them to move ASAP. We're definitely going to try and convince our client to move again now we've got some ammo!
The site is actually hosted with our local telecommunications provider, and it wasn't just the web hosting that was affected; nearly everyone on the island where I live had very unreliable internet connectivity for a few weeks. They're just lucky they have little to no competition.
-
3 weeks of downtime or DNS issues is an incredibly long time and absolutely unacceptable for any web host. I would say definitely move hosts. No matter what it takes, make it happen.
I would expect there will be little or no long term effect on the site's rankings, but I'm not 100% sure (I've never worked with a site that had severe uptime issues for more than a couple days).
Related Questions
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi. I've just done my usual Monday morning review of clients' Google Search Console (previously Webmaster Tools) dashboards and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, where they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th. Then it lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site is HTTPS and has been for a few months now, and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub-sitemaps are: http://www.domain.com/marketing-sitemap.xml, http://www.domain.com/page-sitemap.xml, http://www.domain.com/post-sitemap.xml. There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below: "When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Technical SEO | Dan-Lawrence
Also, for the below sitemap URLs: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for: http://domain.com/en/post-sitemap.xml AND https://www.domain.com/page-sitemap.xml AND https://www.domain.com/post-sitemap.xml. I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL I can't see it making any difference until the HTTP aspects are deleted/removed; but how do you do that, or even check that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so: fully indexed pages reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
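One quick way to see which scheme the sub-sitemaps actually use is to parse the sitemap index and flag any non-HTTPS `<loc>` entries. A small sketch; the inline sample mirrors the kind of mixed http/https entries described above:

```python
import xml.etree.ElementTree as ET

# Sitemap protocol namespace (sitemaps.org)
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def non_https_locs(sitemap_index_xml: str):
    """Return every <loc> URL in a sitemap index that is not HTTPS."""
    root = ET.fromstring(sitemap_index_xml)
    return [loc.text for loc in root.iter(NS + "loc")
            if not loc.text.startswith("https://")]

# Inline sample standing in for the real sitemap index:
sample = """<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.domain.com/page-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://www.domain.com/post-sitemap.xml</loc></sitemap>
</sitemapindex>"""

print(non_https_locs(sample))  # flags only the http:// entry
```

In a real audit you would fetch the live sitemap index over HTTPS first and feed its body to the same function.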
Changes to 'links to your site' in WebMaster Tools?
We're writing more out of curiosity... Clicking on "Download latest links" within 'Links to your site' in Google's WebMaster Tools would usually bring back links discovered recently. However, the last few times (for numerous accounts) it has brought back a lot of legacy links - some from 2011 - and includes nothing recent. We would usually expect to see a dozen at least each month. ...Has anyone else noticed this? Or, do you have any advice? Thanks in advance, Ant!
Technical SEO | AbsoluteDesign
Is this okay with google if i can access my sub categories from two different path?
My website URL is abcd.com. One of my category URLs is abcd.com/mobile.aspx, which contains 5 sub-categories: 1) Samsung Mobile 2) Nokia Mobile 3) Sony Mobile 4) HTC Mobile 5) Blackberry Mobile. Now if I go into the HTC Mobile sub-category, i.e. abcd.com/htcmobile.aspx, I will see all the products related to HTC Mobile. But below all the products I will find the other sub-categories, that is Samsung Mobile, Nokia Mobile, Sony Mobile and Blackberry Mobile. So I want to ask: is this okay? Will Google count these categories as duplicates, given that I can access all 4 categories (i.e. Samsung, Nokia, Sony and Blackberry) from both 1) abcd.com/mobile.aspx and 2) abcd.com/htcmobile.aspx? Thanks! Dev
Technical SEO | devdan
Google indexing staging / development site that is redirected...
Hi Moz Fans! Please help. We had an acme.stagingdomain.com while a site was in development; when it went live it redirected (302) to acmeprofessionalservices.com (real names redacted!!). There were no known external links to the staging site, although the staging site URL has been emailed from Google Apps(!!!). We've now found that the staging site is in the index even though it redirects to the proper public site, and some (but not all) of its pages are in the index too. They all redirect to the proper public site when visited. It is convenient to have a redirect from the staging site to the new one for the team, since Chrome etc. remember frequently visited sites, so it would be a shame to lose that. Yes, these pages can be removed using Webmaster Tools.
Technical SEO | mozroadjan
But how did they get in the index to start with? And if we're building a new site and a customer has an existing site, is there a danger of duplicate content penalties caused by the staging site? We had a similar incident recently when a PDF that was not linked anywhere on the site appeared in the index; the link had been emailed through Google Apps and visited in Chrome, but that was it. So, 3 questions: Why is the staging site still in the index despite the redirects? How did it get in the index in the first place? Will the new staging site affect the rank of the existing site, e.g. through duplicate content penalties?
Why can't I redirect 302 errors to 301's?
I've been advised by IT that due to the structure of our website (they don't use sub-folders) it's not possible to change 302s to 301s. Is this correct, or am I being fobbed off?
Technical SEO | lindsaytuerena
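For what it's worth, a redirect's status code is set per response and doesn't depend on the site's folder structure at all. A minimal, self-contained sketch using Python's standard library (the target domain is a placeholder, and a real site would do this in its web server config rather than a standalone script):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class PermanentRedirect(BaseHTTPRequestHandler):
    """Answer every GET with a 301 to the canonical HTTPS host.
    Switching 302 -> 301 is one status-code change, independent of
    whether the site uses sub-folders."""
    def do_GET(self):
        self.send_response(301)  # 302 here would make it temporary
        self.send_header("Location", "https://www.example.com" + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence request logging for the demo

# To serve: HTTPServer(("0.0.0.0", 80), PermanentRedirect).serve_forever()
```

The equivalent change in Apache or IIS is likewise a one-word edit to the redirect rule (e.g. marking it permanent), so "no sub-folders" is not a real blocker.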
When do you use 'Fetch as Google' in Google Webmaster Tools?
Hi, I was wondering when and how often you use 'Fetch as Google' in Google Webmaster Tools, and whether you submit individual pages or the main URL only? I've googled it but only got more confused. I'd appreciate any help. Thanks
Technical SEO | Rubix
Google using descriptions from other websites instead of site's own meta description
In the last month or so, Google has started displaying a description under links to my home page in its search results that doesn't actually come from my site. I have a meta description tag in place and for a very limited set of keywords, that description is displayed, but for the majority of results, it's displaying a description that appears on Alexa.com and a handful of other sites that seem to have copied Alexa's listing, e.g. similarsites.com. The problem is, the description from these other sites isn't particularly descriptive and mentions a service that we no longer provide. So my questions are: Why is Google doing this? Surely that's broken behaviour. How do I fix it?
Technical SEO | antdesign
SEOMoz is indicating I have 40 pages with duplicate content, yet it doesn't list the URL's of the pages???
When I look at the Errors and Warnings on my Campaign Overview, I have a lot of "duplicate content" errors. When I view the errors/warnings SEOMoz indicates the number of pages with duplicate content, yet when I go to view them the subsequent page says no pages were found... Any ideas are greatly welcomed! Thanks Marty K.
Technical SEO | MartinKlausmeier