Has anyone else gotten strange WMT errors recently?
-
Yesterday, one of my sites got this message from WMT:
"Over the last 24 hours, Googlebot encountered 1 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 100.0%."
I did a fetch as Googlebot and everything seems fine. Also, the site is not seeing a decrease in traffic.
This morning, a client for which I am doing some unnatural links work emailed me about a site of his that got this message:
"Over the last 24 hours, Googlebot encountered 1130 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%."
His robots.txt looks fine to me.
Is anyone else getting messages like this? Could it be a WMT bug?
-
DNS servers are just like any other server, Marie - they can have outages, downtime, and configuration problems. If Googlebot visited while your DNS server was burping, it might have received no response, hence the error warning. By the time you checked, the server may have settled down.
There are a number of best practices for good DNS hygiene, but my primary one is to monitor the uptime of your DNS the same way you do the uptime of your website. I use my paid subscription to Pingdom Tools to do this as one of my checks, but I'm sure many other uptime monitoring tools can do it as well.
The reason I monitor is that it can be a really helpful early warning system for potential upcoming severe problems (and can help explain otherwise unexplained site outages). With one client, we saw a steadily increasing number of errors over a few days (over 40 outages on the last day), leading us to change DNS hosting before things could fail completely and leave us in the lurch.
In addition, I always recommend against hosting the DNS on the same server as the website, as happens with cPanel DNS hosting, for example. The reason: if you have severe, prolonged server issues, you can't get at your DNS to point it somewhere else temporarily (even if just to host an explanatory error message).
I also like to ensure the DNS is hosted somewhere with good geographic redundancy, so even if one nameserver goes out, there are still multiple backups to keep things rolling. No matter how good your website's uptime is, if your DNS dies, you're still offline.
My guess is the DNS server was having temporary issues that resolved by the time you checked it. I'd want to be sure that isn't happening on a regular basis (relying on Google to report issues isn't nearly accurate or timely enough).
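The "monitor your DNS the same way you monitor your website" advice above can be sketched as a small script. This is a minimal sketch, not any particular monitoring product: the resolver function is injectable so the logic can be exercised without a live network, and the error-rate arithmetic mirrors how WMT reports a percentage of failed attempts.

```python
import socket

def check_dns(hostname, resolve=socket.gethostbyname):
    """One DNS resolution attempt. Returns (ok, detail).

    `resolve` defaults to the system resolver but can be swapped
    out (e.g. for testing, or to query a specific nameserver via
    a third-party library).
    """
    try:
        ip = resolve(hostname)
        return True, ip
    except socket.gaierror as exc:
        return False, str(exc)

def error_rate(results):
    """WMT-style error rate: failed attempts / total attempts, as a percent."""
    if not results:
        return 0.0
    failures = sum(1 for ok, _ in results if not ok)
    return 100.0 * failures / len(results)
```

Run `check_dns()` on a schedule (every minute or so), keep the recent results, and alert when `error_rate()` starts climbing - a steadily rising rate is exactly the early-warning signal described above.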
As for the robots.txt - do you have uptime monitoring on that site? I can't count the number of new clients who thought things were fine with their website when in fact it was having constant short outages that went unnoticed, because they weren't on their own site often enough to catch them. I always recommend a system that checks at 1-minute intervals for just this reason. If you don't have independent verification that the site was fully up, you can't safely discount the WMT warnings.
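A minimal sketch of the kind of independent robots.txt check being suggested here, using only the Python standard library. The fetcher is injectable for testing, and `example.com` is a placeholder for your own domain:

```python
from urllib.request import urlopen
from urllib.error import URLError, HTTPError

def robots_status(base_url, fetch=urlopen):
    """Return the HTTP status of /robots.txt, or None on a network failure.

    Googlebot treats an unreachable robots.txt as a reason to postpone
    crawling, so anything other than 200 (or a deliberate 404) is worth
    alerting on.
    """
    url = base_url.rstrip("/") + "/robots.txt"
    try:
        with fetch(url, timeout=10) as resp:
            return resp.status
    except HTTPError as exc:
        return exc.code  # server responded, but with an error status
    except URLError:
        return None  # DNS failure, timeout, connection refused, etc.
```

Polling this every minute and logging the result gives you the independent record you need to either confirm or rule out the outages behind a WMT robots.txt warning.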
Lemme know if you want more info on uptime monitoring services & methods.
Paul
-
Yikes. That would not be good!
-
Wait until they tell you they're taking your AdSense account down in 72 hours... and you know they have an algo problem... and when you point that out, a noob employee who doesn't know the rules you display ads under tells you you're down to 48 hours.
-
Thx. All checks out well on the DNS check. I'm calling this a bug.
-
I know! I've had much higher traffic spikes than anything I've seen recently, and still they sent this message. It's bizarre! But definitely not one that worries me. I'm like you - I don't get excited when I see a message in that inbox...
Anyway just thought it fit because it was so strange and seemingly unnecessary.
-
If you checked and all is fine, it may be a temporary bug; that happens from time to time.
The 100% rate could be because there was only one crawl, hence a 100% error rate (1 of 1) - just wait for the next crawl. Anyway, in the meantime, check your domain with a DNS checker - you can use www.dnsstuff.com, www.intodns.com, or dnscheck.pingdom.com - to make sure everything is working correctly, or to see if there's anything you need to take to your hosting provider.
-
Haha Jesse! I'd rather have that message. That is a weird one. I have had things go super viral and I've never had a message telling me of an INCREASE in traffic. Some of them have been increased Google searches too...not just direct or Facebook visits.
I have had a message that there was a decrease in traffic for my top URL once. This was when one of my sites had a slight Panda hit.
I think the messages are very random.
Whenever I see a (1) next to messages in WMT my heart races a little. It's usually a good thing because I am waiting to hear back from a reconsideration request for a client though.
-
Well I got this message recently:
Search results clicks for http://www.---------- have increased significantly.
"This message is not indicative of any problem in your site. It is simply to inform you that the number of clicks that one of your pages receives has increased recently. If you have just added new content, this may indicate that it has become more popular on Google. The number of clicks that your site receives from Google can change from day to day for a variety of factors, including automatic algorithm updates."
I found it strange because everything looks about normal. Sure, we had a bit better of a day than usual, but just barely. Nothing I'd even blink twice at.
It's strange because this is only the second time I've ever received a message in GWT. But hey, I'm not complaining about this one.
Probably unrelated to what you're describing but just thought I'd share.
Good luck!