Robots.txt error message in Google Webmaster Tools dated later than the page's last cache date: how is that possible?
-
I have error messages in Google Webmaster Tools stating that Googlebot encountered errors while attempting to access my robots.txt file. The last date this was reported was December 25, 2012 (Merry Christmas), but the last cache date was November 16, 2012 (http://webcache.googleusercontent.com/search?q=cache%3Awww.etundra.com/robots.txt&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a).
How could I get this error if the page hasn't been cached since November 16, 2012?
-
That's our next move. I'll let you all know what comes of it. Thanks for the response!
-
I've noticed several discrepancies in Google's cache system. Many of its documents seem to lag and are not updated immediately; it could just be errors in cross-data population. If you really want to know the last time Googlebot visited your website, you will want to check your server logs.
If your server logs don't show a visit from Google on the 25th, then we really do have to wonder. My guess is that Webmaster Tools reflects the correct date. Either way, I'd check for the error they are reporting; a quick way to scan the logs is sketched below.
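As an illustration, here is a minimal sketch of that log check in Python, assuming a standard combined-format access log; the log path and the date string are placeholders for your own setup:

```python
# Minimal sketch: scan a combined-format access log for Googlebot
# requests to robots.txt on a given day. The log path and date are
# assumptions -- adjust both to match your server.
TARGET_DATE = "25/Dec/2012"  # the day Webmaster Tools reported the error
LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path

with open(LOG_PATH) as log:
    for line in log:
        if TARGET_DATE in line and "robots.txt" in line and "Googlebot" in line:
            print(line.strip())  # shows IP, timestamp, status code, user agent
```

Any matching lines will include the HTTP status code Googlebot received; a 5xx response, or no hits at all around that date, tells you which report to trust. Keep in mind the user-agent string can be spoofed, so a reverse DNS lookup is the way to confirm a hit really came from Google.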
Related Questions
-
Landing pages, are my pages competing?
If I have identified a keyword that generates income, and my homepage ranks second when it is searched in Google, should I still create a landing page based on that keyword, or will it compete with my homepage and cause it to rank lower?
Intermediate & Advanced SEO | The_Great_Projects
-
Why Is Google Webmaster Tools Pulling Zero Keyword Data?
I just linked a Google Webmaster Tools account to Google Analytics for a client, and Search Engine Optimization reports are showing up in Google Analytics as enabled, but there is zero keyword data, landing page data, etc., in the reports themselves. Has anyone encountered this?
Intermediate & Advanced SEO | yoursearchteam
-
Help with Robots.txt on a Shared Root
Hi, I posted a similar question last week asking about subdomains, but a couple of complications have arisen. Two different websites I am looking after share the same root domain, which means they will have to share the same robots.txt. Does anybody have suggestions for separating the two in the same file without complications? It's a tricky one. Thank you in advance. (One path-based approach is sketched below.)
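By way of illustration: since both sites answer on the same host, the only lever available in a shared robots.txt is the URL path prefix. The directory names below are hypothetical stand-ins for wherever each site actually lives:

```
# One robots.txt at the shared root -- directory names are placeholders.
User-agent: *
# Rules intended for the first site, assumed to live under /site-one/
Disallow: /site-one/private/
# Rules intended for the second site, assumed to live under /site-two/
Disallow: /site-two/checkout/

# Each site can still declare its own sitemap
Sitemap: http://www.example.com/site-one/sitemap.xml
Sitemap: http://www.example.com/site-two/sitemap.xml
```

Crawlers won't distinguish the two sites any other way, so anything that can't be expressed as a path prefix would have to be handled with per-page meta robots tags instead.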
Intermediate & Advanced SEO | Whittie
-
Link Removal Request Sent to Google, Bad Pages Gone from Index But Still Appear in Webmaster Tools
On June 14th the number of indexed pages for our website in Google Webmaster Tools increased from 676 to 851. Our ranking and traffic have taken a big hit since then. The increase in indexed pages is linked to a design upgrade of our website made on June 6th. No new URLs were added; a few forms were changed, the sidebar and header were redesigned, and Google Tag Manager was added to the site. My SEO provider, a reputable firm endorsed by Moz, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline.
My developer submitted a page removal request to Google via Webmaster Tools around June 20th. A Google search for site:www.nyc-officespace-leader.com now displays 851 results. Would these extra pages cause a drop in ranking? After the removal request, the number in the Google search results appeared to drop to 451 for a few days, but it is now back up to 851, and Google Webmaster Tools still lists 851 pages. My rankings drop more and more every day. At the end of the displayed Google search results for site:www.nyc-officespace-leader.com, very strange URLs appear, like www.nyc-officespace-leader.com/wp-content/plugins/...
If we can get rid of these issues, should ranking return to what it was before? I suspect this is an issue with sitemaps and robots.txt. Are there any firms or coders who specialize in this? My developer has really dropped the ball. Thanks everyone!! Alan
Intermediate & Advanced SEO | Kingalan1
-
How concerning is a message from Google about an increase in server errors?
In the past few weeks I have started getting messages from Google Webmaster Tools about an increase in server errors. According to our R&D team, these messages come at times when our site has been down, so Google is not an accurate measure of site health. 1 - Are they correct, and is there a better tool to be using? 2 - Could we be harmed by Google occasionally running into this problem, even if it is fixed within a few hours? Thanks!
Intermediate & Advanced SEO | theLotter
-
301 redirect changed Google's cached title tags?
Hi, this is a new one to me! I recently added some 301 redirects from pages that I've removed from my site. Most of them just redirect to my home page, while a few redirect to appropriate replacement pages. The odd thing is that when I now search my keywords, Google's SERP shows my website with a title that was on some of the old (now removed and redirected) pages. Is this normal? If so, how should I prevent it from happening? What is going on? The only reasons I set up the redirects were to collect any link juice from the old pages and prevent 404s. Should I remove the 301s? I fetched as Google and submitted, to see if that updates the tags (it hasn't been reindexed yet). Any help would be appreciated. (A sketch of the redirect setup is below.) Kind regards, Tony
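For reference, a minimal sketch of those redirects in Apache's .htaccess, with hypothetical paths since the question doesn't list the removed URLs. Pointing each old page at its closest replacement rather than the home page gives Google a less ambiguous signal about which titles belong to which URLs:

```
# .htaccess sketch -- all paths are placeholders.
# Send a removed page to its closest replacement where one exists...
Redirect 301 /old-widgets-guide /widgets
# ...and only fall back to the home page when nothing comparable remains.
Redirect 301 /retired-promo /
```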
Intermediate & Advanced SEO | thephoenix25
-
Adding Orphaned Pages to the Google Index
Hey folks, how do you think Google will treat adding 300k orphaned pages to a 4.5-million-page site? The URLs would resolve, but there would be no on-site navigation to those pages; Google would only know about them through sitemap.xml files. These pages are super low competition. The plot thickens: what we are really after is to get 150k real pages back on the site. Those pages do have crawlable paths on the site, but in order to restore them (for technical reasons) we need to push the other 300k orphaned pages live; it's an all-or-nothing deal.
a) Do you think Google will have a problem with this, or will it just decide not to index some or most of these pages since they are orphaned?
b) If these pages will just fall out of the index or never get included, and have no chance of ever accumulating PageRank anyway since they are not linked to, would it make sense to just noindex them? (One sketch of that is below.)
c) Should we not submit sitemap.xml files at all, take our 150k, and just ignore the 300k in the hope that Google ignores them as well?
d) If Google is OK with this, maybe we should submit the sitemap.xml files and keep an eye on the pages; maybe they will rank and bring us a bit of traffic, but we don't want to do that if it could be an issue with Google.
Thanks for your opinions, and if you have any hard evidence either way, especially thanks for that info. 😉
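If option (b) is chosen, here is a hedged sketch of what the noindex could look like, assuming the 300k pages share a template: a robots meta tag in that template keeps them crawlable without letting them into the index.

```html
<!-- In the shared template of the orphaned pages (assumption: they share
     one). "noindex" keeps them out of the index; "follow" still lets any
     links on them be crawled. -->
<meta name="robots" content="noindex, follow">
```

The same directive can be sent as an X-Robots-Tag HTTP header if editing the template isn't practical.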
Intermediate & Advanced SEO | irvingw
-
How to Determine URL Parameters in Google Webmaster Tools
Hi there! I have a new website with many duplicate meta titles and descriptions because of the expanded features of the e-commerce shopping cart I am using, like a mobile website, product sorting, etc. Aside from canonical tags, is it advisable to use the URL Parameters feature in Google Webmaster Tools to disallow crawling of the mobile website and of other parameters like "parent", "catalogsetview", "pcsid", "pg", and "mode"? I'd appreciate any advice. (A canonical sketch is below.) 🙂 Thanks!
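As an illustration of the canonical side of that setup, a sketch using a few of the parameters named in the question; the domain and path are placeholders:

```html
<!-- On a parameterized view such as
     http://www.example.com/widgets?pg=2&mode=list&catalogsetview=grid,
     point engines at the clean URL. Domain and path are hypothetical. -->
<link rel="canonical" href="http://www.example.com/widgets" />
```

The URL Parameters tool then shapes what Googlebot crawls, while the canonical consolidates whatever it still fetches, so the two work together rather than as substitutes.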
Intermediate & Advanced SEO | paumer80