Google can't read my robots.txt for the past 10 days
-
http://awesomescreenshot.com/08d1s6aybc
Hi, my robots.txt is at:
http://wallpaperzoo.com/robots.txt
Google says it can't read the file and has postponed the crawl. It's been 10 days and there's been no crawl. Please help me solve this issue.
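One quick self-check is to fetch the file the same way a crawler would and confirm it comes back with HTTP 200 and parses as valid rules; a 5xx error or a timeout is exactly the kind of response that makes Google postpone crawling. A minimal Python sketch using only the standard library (the URL is the one from the question; the 10-second timeout is an arbitrary choice):

```python
import urllib.error
import urllib.request
import urllib.robotparser

URL = "http://wallpaperzoo.com/robots.txt"

try:
    # Fetch the file directly to see the HTTP status a crawler would get.
    with urllib.request.urlopen(URL, timeout=10) as resp:
        print("HTTP status:", resp.status)  # 200 is what Googlebot needs
except urllib.error.HTTPError as e:
    print("HTTP error:", e.code)  # a 5xx here would explain the postponed crawl
except urllib.error.URLError as e:
    print("Could not connect:", e.reason)  # DNS or timeout problems look like this
else:
    # The fetch worked, so confirm the rules parse and allow Googlebot in.
    rp = urllib.robotparser.RobotFileParser(URL)
    rp.read()
    print("Googlebot may fetch /:", rp.can_fetch("Googlebot", "http://wallpaperzoo.com/"))
```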
-
Hi sir, Google started fetching again.
Thank you for your help.
-
What you can do from within Google Webmaster Tools is ask Google to fetch the URL of the sitemap and then 'submit it to the index'. That's one technique I've seen work before.
Google doesn't seem to re-crawl robots.txt files very often. I've seen it take days for the file to be crawled again after I've changed something.
-
Hi sir, there was a problem with my robots.txt on September 23rd, which I fixed on the 25th. Googlebot hasn't come back since. How long will it take to return?
-
There's nothing wrong with your robots.txt file as far as I can see. It is likely that Google tried accessing the robots.txt file when your server was slow/down for a split second and that's why you are seeing the error. Are you sure that the site hasn't been crawled in 10 days? Have you checked your server log files to verify that? Regardless, once Googlebot does eventually come back, it should read the robots.txt file correctly and then you should be good to go.
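Checking the logs for Googlebot's robots.txt fetches is scriptable. A hedged sketch, assuming an Apache/Nginx combined-format access log; the log path is a hypothetical placeholder, so adjust it for your server:

```python
import re

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path; change for your server

# Combined log format:
# IP - - [date] "METHOD path HTTP/x.x" status bytes "referrer" "user-agent"
LINE_RE = re.compile(r'\[([^\]]+)\] "(?:GET|HEAD) /robots\.txt[^"]*" (\d{3}) .*"([^"]*)"$')

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        # Cheap pre-filter before running the regex.
        if "robots.txt" not in line or "Googlebot" not in line:
            continue
        m = LINE_RE.search(line)
        if m:
            date, status, user_agent = m.groups()
            print(f"{date}  {status}  {user_agent}")
```

A run of 200s means Googlebot is reading the file fine; 5xx entries clustered around the dates in question would explain why the crawl was postponed. Note that anyone can spoof the Googlebot user-agent string, so a strict check would also reverse-DNS the IPs against googlebot.com, but for this purpose the user-agent filter is enough.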