How to fix an issue with robots.txt?
-
I am receiving the following error message through Webmaster Tools:
http://www.sourcemarketingdirect.com/: Googlebot can't access your site
Oct 26, 2012
Over the last 24 hours, Googlebot encountered 35 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.

The site has dropped out of Google search.
-
Hi Stacey
What plugins do you have running - any caching plugins, such as the W3 Total Cache plugin?
Are you able to access your server's error logs to see if anything shows up there?
-
Thanks for your answer.
I have received this message from Google:
**http://www.sourcemarketingdirect.com/** (verified using the meta tag method, less than a minute ago): Your site's home page returns a status of 500 (Internal Server Error) instead of 200 (OK).
It looks like the permalink structure has changed, but I'm not sure how.
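If the permalink rewrite rules are the suspect, one thing worth checking is the site's .htaccess file - a corrupted or missing rewrite block is a common cause of 500 errors on WordPress sites. For reference, the standard WordPress block looks like the sketch below; your file may legitimately differ.

```
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
# Pass requests for non-existent files/directories through to WordPress
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
```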
-
I've seen several people ask this very same question over the last week in different forums. I am wondering if the major outages from Hurricane Sandy have affected several hosts or DNS providers.
Your robots.txt looks fine to me.
I'm guessing that you will completely recover once Google has a chance to fully crawl the site again.
-
Just a quick check: have you got WordPress set to be visible to search engines in the admin area? If not, it will be set to disallow Googlebot from crawling the site.
It's under Admin - Options - Privacy; select the appropriate box. The default is noindex, nofollow.
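If that option is set to block search engines, WordPress typically does two things (shown below as a reference sketch - the exact output varies by WordPress version): it adds a robots meta tag to every page, and its generated robots.txt blocks all crawling.

```
<!-- robots meta tag WordPress adds to the <head> when blocked from search engines -->
<meta name='robots' content='noindex,nofollow' />
```

```
# WordPress's generated robots.txt in the same state
User-agent: *
Disallow: /
```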
-
Thanks Matt.
There is no robots.txt as far as I can see. Is there a plugin I can use for WordPress?
The site was down for 2 days last month while the original host transferred the site over to me.
Right now a site: search says there are 13 pages indexed.
Just concerned that this site has always ranked number 1 for a company-name search, and now it isn't in the first 10 pages of Google.
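For what it's worth, a plugin isn't strictly required - robots.txt is just a plain text file uploaded to the site root (e.g. via FTP). A minimal starting point for a WordPress site might look like the sketch below; treat it as illustrative and adjust to your own setup.

```
# Minimal example robots.txt for a WordPress site (illustrative only)
User-agent: *
Disallow: /wp-admin/
```

Note that a missing robots.txt by itself isn't a problem: Googlebot treats a 404 on that URL as permission to crawl everything. The "can't access" errors in Webmaster Tools suggest the server is returning a 5xx error instead, which is what makes Googlebot postpone its crawl.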
-
A few quick checks:
- Have you made sure your robots.txt loads in your browser? Add /robots.txt after your domain, the same as a normal page - can you see its contents?
- Has your site been down during this period?
- Did you change the contents of the file just before this issue started?
- Are you sure Googlebot hasn't come back since that date - what does your analytics say?
- Do a site: search for your domain to see whether it is still in Google's index.
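One quick way to see what Googlebot sees is to check the HTTP status codes from the command line (a sketch using curl; any HTTP client will do):

```
# Fetch headers only; you want "HTTP/1.1 200 OK", not a 500
curl -I http://www.sourcemarketingdirect.com/robots.txt

# Check the home page the same way, since Google reported a 500 there
curl -I http://www.sourcemarketingdirect.com/
```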
Related Questions
-
How to fix non-crawlable pages affected by CSS modals?
I stumbled across something new when doing a site audit in SEMrush today ---> modals. The case: several pages could not be crawled because of (modal:) in the URL. What I know: "a modal is a dialog box/popup window that is displayed on top of the current page," built with CSS and JS. What I don't know: how to prevent crawlers from finding them.
Web Design | Dan-Louis
-
Fixing Render-Blocking JavaScript and CSS in Above-the-Fold Content
We don't have a responsive design site yet, and our mobile site is built through Dudamobile. I know it's not the best, but I'm trying to do whatever we can until we get around to redesigning it. Is there anything I can do about the following PageSpeed Insights errors, or are they just a function of using Dudamobile?

Eliminate render-blocking JavaScript and CSS in above-the-fold content: Your page has 3 blocking script resources and 5 blocking CSS resources. This causes a delay in rendering your page. None of the above-the-fold content on your page could be rendered without waiting for the following resources to load. Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML.

Remove render-blocking JavaScript:
- http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js
- http://mobile.dudamobile.com/…ckage.min.js?version=2015-04-02T13:36:04
- http://mobile.dudamobile.com/…pts/blogs.js?version=2015-04-02T13:36:04

Optimize CSS delivery of the following:
- http://fonts.googleapis.com/…:400|Great+Vibes|Signika:400,300,600,700
- http://mobile.dudamobile.com/…ont-pack.css?version=2015-04-02T13:36:04
- http://mobile.dudamobile.com/…kage.min.css?version=2015-04-02T13:36:04
- http://irp-cdn.multiscreensite.com/kempruge/files/kempruge_0.min.css?v=6
- http://irp-cdn.multiscreensite.com/…mpruge/files/kempruge_home_0.min.css?v=6

Thanks for any tips, Ruben
Web Design | KempRugeLawGroup
-
Redirects Not Working / Issue with Duplicate Page Titles
Hi all. We are being penalised in Webmaster Tools and Crawl Diagnostics for duplicate page titles, and I'm not sure how to fix it. We recently switched from HTTP to HTTPS, but when we first switched over, we accidentally set a permanent redirect from HTTPS to HTTP for a week or so(!). We now have a permanent redirect going the other way, HTTP to HTTPS, and we also have canonical tags in place pointing to HTTPS. Unfortunately, it seems that because of this short time with the permanent redirect the wrong way round, Google is confused and sees our HTTP and HTTPS sites as duplicate content. Is there any way to get Google to recognise this new (correct) permanent redirect and completely forget the old (incorrect) one? Any ideas welcome!
Web Design | HireSpace
-
Crawler issues
Can anyone please suggest why our site is not being crawled by Google at the moment? Thanks,
Web Design | CheethamBellJWT
-
Is anyone using humans.txt on your websites? What do you think?
http://humanstxt.org Is anyone using this on their websites, and if so, have you seen any positive benefits from doing so? It would be good to see some examples of sites using it and how you're using the files. I'm considering adding this to my checklist for launching sites.
Web Design | eseyo
-
Robots.txt - Allow and Disallow. Can they be the same?
Hi All, I need some help on the following: are the following commands the same?

User-agent: *
Disallow:

or

User-agent: *
Allow: /

I'm a bit confused. I take it that the first one allows all the bots but the second one blocks all the bots. Is that correct? Many thanks, Aidan
Web Design | Presenter
-
Panda and Penguin Fall - Could HTML Design Be an Issue?
Hi, We were hit hard by Panda 3.4 on March 23rd, 2012. Then Penguin came along and slapped us down a little farther on April 24th. White hat SEO for 13 years on the site. I have been trying to discover the reason we got hit so hard; to date we are 90% down. We are wiped. I have a couple of keywords still at #2 and #3, and we see up-and-down changes in Google Webmaster Tools, i.e. a keyword is supposedly up 50 positions, then another down 50. All of the other 150 keywords that we used to rank on the first page for are not even showing up. I have a person who is about to do a full link analysis, but since we never went after links I never had the feeling that is where our problem is; still, we are definitely going to explore it. The reason for my post is that last night I spoke with an SEO person who has some pretty good credentials (9 years' experience; currently works at a large online marketing company with clients like Honda), and he was nice enough to just take a quick look at the site. He said he saw nothing really wrong and did not think that we were hit for any of the normal issues people are listing, i.e. duplicate content or backlinks. His first impression was that we were knocked down because the site is "hard to index". He said the site still uses tables, and a lot of our doctype declarations were for HTML 4.01 from 1999. As we all know, there are 'many' experts in this industry, so I wanted a little feedback from the community. Our main site was built in Dreamweaver using tables. We do have a WordPress blog that is very small, and we are just now posting to add fresh content (posts seem to rank pretty well, which is why I thought he may be right). Would an older site be penalized like this for using tables? What would you do at this stage if you had a site that is not recovering? I have now reached panic mode and have to do something, just not sure of the next step. I will be happy to post the URL if anyone wants to help with advice. Thanks,
Web Design | Force7
-
Custom URLs with Bigcommerce Issue (Is it worth it?)
We're building out a store in Bigcommerce, which for all intents and purposes is perfect for SEO, except that you cannot change the URLs to be custom. My question is: does this kill the SEO value of Bigcommerce, despite everything else being great? For example, the URL for a category page would be something like www.mysite.com/categories/keyword, and the product URLs are pulled in by product name, so a product URL could be something like www.mysite.com/products/Product-Description-Long-223.html (notice the words will be capitalized and there is no way to remove the trailing .html). I could go with Interspire (the licensed version of Bigcommerce) or Magento so I can custom-edit this stuff, but then it's a lot more work for my employees on the buildout.
Web Design | iAnalyst.com