Wordpress error
-
In our Google Webmaster Tools account I'm getting a Severe Health Warning regarding our robots.txt file, which reads:
User-agent: *
Crawl-delay: 20

User-agent: 008
Disallow: /

I'm wondering how I can fix this and stop it from happening again.
The site was hacked about 4 months ago but I thought we'd managed to clear things up.
Colin
-
This will be my first post on SEOmoz, so bear with me.
The way I understand it, robots read the robots.txt file from top to bottom, and once they find a rule that applies to them they stop reading and begin crawling. So basically a robots.txt written as:
User-agent: *
Disallow:
Crawl-delay: 20
User-agent: 008
Disallow: /
would not have the desired result as user-agent 008 would first read the top guideline:
User-agent: *
Disallow:
Crawl-delay: 20
and then begin crawling your site, because the first rule tells all user-agents that nothing is disallowed (an empty Disallow means "crawl everything").
The corrected way to write this would be:
User-agent: 008
Disallow: /
User-agent: *
Disallow:
Crawl-delay: 20
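If you want to sanity-check a robots.txt like this locally before uploading it, Python's standard-library `urllib.robotparser` can tell you what a given crawler would be allowed to fetch. A quick sketch using the corrected file above (`example.com` is just a placeholder URL):

```python
from urllib.robotparser import RobotFileParser

# The corrected robots.txt from the example above
rules = """\
User-agent: 008
Disallow: /

User-agent: *
Disallow:
Crawl-delay: 20
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# 008 is blocked from everything; everyone else may crawl freely
print(parser.can_fetch("008", "http://example.com/"))        # False
print(parser.can_fetch("Googlebot", "http://example.com/"))  # True
print(parser.crawl_delay("Googlebot"))                        # 20
```

Note that different crawlers implement the spec slightly differently (Python's parser, for instance, matches by user-agent group rather than strictly top-to-bottom), so it's worth also confirming the file in GWT's own robots.txt tester.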
-
Hi Peter,
I've tested the robots.txt file in Webmaster Tools and it now seems to be working as it should; Google appears to be seeing the same file as I have on the server.
I'm afraid this side of things isn't my area of expertise, so it's been a bit of a minefield.
I've taken a subscription with sucuri.net and taken various other steps that hopefully will help with security. But who knows?
Thanks,
Colin
-
Google is seeing the same Robots.txt content (in GWT) that you show in the physical file, right? I just want to make sure that, when the site was hacked, no changes were made that are showing different versions of files to Google. It sounds like that's not the case here, but it definitely can happen.
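One rough way to check for that kind of thing from your own machine is to request robots.txt twice with different User-Agent headers and compare the responses. A sketch using only the Python standard library (it only catches user-agent-based cloaking, not IP-based cloaking, and the URL is the site discussed in this thread):

```python
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_as(url, user_agent):
    """Fetch a URL while presenting a specific User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def looks_uncloaked(url):
    """True if the server sends a browser and Googlebot identical content."""
    as_browser = fetch_as(url, "Mozilla/5.0")
    as_googlebot = fetch_as(url, GOOGLEBOT_UA)
    return as_browser == as_googlebot

# Usage (requires network access):
#   looks_uncloaked("http://nile-cruises-4u.co.uk/robots.txt")
```

A hacked site will sometimes serve one file to visitors and a different one to crawlers, which is exactly the scenario being asked about here.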
-
The blog isn't showing now, and my hosts say that the index.php file is missing from the directory, but I can see it.
Strange.
Have contacted them again to see what the problem can be.
Bit of a wasted Saturday!
-
Thanks Keith. Just contacting our hosts.
Nightmare!
-
Looks like a 403 permissions problem; that's a server-side error... Make sure you have the correct permissions set on the blog folder in IIS. Personally, I always host on Linux...
-
Mind you, the whole blog is now showing an error message and can't be viewed, so it looks like an afternoon of trial and error!
-
Thanks very much Keith. I've just edited the file as suggested.
I see the error but, as I am the web guy, I can't figure out how to get rid of it.
I think it might be a plugin that's causing it, so I'm going to disable the plugins and re-enable them one at a time.
I've just PM'd you by the way.
Thanks for your help Keith.
Colin
-
Use this:
User-agent: *
Disallow: /blog/wp-admin/
Disallow: /blog/wp-includes/
Sitemap: http://nile-cruises-4u.co.uk/sitemap.xml
An FYI: you have the following error on your blog:
Warning: is_readable() [function.is-readable]: open_basedir restriction in effect. File(D:\home\nile-cruises-4u.co.uk\wwwroot\blog/wp-content/plugins/D:\home\nile-cruises-4u.co.uk\wwwroot\blog\wp-content\plugins\websitedefender-wordpress-security/languages/WSDWP_SECURITY-en_US.mo) is not within the allowed path(s): (D:\home\nile-cruises-4u.co.uk\wwwroot) in D:\home\nile-cruises-4u.co.uk\wwwroot\blog\wp-includes\l10n.php on line 339
Get your web guy to look at that, it appears at the top of every blog page for me...
Hope that helps,
Keith
-
Thanks Keith.
Only part of our site is WP based. Would that be a problem using the example you kindly suggested?
-
I gave you an example of a basic robots.txt file that I use on one of my Wordpress sites above, I would suggest using that for now.
I would not bother messing around with crawl delay in robots.txt; as Peter said above, there are better ways to achieve this... Plus I doubt you need it anyway.
Google caches the robots.txt info for about 24hrs normally in my experience... So it's possible the old cached version is still being used by Google.
-
Hi Guys,
Thanks so much for your help. As you say Troy, that's definitely not what I want.
I assumed when we were hacked (twice in 8 months) that it might have been a competitor, as we are in a very competitive niche. I might be very wrong there, but we have certainly lost our top ranking on Google.co.uk for our main key phrases and are now at about position 7 for the same key phrases after about 3 years at number 1.
So when I saw on Google Webmaster Tools yesterday that we had a severe health warning and that Googlebot was being prevented from crawling our site, I thought it might be the after-effects of the hack.
Today, even though I changed the robots.txt file yesterday, GWT is showing 1000 pages with errors (285 Access Denied and 719 Not Found) and this message: Googlebot is blocked from http://nile-cruises-4u.co.uk/
I've just tested the robots.txt via GWT and now get this message:
Allowed. Detected as a directory; specific files may have different restrictions.
So maybe the pages will be accessible to Googlebot shortly and the Access Denied message will disappear. I've changed the robots.txt file to:
User-agent: *
Crawl-delay: 20
But should I change it to a better version? Sorry guys, I'm an online travel agent and not great at coding and really techie stuff. Although I'm learning pretty quickly about the bad stuff! I seem to have a few problems getting this sorted and wonder if this is part of why our page position is dropping?
-
I would simplify your robots.txt to read something like:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://www.your-domain.com/sitemap.xml
-
That's odd: "008" appears to be the user agent for "80legs", a custom crawler platform. I'm seeing it in other Robots.txt files.
-
I'm not 100% sure what he's seeing, but when I plug his robots.txt into the robots analysis tool, I get this back:
Googlebot blocked by line 5: Disallow: /
Detected as a directory; specific files may have different restrictions
However, when I gave the top "**User-agent: ***" the "Disallow: " it seemed to fix the problem. Like, it didn't understand that the **Disallow: /** was meant only for the 008 user-agent?
-
Not honestly sure what User-agent "008" is, but that seems harmless. Why the crawl delay? There are better ways to handle that than Robots.txt, if a crawler is giving you trouble.
Was there a specific message/error in GWT?
-
I think, if you have a robots.txt reading what you show above:
User-agent: *
Crawl-delay: 20

User-agent: 008
Disallow: /
That just basically says, "Don't crawl my site at all" (The "Disallow: /" means, I'm not allowing anything to be crawled by any search engine that pays attention to robots.txt at all)
So...I'm guessing that's not what you want?
(Bah..ignore. "User-agent". I'm a fool)
Actually, this seems to have solved your issue...make sure you explicitly tell all other User-agents that they are allowed:
User-agent: *
Disallow:
Crawl-delay: 20

User-agent: 008
Disallow: /
The extra "Disallow:" under User-agent: * says "I'm not going to disallow anything to most user-agents." Then the Disallow under user-agent 008 seems to only apply to them.