WordPress error
-
In Google Webmaster Tools I'm getting a Severe Health Warning regarding our robots.txt file, which reads:
User-agent: *
Crawl-delay: 20

User-agent: 008
Disallow: /

I'm wondering how I can fix this and stop it happening again.
The site was hacked about 4 months ago but I thought we'd managed to clear things up.
Colin
-
This will be my first post on SEOmoz, so bear with me.
The way I understand it is that robots read the robots.txt file from top to bottom, and once they find a rule that applies to them they stop reading and begin crawling. So basically a robots.txt written as:
User-agent: *
Disallow:
Crawl-delay: 20
User-agent: 008
Disallow: /
would not have the desired result, as user-agent 008 would first read the top group:
User-agent: *
Disallow:
Crawl-delay: 20
and then begin crawling your site, because it is first told that no pages or directories are disallowed for any user-agent.
The corrected way to write this would be:
User-agent: 008
Disallow: /
User-agent: *
Disallow:
Crawl-delay: 20
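If you want to sanity-check which rules a given crawler should pick up from the corrected file, here's a minimal sketch using Python's standard-library robots.txt parser. Bear in mind that real crawlers differ in how they choose a group (most match the most specific User-agent line rather than reading strictly top to bottom), so treat it as a rough check only:

```python
# Rough sanity check using Python's standard-library robots.txt parser
# (urllib.robotparser). Real crawlers may choose groups differently, so
# this is an approximation rather than a definitive test.
from urllib.robotparser import RobotFileParser

corrected = """\
User-agent: 008
Disallow: /

User-agent: *
Disallow:
Crawl-delay: 20
"""

rp = RobotFileParser()
rp.parse(corrected.splitlines())

print(rp.can_fetch("Googlebot", "/"))  # True  -- falls under "User-agent: *"
print(rp.can_fetch("008", "/"))        # False -- matches its own group
print(rp.crawl_delay("Googlebot"))     # 20    -- from the catch-all group
```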
-
Hi Peter,
I've tested the robots.txt file in Webmaster Tools and it now seems to be working as it should, and Google appears to be seeing the same file I have on the server.
I'm afraid this side of things isn't my area of expertise, so it's been a bit of a minefield.
I've taken out a subscription with sucuri.net and taken various other steps that hopefully will help with security. But who knows?
Thanks,
Colin
-
Google is seeing the same robots.txt content (in GWT) that you show in the physical file, right? I just want to make sure that, when the site was hacked, no changes were made that are showing different versions of files to Google. It sounds like that's not the case here, but it definitely can happen.
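One rough way to check for that sort of thing is to request robots.txt twice, once with an ordinary browser User-Agent and once identifying as Googlebot, and compare the two responses. A minimal sketch (the domain is a placeholder; note a compromised server can also cloak by IP address, which a check like this won't catch, so "Fetch as Google" in GWT is still worth running):

```python
# Fetch robots.txt as a regular browser and as Googlebot and compare.
# example.com is a placeholder -- substitute the real domain. This only
# catches user-agent-based cloaking, not IP-based cloaking.
import urllib.request

URL = "http://www.example.com/robots.txt"

def fetch_as(user_agent):
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

browser_copy = fetch_as("Mozilla/5.0")
googlebot_copy = fetch_as("Googlebot/2.1 (+http://www.google.com/bot.html)")

if browser_copy == googlebot_copy:
    print("Same robots.txt served to both user agents.")
else:
    print("Different content served to Googlebot -- worth investigating.")
```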
-
The blog isn't showing now, and my hosts say that the index.php file is missing from the directory, but I can see it.
Strange.
Have contacted them again to see what the problem can be.
Bit of a wasted Saturday!
-
Thanks Keith. Just contacting our hosts.
Nightmare!
-
Looks like a 403 permissions problem; that's a server-side error... Make sure you have the correct permissions set on the blog folder in IIS. Personally, I always host on Linux...
-
Mind you, the whole blog is now showing an error message and can't be viewed, so it looks like an afternoon of trial and error!
-
Thanks very much Keith. I've just edited the file as suggested.
I can see the error but, as I am the web guy, I can't figure out how to get rid of it.
I think it might be a plugin that's causing it, so I'm going to disable them and re-enable them one at a time.
I've just PM'd you by the way.
Thanks for your help Keith.
Colin
-
Use this:
User-agent: *
Disallow: /blog/wp-admin/
Disallow: /blog/wp-includes/
Sitemap: http://nile-cruises-4u.co.uk/sitemap.xml
And FYI, you have the following error on your blog:
Warning: is_readable() [function.is-readable]: open_basedir restriction in effect. File(D:\home\nile-cruises-4u.co.uk\wwwroot\blog/wp-content/plugins/D:\home\nile-cruises-4u.co.uk\wwwroot\blog\wp-content\plugins\websitedefender-wordpress-security/languages/WSDWP_SECURITY-en_US.mo) is not within the allowed path(s): (D:\home\nile-cruises-4u.co.uk\wwwroot) in D:\home\nile-cruises-4u.co.uk\wwwroot\blog\wp-includes\l10n.php on line 339
Get your web guy to look at that, it appears at the top of every blog page for me...
Hope that helps,
Keith
-
Thanks Keith.
Only part of our site is WP based. Would that be a problem using the example you kindly suggested?
-
I gave you an example of a basic robots.txt file that I use on one of my WordPress sites above; I would suggest using that for now.
I would not bother messing around with crawl-delay in robots.txt; as Peter said above, there are better ways to achieve this... Plus I doubt you need it anyway.
Google normally caches robots.txt for about 24 hours in my experience... so it's possible the old cached version is still being used by Google.
-
Hi Guys,
Thanks so much for your help. As you say Troy, that's definitely not what I want.
I assumed when we were hacked (twice in 8 months) that it might have been a competitor, as we are in a very competitive niche. I might be very wrong there, but we have certainly lost our top ranking on Google.co.uk for our main key phrases and are now at about position 7 for the same key phrases after about 3 years at number 1.
So when I saw on Google Webmaster Tools yesterday that we had a severe health warning and that Googlebot was being prevented from crawling our site, I thought it might be the aftereffects of the hack.
Today, even though I changed the robots.txt file yesterday, GWT is showing 1000 pages with errors (285 Access Denied and 719 Not Found) and this message: Googlebot is blocked from http://nile-cruises-4u.co.uk/
I've just tested the robots.txt via GWT and now get this message:
Allowed: Detected as a directory; specific files may have different restrictions
So maybe the pages will be accessible to Googlebot shortly and the Access Denied message will disappear. I've changed the robots.txt file to:
User-agent: *
Crawl-delay: 20

But should I change it to a better version? Sorry guys, I'm an online travel agent and not great at coding and really techie stuff, although I'm learning pretty quickly about the bad stuff! I seem to have a few problems getting this sorted and wonder if this is part of why our page position is dropping?
-
I would simplify your robots.txt to read something like:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://www.your-domain.com/sitemap.xml
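If it helps, a rough way to confirm a file like that behaves as expected before uploading it is to run it through Python's standard-library parser and check that ordinary pages stay crawlable, the WordPress admin paths are blocked, and the Sitemap line is picked up (the site_maps() call needs Python 3.8+; the domain is the same placeholder as above):

```python
# Quick check of the simplified robots.txt suggested above, using
# urllib.robotparser from the standard library. The domain is a placeholder.
from urllib.robotparser import RobotFileParser

simplified = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://www.your-domain.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(simplified.splitlines())

print(rp.can_fetch("Googlebot", "/some-page/"))          # True  -- ordinary pages crawlable
print(rp.can_fetch("Googlebot", "/wp-admin/admin.php"))  # False -- admin blocked
print(rp.site_maps())  # ['http://www.your-domain.com/sitemap.xml'] (Python 3.8+)
```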
-
That's odd: "008" appears to be the user agent for "80legs", a custom crawler platform. I'm seeing it in other robots.txt files.
-
I'm not 100% sure what he's seeing, but when I plug his robots.txt into the robots analysis tool, I get this back:
Googlebot blocked by line 5: Disallow: /
Detected as a directory; specific files may have different restrictions
However, when I gave the top "User-agent: *" group an empty "Disallow:" line, it seemed to fix the problem. Like, it didn't understand that the "Disallow: /" was meant only for the 008 user-agent?
-
Not honestly sure what User-agent "008" is, but that seems harmless. Why the crawl delay? There are better ways to handle that than Robots.txt, if a crawler is giving you trouble.
Was there a specific message/error in GWT?
-
I think, if you have a robots.txt reading what you show above:
User-agent: *
Crawl-delay: 20

User-agent: 008
Disallow: /
That just basically says, "Don't crawl my site at all" (The "Disallow: /" means, I'm not allowing anything to be crawled by any search engine that pays attention to robots.txt at all)
So...I'm guessing that's not what you want?
(Bah..ignore. "User-agent". I'm a fool)
Actually, this seems to have solved your issue...make sure you explicitly tell all other User-agents that they are allowed:
User-agent: *
Disallow:
Crawl-delay: 20

User-agent: 008
Disallow: /
The extra "Disallow:" under User-agent: * says "I'm not going to disallow anything to most user-agents." Then the Disallow under user-agent 008 seems to only apply to them.