403 Forbidden errors: how to solve them
-
Hi, I have been using a great tool today called Screaming Frog, which was shown to me by Thomas Zickell.
When I used the tool I found some worrying things for my site www.in2town.co.uk. What I have found is a large number of 403 Forbidden statuses on my home page, and I do not know why.
Here is an example:
http://www.in2town.co.uk/emmerdale/emmerdale-debbie-hits-rock-bottom
It loads fine in the browser, but the tool shows it as an error, with no meta tags or anything, even though there are meta tags in there.
Can anyone please let me know how to solve this and why it has happened?
Many thanks
-
Hi Tim,
Glad it helped. It might be worth asking your host what kind of features they have for preventing flooding attacks; there are various ways of addressing them on the server side, and most hosts will have one enabled in some form. Unless you have a specific issue with these kinds of attacks, it seems to me that this part of the module is causing more harm than good as it is now.
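For anyone wondering what host-side flood protection looks like in practice: if the server runs Nginx, a minimal rate-limiting sketch is something like the following (the zone name, rate, and burst values here are illustrative assumptions, not settings from this thread):

    # nginx.conf, inside the http block: track request rate per client IP.
    # The zone name "perip" and the 10 requests/second rate are examples only.
    limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

    server {
        listen 80;
        server_name www.example.com;  # placeholder domain

        location / {
            # Allow short bursts (e.g. a crawler grabbing a page and its assets)
            # before Nginx starts answering excess requests with 503.
            limit_req zone=perip burst=20 nodelay;
        }
    }

Done at this level, throttling returns a temporary 503 rather than a 403, which well-behaved crawlers treat as "try again later" instead of "page forbidden".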
-
Thank you for this. I have turned it off and will speak to sh404SEF to find out what they can do about it, as I am worried about losing the security feature, but as you said, that was the problem, and now the site is showing fine; there are no errors showing.
Many thanks for this. I hope other people who are having this problem get to read this post, as they must be going through what I am going through. Many thanks for all your help and the solution.
-
Hi Tim,
Did you ever get to the bottom of the issue mentioned in this question? It is almost certainly the same problem.
Have a look at this page and try either turning off the sh404SEF anti-flooding feature or else boosting the maximum number of requests allowed: http://forum.joomla.org/viewtopic.php?p=1368937
The anti-flooding part of this component basically blocks requests for pages if it thinks someone is trying to run a DoS attack on your site. The current setup seems to be too sensitive and is blocking Screaming Frog after the first few requests, quite possibly blocking the Google bots, and maybe blocking the Moz crawler as well, so it is certainly something you should address.
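If you want to confirm that rapid requests are what trigger the block, a quick burst test will show the point at which the 403s start. This is only a rough sketch (Python 3 standard library; the URL, request count, delay, and User-Agent string are illustrative assumptions):

    import time
    import urllib.error
    import urllib.request

    URL = "http://www.in2town.co.uk/"   # any page the crawler reported as 403
    BURST = 30                          # number of rapid requests to send
    DELAY = 0.1                         # seconds between requests

    for i in range(1, BURST + 1):
        req = urllib.request.Request(
            URL, headers={"User-Agent": "Mozilla/5.0 (compatible; crawl-test)"}
        )
        try:
            with urllib.request.urlopen(req) as resp:
                status = resp.status
        except urllib.error.HTTPError as e:
            status = e.code             # 4xx/5xx responses raise HTTPError
        print(f"request {i}: HTTP {status}")
        if status == 403:
            print("anti-flooding block appears to start here")
            break
        time.sleep(DELAY)

If the first handful of requests come back 200 and everything after that is 403, the anti-flooding threshold is set below a normal crawl rate.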
Related Questions
-
Weird 404 errors in Webmaster Tools
Hi, in a regular check with Webmaster Tools I have noticed some weird 404 errors. For example, my domain URL is something like http://domainname.com/, and the 404 errors point to weird URLs like http://domainname.com/james-bond&page=2/ and http://domainname.com/juegos-de&page=3/. At first I tried to block them with robots.txt, but now I am getting these kinds of 404 errors a lot, and I don't think blocking them all is a perfect solution. Can anyone help me out with the issue? Thank you in advance. Cheers.
Technical SEO | nishthaj
-
Increase in Crawl Errors
I had a problem with a lot of crawl errors (in Google Search Console) a while back, due to the removal of a shopping cart. I thought I'd dealt with this, and Google seemed to agree (see attached pic), but now they're all back with a vengeance! The crawl errors are all the old shop pages that I thought I'd made clear weren't there anymore. The sitemaps (generated with Yoast on WordPress) all updated 16 Aug, but the increase didn't happen till 18-20. How do I make it clear to Google that these pages are gone forever?
Technical SEO | abisti2
-
GWT giving me 404 errors based on an old, deleted sitemap
I'm getting a bunch of 404 crawl errors in my Google Webmaster Tools because we just moved our site to a new platform with a new URL structure. We 301 redirected all the relevant pages. We submitted a new sitemap and then deleted all the sitemaps for the old website URL structure. However, Google keeps crawling the OLD URLs and reporting back the 404 errors. It says that the website is linking to these 404 pages via an old, outdated sitemap (which itself shows a 404, so it's not as if Google is reading those old sitemaps now). Instead it's as if Google has cached the old sitemap but continues to use it to crawl these non-existent pages. Any thoughts?
Technical SEO | Santaur
-
Responsive web design has a crawl error of redirecting to HTTP instead of HTTPS. Is this because of the new Google update that favours HTTPS?
We at yamsafer.me are using a responsive web design. A crawl error occurred where the homepage redirects to the HTTP version instead of HTTPS. Any ideas on why this happened?
Technical SEO | Yamsafer.com
-
Webmaster Tools keeps showing an old 404 error but doesn't show a "Linked From" URL. Why is that?
Hello Moz Community. I have a question about 404 crawl errors in Webmaster Tools. A while ago we had an internal linking problem with some links formed in the wrong way (a loop was creating links on the fly). The error was identified and fixed back then, but before the fix Google got to index lots of those malformed pages. Recently we have seen in our Webmaster account that some of these links still appear as 404, but we currently don't have that issue or any internal link pointing to any of those URLs. What confuses us even more is that Webmaster Tools doesn't show anything in the "Linked From" tab, where it usually does for this type of error, so we are wondering what this means. Could it be that they are still in Google's cache or memory? We are not really sure. If anyone has an idea of what these errors showing up now mean, we would really appreciate the help. Thanks.
Technical SEO | revimedia
-
I have a 404 error on my site I can't find.
I have looked everywhere. I thought it might have just shown up while making some changes, so while in Webmaster Tools I said it was fixed... It's still there. Even Moz Pro found it. The error is http://mydomain.com/mydomain.com. No idea how it even happened. Thought it might be a plugin problem. Any ideas how to fix this?
Technical SEO | NateStewart
-
How to solve the meta "A description for this result is not available because of this site's robots.txt"?
Hi, I have many campaign URLs that 301 redirect to an actual page on my company's site. My URL provider says the load from those requests by bots is too much, so they have put a robots.txt on the redirection server! Strange or not? Now I have this meta description on all my campaign URLs that 301 redirect: "A description for this result is not available because of this site's robots.txt". If you have the perfect solution, could you share it with me? Thank you.
Technical SEO | Vale7
-
Nginx 403 and 503 errors
I have a client with a website that is hosted on a shared web server running Nginx. When I started working on the website a few months ago, I found the server was throwing hundreds of 403s and 503s, and at one point Googlebot couldn't access robots.txt. Needless to say, this didn't help rankings! Now the web hosting company has partially resolved the errors by switching to a new server, and I'm just seeing intermittent spikes of 30 to 70 403 and 503 errors in Webmaster Tools. My questions: Am I right in saying there should (pretty much) be no such errors for pages that we make public and crawlable? Having already asked the web hosting company to look into this, is there any advice on specifically what I should be asking them to look at on the server? If this doesn't work out, does anyone have a recommendation for a reliable web hosting company in the U.S. for a lead generation website with over 20,000 pages and currently 500 to 1,000 visits per day? Thanks for the help, Mozzers 🙂
Technical SEO | MatShepSEO