Access Denied - 2508 Errors - 403 Response Code in Webmaster Tools
-
Hello fellow members,
Since 9th May I have been getting these error messages, and the crawl errors are increasing daily. Google is not able to crawl my URLs; it gets a 403 response code and reports Access Denied errors in GWT. All of my indexed pages have been de-indexed.
Why am I receiving these errors? My website is working fine, so why is Google unable to crawl my pages? Please tell me what the issue is; I need to resolve it ASAP.
On 9th May I also got a message in GWT for "http://www.mysitename.co.uk/ Increase in authorization permission errors":
"Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors."
After this, all the problems started. Kindly tell me what the issue is and how I can solve it.
-
Hi there,
Without seeing your website it's hard to tell for sure, but a 403 error usually has to do with permissions (who or what your server will allow to access the content).
Have you recently put anything behind a password?
If you have Screaming Frog SEO Spider, you can try setting Googlebot as the user agent and crawling your site.
You can also use a header checker like URI Valet to see what server response is returned. It sounds like Googlebot is getting one response while normal browsers are seeing the site fine (200 codes).
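If you want to script that check yourself, here's a minimal sketch in Python using the requests library (the URL is a placeholder for your own, and the user-agent strings are just examples):

```python
# Minimal sketch: request the same URL with a Googlebot user agent and a
# normal browser user agent, then compare the status codes the server returns.
import requests

URL = "http://www.example.co.uk/"  # placeholder -- use your own URL

USER_AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=False)
    print(f"{name}: HTTP {resp.status_code}")
```

If the "googlebot" request comes back 403 while the "browser" request gets a 200, something on the server (often a firewall rule or a security plugin) is blocking by user agent. One caveat: some setups block by IP range rather than user agent, so a matching 200 here doesn't completely rule out a block on Google's side.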
If you're still not sure, and you can't share your site name, I would contact your web host to look into any issues with the server.
-Dan
Related Questions
-
404 Errors For Pages That Never Existed
I'm seeing a lot of 404 errors with slugs related to cryptocurrency (not my website's industry at all). We've never created anything remotely similar, yet the 404 reports are full of keywords like "bitcoin" and "litecoin". Another keyword is "yelz". It usually presents like .../yelz/-ripper-vs-steller/ or .../bitcoin-vs-litecoin/. I don't really have the time to fix all the legitimate 404 errors, let alone these mysterious requests. Any recommendations on what to do about this? Any advice is appreciated.
White Hat / Black Hat SEO | bcaples1
-
Can the disavow tool INCREASE rankings?
Hi Mozzers, I have a new client whose link profile contains some spammy links that should be disavowed. They rank on the first page for some longer-tail keywords. However, we're aiming at shorter, well-known keywords where they aren't ranking. Will the disavow tool, on its own, be able to increase rankings (assuming on-site/off-site signals are better than the competition's)? Thanks, Cole
White Hat / Black Hat SEO | ColeLusby0
-
Why should I reach out to webmasters before disavowing links?
Almost all the blogs, and Google themselves, tell us to reach out to webmasters and request that the offending links be removed before using Google's Disavow tool. None of the blogs, nor Google, explain why you "must" do this; it's time-consuming, and many webmasters don't care and don't act. Why is this a "required" step?
White Hat / Black Hat SEO | RealSelf0
-
I Mistakenly Uploaded My Disavow File to the Non-WWW Version of My Website in Webmaster Tools... Is This a Problem?
Hey guys and gals, I need some advice on this, please. Someone recently ran a negative SEO campaign against my site and I was inundated with 13,000+ spammy links pointing to my website, so I had to submit a disavow file in Google Webmaster Tools. For some reason, it shows that I uploaded the disavow text file to the non-WWW version of the website, but the WWW version is the preferred domain, and all non-WWW requests are 301 redirected to www.pcmedicsoncall.com. My question is: should I correct this and upload the disavow file to the preferred domain in Google Webmaster Tools? Please advise on how I should proceed with this situation. Thank you. Cam
White Hat / Black Hat SEO | CamMcArthur0
-
Cloaking/Malicious Code
Does anybody have any experience with software for identifying this sort of thing? I was informed by a team we are working with that our website may have been compromised and I wanted to know what programs people have used to identify cloaking attempts and/or bad code. Thanks everybody!
White Hat / Black Hat SEO | HashtagHustler0
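A crude first check you can run yourself for the cloaking half of this question (a sketch only, not a replacement for a proper malware scanner; the URL is a placeholder): fetch the same page as Googlebot and as a regular browser and diff the two responses, since injected spam is often served only to search-engine user agents.

```python
# Sketch: fetch a page with two different user agents and show the lines
# that differ. Large unexplained differences can indicate cloaking.
import difflib
import requests

URL = "http://www.example.com/"  # placeholder -- use your own URL

def fetch(user_agent: str) -> str:
    return requests.get(URL, headers={"User-Agent": user_agent}).text

googlebot_html = fetch("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
browser_html = fetch("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")

for line in difflib.unified_diff(
    browser_html.splitlines(), googlebot_html.splitlines(),
    fromfile="browser", tofile="googlebot", lineterm="",
):
    print(line)
```

Expect some noise on dynamic pages (timestamps, nonces), and note that sophisticated cloaks key off Googlebot's IP ranges rather than the user-agent string, so a clean diff is not proof the site is clean.
-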
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. So I came up with a custom-coded solution that dynamically serves 503 HTTP status codes to a certain portion of the bot traffic; the portion, and which bots, can be calculated at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others get content and a 200 (see the sketch below for the kind of thing I mean). The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | internetwerkNU1
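For what it's worth, here is a minimal sketch of the kind of dynamic 503 throttling described above, written as a Python WSGI middleware. The load threshold and bot list are illustrative assumptions, and this says nothing about the SEO question itself:

```python
# Sketch: when system load is high and the request comes from a known bot,
# answer 503 with a Retry-After header instead of serving the page.
import os

LOAD_THRESHOLD = 4.0  # illustrative -- tune to your server
BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot")  # illustrative list

class BotThrottleMiddleware:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        load_1min = os.getloadavg()[0]  # Unix-only 1-minute load average
        if load_1min > LOAD_THRESHOLD and any(tok in ua for tok in BOT_TOKENS):
            start_response("503 Service Unavailable",
                           [("Retry-After", "3600"),
                            ("Content-Type", "text/plain")])
            return [b"Server busy; please retry later."]
        return self.app(environ, start_response)
```

Sending a Retry-After header with the 503 at least tells well-behaved crawlers when to come back, and Google documents 503 as the correct way to signal a temporary outage, which is consistent with the indexing-latency trade-off described in the question.
-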
Hiding content or links in responsive design
Hi, I found a lot of information about responsive design and SEO, mostly theories and no real experiments, and I'd like to find a clear answer if someone has tested this. Google says:
"Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device"
https://developers.google.com/webmasters/smartphone-sites/details
For usability reasons you sometimes need to hide content or links completely (not accessible at all by the visitor) at small resolutions (mobile) using CSS ("visibility:hidden" or "display:none"). Is this counted as hidden content that could penalize your site, or not? What do you do when you create responsive websites? Thanks! GaB
White Hat / Black Hat SEO | NurunMTL