Access Denied - 2508 Errors - 403 Response code in webmaster tools
-
Hello fellow members,
Since 9th May I have been getting these error messages, and the crawl errors are increasing daily. Google is not able to crawl my URLs; it is getting a 403 response code and reporting Access Denied errors in GWT. All my indexed pages have been de-indexed.
Why am I receiving these errors? My website is working fine, so why is Google unable to crawl my pages? Please tell me what the issue is; I need to resolve it ASAP.
On 9th May I also got a message in GWT for "http://www.mysitename.co.uk/ Increase in authorization permission errors":
"Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors."
All the problems started after this. Kindly tell me what the issue is and how I can solve it.
-
Hi There
Without seeing your website it's hard to tell for sure. But a 403 error usually has to do with permissions (who/what your server will allow to access the content).
Have you recently put anything behind a password?
If you have Screaming Frog SEO Spider, you can try setting the user agent to Googlebot and crawling your site.
You can also use a header checker like URI Valet to see what server response is returned. It sounds like Googlebot is getting one response while normal browsers are seeing it fine (200 codes).
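If you're comfortable with a little scripting, you can also check this yourself by requesting a page with a Googlebot-style user agent and with a normal browser user agent and comparing the status codes. Below is a rough sketch using Python's requests library; the URL and user-agent strings are only placeholders, and keep in mind that some servers verify the real Googlebot by IP, so a spoofed user agent is just an approximation.

```python
import requests

URL = "http://www.example.co.uk/"  # placeholder - swap in the page you want to test

USER_AGENTS = {
    "browser":   "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10, allow_redirects=False)
    # A 200 for the browser but a 403 for the Googlebot user agent points to
    # user-agent based blocking (server config, firewall, or a security plugin).
    print(f"{name:10s} -> {resp.status_code}")
```

If the Googlebot user agent gets a 403 while the browser one gets a 200, that's a strong hint the block is in your server configuration or a security/firewall rule rather than in Google.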
If you are still not sure and cannot share your site name, I would contact your web host to look into any issues with the server.
-Dan
Related Questions
-
Screaming Frog tool left me stumped
Hi there again, I found a major cloaking hack on our client's website that is really well camouflaged, and none of the SEO tools I tried for checking cloaking could find it. I know that Screaming Frog is a great tool and I want to use it to help me; however, I can't seem to find my way around the program I downloaded. Can you help me with Screaming Frog? Do you know how I can run a full site check for cloaking, in case there are more links I wasn't notified about? I would really appreciate it if you could help me with that. Thanks so much, Ruchy
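One rough way to supplement a Screaming Frog crawl (where, as noted in the answer above, you can switch the user agent to Googlebot) is to fetch a suspect page with a Googlebot-style user agent and with a browser user agent and compare the links served in each response. A minimal sketch, assuming Python with the requests library; the URL is a placeholder, and cloaking that checks Googlebot's IP range rather than the user agent will not be caught this way.

```python
import re
import requests

URL = "http://www.example.com/"  # placeholder page to test

def links_for(user_agent):
    html = requests.get(URL, headers={"User-Agent": user_agent}, timeout=10).text
    # Very rough href extraction - just enough to compare the two versions.
    return set(re.findall(r'href="([^"]+)"', html))

browser = links_for("Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36")
googlebot = links_for("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

# Links served only to the Googlebot user agent are candidates for cloaked links.
print("Only shown to Googlebot:", googlebot - browser)
print("Only shown to browsers:", browser - googlebot)
```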
White Hat / Black Hat SEO | | Ruchy2 -
Can a Self-Hosted Ping Tool Hurt Your IP?
Confusing title, I know, but let me explain. We are in the middle of programming a lot of SEO "action" tools for our site. These will be available to users to help them better optimize their sites in the SERPs. We were thinking about adding a "ping" tool, based on PHP, so users can ping their domain and hopefully get some extra attention / speed up indexing of updates. This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted with Google, Bing, etc.? Technically it needs to send out the ping request, and that would come from the same IP address our main site is hosted on. If we end up with over 1,000 users all sending ping requests, I don't want to jeopardize our IP. Thoughts?
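For reference, a blog-style ping is usually just a small XML-RPC call to a ping service's endpoint, so the per-request load is tiny; the concern here is purely the volume coming from one IP. The question mentions a PHP tool; the sketch below uses Python's standard library only to illustrate what such a request looks like, with Ping-O-Matic's public endpoint and placeholder site values standing in for whatever services and data the real tool would use.

```python
import xmlrpc.client

# Ping-O-Matic's public XML-RPC endpoint - one of several ping services you could target.
PING_ENDPOINT = "http://rpc.pingomatic.com/"

def ping(site_name, site_url):
    """Send a standard weblogUpdates.ping announcing that site_url has been updated."""
    server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
    # The response is typically a struct like {'flerror': False, 'message': 'Thanks for the ping.'}
    return server.weblogUpdates.ping(site_name, site_url)

if __name__ == "__main__":
    print(ping("Example Site", "http://www.example.com/"))  # placeholder values
```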
White Hat / Black Hat SEO | | David-Kley0 -
On the use of Disavow tool / Have I done it correctly, or what's wrong with my perception?
On one site I used GSA Search Engine Ranker. I got some good links out of it, but I also got 4,900 links from a single domain. My thinking, based on Ahrefs, was that one link from a domain is worth about the same as 4,900 links from that same domain. So I downloaded those 4,900 links and added 4,899 of them to the disavow tool, in order to keep my site's rankings stable and safe from any future penalty. Is that a correct way to use the disavow tool? The site's rankings are unchanged so far.
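A side note on the mechanics: a disavow file is a plain text file of URL and "domain:" lines, and a single domain: line covers every link from that host, so there is usually no need to list 4,899 individual URLs. A minimal sketch of building such a file, assuming Python and a hypothetical exported list of unwanted backlink URLs:

```python
from urllib.parse import urlparse

# Hypothetical export of the unwanted backlink URLs (e.g. from Ahrefs or GWT).
bad_urls = [
    "http://spammy-example.com/some-page",
    "http://spammy-example.com/another-page",
    "http://other-spammy-example.net/path",
]

# Collapse individual URLs to domain-level rules: a single "domain:" line
# disavows every link from that host.
domains = sorted({urlparse(u).netloc for u in bad_urls})

with open("disavow.txt", "w") as f:
    f.write("# Links created by GSA Search Engine Ranker - disavowed\n")
    for domain in domains:
        f.write("domain:%s\n" % domain)
```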
White Hat / Black Hat SEO | | AMTrends0 -
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly Bingbot and AhrefsBot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes total server load into account instead of only one site's traffic, and 3) controls overall bot traffic instead of traffic from one bot. IMO user traffic should always be prioritized above bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. So, no solution to all three of my problems. Now I have come up with a custom-coded solution that dynamically serves 503 HTTP status codes to a portion of the bot traffic; which portion, and for which bots, can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other rule I invent), some of its requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will add indexing latency, but slow server response times also have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
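A minimal sketch of that kind of load-based throttling, assuming a Python/WSGI front end rather than the custom CMS described above; the load threshold, the crude user-agent check, and the Retry-After value are all placeholder choices. The relevant details are that 503 is the status generally recommended for temporary overload, and a Retry-After header tells well-behaved crawlers when to come back.

```python
import os

# Placeholder values - tune to your own server and traffic profile.
LOAD_THRESHOLD = 4.0                         # 1-minute load average above which bots get throttled
BOT_MARKERS = ("bot", "crawler", "spider")   # crude user-agent sniffing

def throttle_bots(app):
    """WSGI middleware: answer a share of bot requests with 503 when the server is busy."""
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        is_bot = any(marker in ua for marker in BOT_MARKERS)
        load_1min = os.getloadavg()[0]  # total server load (Unix), not just this one site

        if is_bot and load_1min > LOAD_THRESHOLD:
            start_response("503 Service Unavailable", [
                ("Retry-After", "3600"),            # ask the crawler to come back later
                ("Content-Type", "text/plain"),
            ])
            return [b"Temporarily overloaded, please retry later."]

        # Normal users (and bots while the server is quiet) get the real response.
        return app(environ, start_response)

    return middleware
```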
White Hat / Black Hat SEO | | internetwerkNU1 -
Tools to check Google Local SEO with suggestions.
Is there any tool to check a website's position on Google Maps? Also, is there a way to check which local directories a website is listed on and which it is not, and to get suggestions for improvements? In short, I need tools to check Google local SEO that also give suggestions.
White Hat / Black Hat SEO | | mnkpso0 -
Potential Implications of using the Disavow tool to remove thousands of links
So here's the situation: my company's site has over 30,000 backlinks from Rippling.info. These links all point to 3 product pages, some for products that are no longer in production. Apparently a former employee was experimenting with some link farm ideas. My questions are: 1. Does anyone here have experience with Rippling.info? Is it legit? It seems like a link farm, yet Google allows AdSense ads on it, and I thought Google was against link farms... 2. If I use the Disavow tool in Webmaster Tools to tell Google these 30k+ incoming links should be ignored, will there be any consequences? (Google Analytics shows zero referral traffic from it since Jan 1st, 2012.)
White Hat / Black Hat SEO | | mjmorse0 -
Is widget linkbaiting a bad idea now that webmasters are getting warnings of unnatural links?
I was reading this article about how many websites are being de-indexed because of an unnatural link profile, and it got me thinking about some widgets that I have created. In the example given, a site was totally de-indexed, and the author believes the reason was multiple footer links from themes they created. I have one site with a very popular widget that I offer to others to embed on their sites. The embed code contains a line that says, "Tool provided by Site Name". Now, it just so happens that my site name contains my main keyword. So, if I have hundreds of websites using this tool and linking back to me with the same anchor text, could Google see this as unnatural and possibly de-index me? I have a few thoughts on what I should do but would love to hear yours: 1. I could use a PHP script to serve one of several different anchor text options when giving out my embed code. 2. I could change the embed code so that the anchor text is simply my domain name, i.e. www.mywebsitename.com, rather than "my website name". Thoughts?
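To illustrate option 1: the endpoint that hands out the embed code can rotate through a pool of anchor texts so the resulting backlink profile varies. The question mentions a PHP script; the sketch below uses Python purely to show the rotation logic, and the site URL, anchor texts, and markup are placeholders.

```python
import random

SITE_URL = "http://www.mywebsitename.com/"  # placeholder

# A pool of anchor text variations, including branded and bare-URL versions,
# so not every embedding site links back with the same keyword-rich phrase.
ANCHOR_TEXTS = [
    "My Website Name",
    "www.mywebsitename.com",
    "this free widget",
    "widget by My Website Name",
]

def embed_code():
    """Return the widget embed snippet with a randomly chosen anchor text."""
    anchor = random.choice(ANCHOR_TEXTS)
    return (
        '<div class="my-widget">'
        "<!-- widget markup goes here -->"
        f'Tool provided by <a href="{SITE_URL}">{anchor}</a>'
        "</div>"
    )

print(embed_code())
```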
White Hat / Black Hat SEO | | MarieHaynes1 -
301 Redirect ASP code
Hi, I have a script, detailed below, that issues 301 redirects based on different query strings:

<%if (Request("offset") = "") Then%>

<% if Request("keywords") = "" AND Request("s") <> "" AND Request("j") <> "" then 'Sector and Location NOT NULL %>
<% if (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
    Response.Status="301 Moved Permanently"
    Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
    Response.End
End If %>
<%End if%>

<% if Request("keywords") = "" AND Request("s") <> "" AND Request("j") = "" then 'Sector NOT NULL and Location NULL %>
<% if (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion"))) Then
    Response.Status="301 Moved Permanently"
    Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion")))
    Response.End
End If %>
<%End if%>

<% if Request("keywords") = "" AND Request("s") = "" AND Request("j") <> "" then 'Sector NULL and Location NOT NULL %>
<% if (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
    Response.Status="301 Moved Permanently"
    Response.AddHeader "Location", "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
    Response.End
End If %>
<%End if%>

<%End if%>

But this still allows both the www and non-www versions of these pages to render in the browser, which is resulting in duplicate content. On my home page I use:

<% If InStr(Request.ServerVariables("SERVER_NAME"),"www") = 0 Then
    Response.Status="301 Moved Permanently"
    Response.AddHeader "Location","http://www." & Request.ServerVariables("HTTP_HOST") & "/"
    Response.End
End if %>

Is there a good way to combine these, so that I still get all of the rules of the first script whilst also redirecting any non-www versions to the www version? In other words,

domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))

would redirect to

www.domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))

Thanks in advance
White Hat / Black Hat SEO | | TwoPints