403 Forbidden Crawl report
Hi,
I am getting 403 Forbidden errors in the crawl report for some of my pages, but the pages themselves load fine. My web developer told me that reports sometimes show errors when nothing is actually wrong. Will these errors affect SEO/rankings?
Some of the links:
https://www.medistaff24.co.uk/contact-us/
https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/
I have a locksmith business website serving Tampa, Florida, and we are facing the same issue on the main page.
A 403 Forbidden error means that the server understood the request but refused to serve the page. This can happen for a few reasons, such as:
- The requester does not have permission to access the page (file permissions or access rules).
- A security plugin, firewall, or WAF is blocking certain user agents or IP ranges, which often catches Googlebot.
- There is a misconfiguration on the server.
If you are getting 403 Forbidden errors on your website, first verify that the pages actually load for users. Visit the pages yourself, and use the URL Inspection tool in Google Search Console to see what Googlebot receives.
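One quick way to test this yourself is to request the page with a normal browser User-Agent and again with Googlebot's, then compare status codes. A minimal sketch in Python (the helper names and the example URL are illustrative, not from this thread; a firewall that filters on more than the User-Agent header may still behave differently for the real Googlebot):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def fetch_status(url, user_agent):
    """Return the HTTP status code the server sends for this User-Agent."""
    req = Request(url, headers={"User-Agent": user_agent})
    try:
        return urlopen(req, timeout=10).status
    except HTTPError as e:
        return e.code  # 403, 404, etc. arrive as exceptions

def diagnose(browser_status, bot_status):
    """Classify the common 403 patterns from the two status codes."""
    if browser_status == 200 and bot_status == 403:
        return "server blocks crawler user-agents (firewall/WAF rule likely)"
    if browser_status == 403 and bot_status == 403:
        return "page is blocked for everyone (permissions or server config)"
    return "no crawler-specific 403 detected"

# Example usage (network call, run against your own site):
# browser = fetch_status("https://example.com/contact-us/", "Mozilla/5.0")
# bot = fetch_status("https://example.com/contact-us/",
#                    "Mozilla/5.0 (compatible; Googlebot/2.1; "
#                    "+http://www.google.com/bot.html)")
# print(diagnose(browser, bot))
```

If the two status codes differ, the server is treating crawlers differently from visitors, which is exactly the "pages load fine but the report shows 403" symptom described above.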
If the pages load fine for users, the errors in the crawl report may be false positives, for example if Googlebot hit a temporary error while crawling. In that case you can usually ignore them; they should clear as Google recrawls. If the pages do not load for users, the errors are real, and you need to fix the underlying issue that is causing the 403 responses.
Here are some steps you can take to fix 403 Forbidden errors:
- Check the permissions on the files and folders that contain the affected pages. Make sure the web server's user account can read them; on Linux, files typically need to be readable by the account the server runs as.
- Check the robots.txt file to make sure Googlebot is not being denied access to the affected pages. (Strictly, a robots.txt block is reported as "Blocked by robots.txt" rather than 403, but it is worth ruling out.)
- Check the server configuration, including .htaccess rules, security plugins, and any firewall or WAF, for rules that could be returning 403 responses to crawlers.
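The robots.txt step above can be checked offline with Python's standard-library `urllib.robotparser`, parsing the rules from a string rather than fetching them. A sketch, using hypothetical rules (substitute your site's actual robots.txt content and URLs):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- replace with your site's actual file.
robots_txt = """
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot may fetch this page: no rule in its group matches.
print(rp.can_fetch("Googlebot", "https://www.example.com/contact-us/"))

# This path is disallowed for Googlebot specifically.
print(rp.can_fetch("Googlebot", "https://www.example.com/private/page/"))
```

Running `can_fetch` against each URL from the crawl report quickly shows whether robots.txt is involved at all, before you start digging into server configuration.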
If you have tried all of these steps and you are still getting 403 Forbidden errors, then you may need to contact your web hosting provider for assistance.
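For the file-permissions step, on a typical Linux host the question is whether the web server process can read the files. A minimal sketch using only the standard library (the document-root path is illustrative; servers that run as the file owner, e.g. with suEXEC, do not need world-readable files):

```python
import os
import stat

def world_readable(path):
    """True if 'other' users (e.g. the web server account) can read the file."""
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IROTH)

# Example usage: scan a document root for files the server may not be
# able to read, a common cause of per-file 403 errors.
# for root, _dirs, files in os.walk("/var/www/html"):
#     for name in files:
#         p = os.path.join(root, name)
#         if not world_readable(p):
#             print("possible 403 source:", p)
```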
Will the 403 errors affect SEO and rankings?
Whether 403 Forbidden errors affect your SEO and rankings depends on which pages are blocked. If the affected pages are important to your website, the errors can have a negative impact, because Google cannot crawl or refresh them. If the affected pages are unimportant, the errors are unlikely to have a significant impact.
It is best to fix 403 Forbidden errors as soon as possible. This will help to ensure that Googlebot can access all of the pages on your website and that your website is crawlable and indexable.
Warm Regards
Rahul Gupta
Suvidit Academy