403 Forbidden Crawl report
-
Hi,
I am getting 403 Forbidden errors in the crawl report for some of my pages, even though the pages load fine when I visit them. My web developer told me that reports sometimes show errors when nothing is actually wrong. Also, will these errors affect SEO/rankings?
Some of the links:
https://www.medistaff24.co.uk/contact-us/
https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/
-
I have a locksmith business website (locksmith in Tampa, Florida), and we are facing the same issue on the main page.
A 403 Forbidden error means that the server denied access to the requested page. This can happen for a few reasons, such as:
- The user does not have permission to access the page.
- A firewall or security plugin is blocking the request, often based on the user agent or IP address.
- There is a misconfiguration on the server.
If you are getting 403 Forbidden errors on your website, first check whether the pages actually load for users by visiting them yourself, and then check how they respond to Googlebot using the URL Inspection tool in Google Search Console. Keep in mind that a page can load fine in your browser yet still return a 403 to Googlebot, because many firewalls and security plugins block requests based on the user agent or IP address.
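One quick way to test this outside Search Console is to request the flagged URLs with both a browser-like user agent and Googlebot's user agent and compare the status codes. Below is a minimal Python sketch using only the standard library; the URL list is taken from the question and the user-agent strings are commonly published values, so treat both as placeholders to adjust. Note that some firewalls verify the real Googlebot by reverse DNS or IP, so a spoofed user agent is an approximation, not proof.

```python
import urllib.error
import urllib.request

# URLs from the question; replace with the pages flagged in your crawl report.
URLS = [
    "https://www.medistaff24.co.uk/contact-us/",
    "https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/",
]

# Compare a browser-like user agent against Googlebot's published one.
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for url in URLS:
    for label, agent in USER_AGENTS.items():
        request = urllib.request.Request(url, headers={"User-Agent": agent})
        try:
            with urllib.request.urlopen(request, timeout=10) as response:
                status = response.status
        except urllib.error.HTTPError as err:
            status = err.code  # 403, 404, etc. arrive as HTTPError
        print(f"{label:>9}: HTTP {status}  {url}")

# A page that returns 200 for "browser" but 403 for "googlebot" is
# blocking the crawler, usually via a firewall or security plugin.
```

If both user agents receive a 403 even though the page works in your own browser, the block is more likely based on your server's firewall rules for specific IP ranges rather than on the user agent.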
If the pages return a 200 status to Googlebot as well as to regular users, the errors in the crawl report are likely false positives caused by a temporary error when Googlebot crawled your website. In that case, you can ignore the errors and they should eventually go away.
However, if the pages are not loading fine for users, then the errors in the crawl report are likely real. In this case, you need to fix the underlying issue that is causing the 403 Forbidden errors.
Here are some steps you can take to fix 403 Forbidden errors:
- Check the permissions on the files and folders that serve the affected pages. Make sure the web server process can read them; on Linux hosts that typically means 644 for files and 755 for directories (see the first sketch below).
- Check any firewall, WAF, or security plugin rules that might be blocking Googlebot's user agent or IP ranges; this is the most common cause of crawler-only 403s. While you are at it, confirm that robots.txt is not blocking the pages either; note that a robots.txt disallow is reported as "Blocked by robots.txt" rather than as a 403 (see the second sketch below).
- Check the server configuration, including .htaccess or equivalent rules, for deny directives that could be returning 403 Forbidden errors.
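For the file-permission check, if you have shell access to the server, a short script can flag files under the document root that the web server cannot read. This is a minimal sketch; the document root path is a hypothetical example and varies by host.

```python
import os
import stat

DOCROOT = "/var/www/html"  # hypothetical path; check your host's document root

for dirpath, _dirnames, filenames in os.walk(DOCROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        mode = os.stat(path).st_mode
        # Web servers typically need world-readable files (644) and
        # world-executable directories (755); flag anything stricter.
        if not mode & stat.S_IROTH:
            print(f"not world-readable: {path} ({oct(mode & 0o777)})")
```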
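To rule out a robots.txt block, Python's standard library includes a robots.txt parser. The sketch below checks the URLs from the question; swap in your own. Remember that a robots.txt disallow shows up in Search Console as "Blocked by robots.txt" rather than as a 403, so this only rules one cause out.

```python
from urllib import robotparser

ROBOTS_URL = "https://www.medistaff24.co.uk/robots.txt"  # site from the question
PAGES = [
    "https://www.medistaff24.co.uk/contact-us/",
    "https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/",
]

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

for page in PAGES:
    verdict = "allowed" if parser.can_fetch("Googlebot", page) else "disallowed"
    print(f"{verdict}: {page}")
```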
If you have tried all of these steps and you are still getting 403 Forbidden errors, then you may need to contact your web hosting provider for assistance.
Will the 403 errors affect SEO/ranking?
As for whether 403 Forbidden errors will affect your SEO and rankings, it depends on a few factors.
If the pages returning 403 Forbidden errors are important pages on your website, the errors can have a real negative impact: Googlebot cannot crawl a page it is blocked from, and pages that persistently return 403 will eventually be dropped from Google's index.
However, if the affected pages are not important to your website, the errors are unlikely to have a significant impact on your SEO and rankings.
Either way, it is best to fix 403 Forbidden errors as soon as possible, so that Googlebot can access all of the pages on your website and your site remains crawlable and indexable.
Warm Regards
Rahul Gupta
Suvidit Academy
Related Questions
PDF Instructions come up in Crawl report as Duplicate Content
Hello, My ecommerce site has many PDF instruction pages that are being marked as duplicate content in the site crawl. Each page has a different title, and then a PDF displayed in an iframe with a link back to the previous page and to the category that the product is placed in. Should I add text to the pages to help differentiate them? I included a screenshot of the code that is on all the pages. Thanks! Justin
On-Page Optimization | JustinBSLW
Massive increase in Moz crawl.
I have a subdomain which has just started to be crawled by Moz; previously this wasn't the case. The subdomain had 16,000+ issues. Why has Moz started to count subdomains as part of the main domain? Has Google started to do this as well?
On-Page Optimization | danwebman
On Page Grade Reports - Which to optimize first?
I'm wanting to prioritize my on-page optimization efforts by doing the work that will have the most impact first. Let's say, hypothetically, that this was my on-page report card:
- Grade A - 60 reports
- Grade B - 20 reports
- Grade C - 70 reports
- Grade D - 70 reports
- Grade F - 300 reports
Where is the biggest opportunity for increasing good traffic?
- Doing more work on Grade A pages to ensure I continue to rank
- Moving mid-grade pages up to high-grade pages (e.g., raising a B to A, or a C to B)
- Moving low-grade pages up to mid-grade pages (e.g., raising a D to C, or an F to D)
On-Page Optimization | justin-brock
I want to check which pages have been crawled
I would like to find out which pages on my site have been crawled by SEOmoz.
On-Page Optimization | seoworx123
Popup windows are coming up as 404 errors in Moz reports.
We have several links showing up as 404 errors because of the way our site is set up. I want to know if this is hurting our ranking because Google sees them as 404 errors, or is it something I can ignore because it works for the user? If it is hurting our quality and therefore our rankings, how can I correct it so that best practices are used? I have attached an image of the links, and here is an example page --> http://www.sourcemedicalequipment.com/Perch-Polyurethane-Industrial-Stool-18-25-p/idst2.htm The links in the description section have anchor text "All 3 Choices" and result in a popup information page. If you cut and paste the URLs that they redirect to, they end in a 404, but if you use the link on the page it results in a popup information window. Hope I explained that well. Thanks for your help!
On-Page Optimization | BenRWoodard
Why does SEOmoz Crawl show that I have 5,769 pages with duplicate content?
Hello... I'm trying to do some analysis on my site (http://goo.gl/JgK1e) and SEOmoz Crawl Diagnostics is telling me that I have 5,769 pages with duplicate content. Can someone please help me understand:
- How does SEOmoz determine if I have duplicate content?
- Is it correct? Are there really that many pages of duplicate content?
- How do I fix this, if true? <---- Most important
Thanks in advance for any help!!
On-Page Optimization | Prime85
How long after a URL starts showing a 404 does Google stop crawling?
Before hiring me to do SEO, a client re-launched their site and did not 301 the old URLs to the new. Only the home page URL stayed the same. For a month after the re-launch, the old URLs returned a 404. For the next month, all 404 pages (basically any non-existent URL) were 301'd to the home page. Finally, 2 months after launching, they properly 301'd the old URLs to the new. Now, the new URLs are not ranking well. I assume it's too late to realize any benefit from the 301's, just checking to see if anybody has any insight into how long Google keeps trying to crawl old/404/improperly 301'd URLs. Thanks!
On-Page Optimization | AndrewMiller
How would you deal with blog TAG & CATEGORY listings that are marked as 'duplicate content' in SEOmoz campaign reports?
We're seeing "Duplicate Content" warnings / errors in some of our clients' sites for blog / event calendar tags and category listings. For example the link to http://www.aavawhistlerhotel.com/news/?category=1098 provides all event listings tagged to the category "Whistler Events". The Meta Title and Meta Description for the "Whistler Events" category is the same as another other category listing. We use Umbraco, a .NET CMS, and we're working on adding some custom programming within Umbraco to develop a unique Meta Title and Meta Description for each page using the tag and/or category and post date in each Meta field to make it more "unique". But my question is .... in the REAL WORLD will taking the time to create this programming really positively impact our overall site performance? I understand that while Google, BING, etc are constantly tweaking their algorithms as of now having duplicate content primarily means that this content won't get indexed and there won't be any really 'fatal' penalties for having this content on our site. If we don't find a way to generate unique Meta Titles and Meta Descriptions we could 'no-follow' these links (for tag and category pages) or just not use these within our blogs. I am confused about this. Any insight others have about this and recommendations on what action you would take is greatly appreciated.
On-Page Optimization | RoyMcClean