Subdomain 403 error
-
Hi Everyone,
A crawler from our SEO tool detects a 403 error on links from our main domain to a couple of subdomains. However, these subdomains are perfectly accessible in a browser. What could be the problem? Is this error caused by the server, the crawl bot, or something else?
I would love to hear your thoughts.
Jens -
No, not at all.
-
Hi Roman,
Thanks for your answer!
It's a commercial tool.
I checked the robots.txt file and .htaccess, but didn't see any problems.
As you say, the problem may just be caused by the user-agent. If so, this won't affect my SEO efforts, right?
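For anyone else double-checking their robots.txt the same way, running it through a parser is more reliable than eyeballing it. A minimal sketch with Python's standard-library robots.txt parser (the rules shown are a hypothetical example of a file that blocks only Moz's crawler; substitute your real file):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: blocks Moz's crawler, allows everyone else
robots_txt = """\
User-agent: rogerbot
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether a given user-agent may fetch a given URL
blocked = parser.can_fetch("rogerbot", "https://example.com/page")
allowed = parser.can_fetch("Googlebot", "https://example.com/page")
print(blocked, allowed)  # rogerbot is disallowed, Googlebot is not
```

Note that a robots.txt disallow on its own does not produce a 403 response; it only tells well-behaved crawlers to stay away. A genuine 403 comes from the server, which is why .htaccess and hosting-level blocks are the other things to check.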
-
Which tool are you using? Is it a custom tool, or a commercial one such as Screaming Frog?
-
4XX codes are all client errors: something is wrong with the request itself, so whatever is happening, the issue is typically on the client side.
403 means Forbidden: the server understood the request but refuses to fulfil it. So in your case, the first place to check is your .htaccess and your robots.txt file; make sure they are not blocking any crawler, or at least not the crawler of your tool.
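For illustration, an .htaccess rule like the following (a hypothetical example, not necessarily what your server contains) would return a 403 to Moz's crawler while leaving normal browsers unaffected:

```apache
# Hypothetical rule: deny any request whose User-Agent contains "rogerbot"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} rogerbot [NC]
RewriteRule .* - [F]
```

The `[F]` flag sends a 403 Forbidden response, and `[NC]` makes the user-agent match case-insensitive.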
For example, some hosting providers block all crawlers that are not Google or Bing to save resources, so it is common for Roger (the Moz crawler) to have problems crawling a page that is blocked on the server side. Moz, Ahrefs, and Semrush often run into this kind of problem. In summary:
- Make sure your .htaccess and your robots.txt are not blocking your crawler
- Make sure your hosting provider is not blocking your crawler
- If neither of the above works, try modifying the user-agent of your tool
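To see how a server-side user-agent block produces exactly this symptom (403 for the crawler, 200 for a browser), here is a self-contained sketch using a local test server; the blocked bot names are a hypothetical blocklist, not anyone's real configuration:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class UABlockingHandler(BaseHTTPRequestHandler):
    # Hypothetical blocklist, the kind some hosts apply to save resources
    BLOCKED = ("rogerbot", "ahrefsbot", "semrushbot")

    def do_GET(self):
        ua = self.headers.get("User-Agent", "").lower()
        # Return 403 if the user-agent looks like a third-party SEO crawler
        self.send_response(403 if any(bot in ua for bot in self.BLOCKED) else 200)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

def status_for(ua, port):
    """Fetch the local test page with the given User-Agent and return the HTTP status."""
    req = urllib.request.Request(f"http://127.0.0.1:{port}/",
                                 headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

server = HTTPServer(("127.0.0.1", 0), UABlockingHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

blocked = status_for("rogerbot/1.2", port)   # the crawler is refused
allowed = status_for("Mozilla/5.0", port)    # a normal browser gets through
print(blocked, allowed)
server.shutdown()
```

This is also why the site looks "perfectly accessible" in a browser while the tool reports a 403: the server is answering differently depending on who is asking.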
Hope this info helps you with your problem
Related Questions
-
Duplicate content and 404 errors
I apologize in advance, but I am an SEO novice and my understanding of code is very limited. Moz has issued a lot (several hundred) of duplicate content and 404 error flags on the ecommerce site my company takes care of. For the duplicate content, some of the pages it says are duplicates don't even seem similar to me. Additionally, a lot of them are static pages where we embed images of size charts that we use as popups on item pages. It says these issues are high priority, but how bad is this? Is this just an issue because, if pages have similar content, the engine spider won't know which one to index? Also, what is the best way to handle these URLs bringing back 404 errors? I should probably have a developer look at these issues, but I wanted to ask the extremely knowledgeable Moz community before I do 🙂
Technical SEO | | AliMac260 -
Error after scanning with browseo.net
Good day! I have done a scan on my site with browseo.net (and a few other similar scanners) and got the mess seen in the screenshot. I've tried deleting all the files in the website folder and replacing them with a single image file, but it still shows the same error. What could this mean and should I be worried?
P.S. Found my answer after contacting the helpful support of browseo.net: It took me some time to figure out what was going on, but it seems as if you are mixing content types. Browsers are quite smart when it comes to interpreting the contents, so they are much more forgiving than we are. Browseo crawls your website and detects that you are setting utf-8 as part of the meta information. By doing so, it converts the content into a different character encoding than what it is supposed to be. In a quick test, I tried to fetch the content type based on the response object, but without any success. So I am suspecting that in reality your content is not utf-8 encoded when you parse it into Joomla. The wrong character type is then carried over for the body (which explains why we can still read the header information). All of this explains the error. In order for it to work in browseo, you'd have to set the content type correctly, or convert your own content into utf-8 before parsing. It may be that you are either storing this incorrectly in the database (check your db settings for a content type other than utf-8) or that other settings are a bit messed up. The good news is that Google is probably interpreting your website correctly, so you won't be punished for this, but perhaps something to look into… From Paul Piper
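The mismatch the support reply describes (bytes stored in one encoding but declared or decoded as another) can be reproduced in a few lines; a sketch using a hypothetical sample string:

```python
# Hypothetical sample text, to show both directions of the charset mismatch
original = "Señor café"

# Stored as latin-1 but parsed as utf-8: the bytes are not valid utf-8,
# so the parser emits replacement characters
latin1_bytes = original.encode("latin-1")
garbled = latin1_bytes.decode("utf-8", errors="replace")

# Stored as utf-8 but parsed as latin-1: classic "Ã±"-style mojibake
utf8_bytes = original.encode("utf-8")
mojibake = utf8_bytes.decode("latin-1")

print(garbled)   # Se�or caf�
print(mojibake)  # SeÃ±or cafÃ©
```

Either symptom in a crawler's rendering of a page is a strong hint that the declared `charset` does not match how the content is actually stored.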
Technical SEO | | AlexElks0 -
"Ghost" errors on blog structured data?
Hi, I'm working on a blog whose Search Console account advises me about a big bunch of errors in its structured data: Structured data - graphics, Structured data - hentry list, Structured data - detail. But I go to https://developers.google.com/structured-data/testing-tool/ and it tells me "all is ok": Structured data - test. Any clue? Thanks in advance,
Technical SEO | | Webicultors0 -
4XX client error
I am a bit confused... my recent site crawl told me I had one 4XX client error (high priority). This is the page...
Technical SEO | | sdwellers
http://www.seadwellers.com/wp-content/uploads/2014/06/367679d2+0+277-SD.mp4 This link below is listed as the "linking page"... I guess that's where the link comes from?
http://www.seadwellers.com/category/dive-travel/ I'm just not getting this... where did the page in the first link above come from, and what is the deal with the category/dive-travel/ page? And how do I fix it? Any guidance would be greatly appreciated...0
Will deleting Wordpress tags result in 404 errors or anything?
I want to clean up my tags, and I'm worried I'll look at my Webmaster Tools the next day and see hundreds of errors. What's the best way of doing this?
Technical SEO | | howlusa0 -
UK and US subdomain. Can both rank for some keyword terms?
One of my clients has one root domain http://www.website.com and there are two versions, the US and the UK. So there are two subdomains, uk.website.com and us.website.com. Both subdomains contain similar content/landing pages and are going after the same keywords. One site is supposedly crawled by UK crawlers but still shows up in US-based SERPs. Will Google take into account that both subdomains are going for the same keyword terms and only rank one of them? How is this kind of thing handled?
Technical SEO | | C-Style0 -
Subdomain mozTrust - can other parked domains affect that?
Hi, I have my domain www.mydomain.com and it has domain authority 26, domain mozRank around 3, domain mozTrust 1.63, page authority 31, Google PR 2.0, etc. So I am not at the very bottom of the scores, but my SUBDOMAIN MOZTRUST is only 0.961, and I've checked other websites that I made some time ago and they have it at around 4.0. So it is quite bad. I have some domains parked within my hosting package. They have different names like www.mydomain2.co.uk, www.mydomain3.com, etc. I can access those domains as well by typing mydomain2.mydomain.com or mydomain3.mydomain.com, and I have some testing subdomains there too (in case I need to test something like Drupal, WordPress, a test shopping cart, etc.). Can that fact affect my subdomain rank? That is, because I have those domains parked there, and I've made some subdomains that are not in use, that nobody links to, and that are visible in Google?
Technical SEO | | sever3d0 -
Subdomains at Yola, Blogger, Wordpress
If the purpose of constructing a site or blog is SEO, i.e. a linking microsite, is it better to keep it as a subdomain or to register it on its own domain? The question is how much of the Domain Authority of that site will flow through the subdomain to the linked site. I note that these subdomains have a PA of 1; does this answer my own question?? Thanks. E.g. widgets.yolasite.com or widgets.wordpress.com
Technical SEO | | seanmccauley0