403s: Are There Instances Where 403s Are Common & Acceptable?
-
Hey All,
Both Moz and Google Webmaster Tools have identified 403 errors on an editorial site I work with (built on the Drupal CMS). I looked into the errors, and the pages triggering the 403s are all articles in draft status that are not being indexed. If I'm not logged into our Drupal admin and I try to access an article in draft status, I get the 403 Forbidden error.
Are these 403s typical for an editorial site, where editors may be trying to access an article in draft status while they are not logged in? Webmaster Tools is showing roughly 350 pages with the 403 'Access Denied' status.
Are these harmful to rankings?
Thanks!
-
Hi guys and girls,
On a related note, is there any problem for SEO that you can see with banning spam referrers through .htaccess?
In these instances, when a spam bot comes to the site we throw a "Forbidden: You don't have permission to access / on this server." response, so basically a 403 error.
We're considering this as a more permanent alternative to using filters in Google Analytics.
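For reference, the kind of rule we're considering looks roughly like this (a sketch only; the spam domains below are placeholders, not our real list):

```apache
# Hypothetical .htaccess sketch: answer with 403 Forbidden when the
# Referer header matches a known spam domain. Requires mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_REFERER} spamdomain1\.com [NC,OR]
RewriteCond %{HTTP_REFERER} spamdomain2\.xyz [NC]
# [F] sends a 403 response and stops processing for the request.
RewriteRule .* - [F]
```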
What do you think?
Thanks,
Gill.
-
Thanks Nicolas & Ruth. I realized the reason the pages are triggering errors is that there are published pages linking to those posts in draft. I imagine those draft posts were live at some point and were then put back into draft.
I'm guessing that if I remove all the links to those draft posts, that will fix the issue.
-
I agree that it's probably not a huge problem, but still something to clean up if you can - it would be best if crawlers weren't trying to access these pages.
-
If Moz and Webmaster Tools are showing the 403 errors, it means they are able to crawl the URLs that return the 403 - so somewhere on your site, or elsewhere on the web, pages that are accessible to bots and crawlers are linking to these pages that aren't live yet. Having a bunch of errors on your site can hurt Google's ability to crawl it well, which can in turn affect your rankings, so it's best to get those cleaned up. In Webmaster Tools you should be able to click on each erroring page to see which pages are linking to it, so you can remove those links; you can do the same using a tool like Screaming Frog if you'd prefer. Good luck!
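If you export the inlinks data from a crawler, it's also easy to script the lookup. Here's a minimal sketch in Python (the CSV columns and URLs below are hypothetical, loosely modeled on a crawler's inlinks export, not an exact Screaming Frog format):

```python
import csv
import io

# Hypothetical sample export: each row is a link from a source page to a
# target URL, plus the HTTP status the target returned when crawled.
sample = """source,target,status
https://example.com/live-article,https://example.com/draft-1,403
https://example.com/live-article,https://example.com/other,200
https://example.com/category/news,https://example.com/draft-2,403
"""

def links_to_403s(csv_text):
    """Return {403 target URL: [pages that link to it]} from an export."""
    offenders = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["status"] == "403":
            offenders.setdefault(row["target"], []).append(row["source"])
    return offenders

print(links_to_403s(sample))
```

That gives you, for each draft URL returning a 403, the list of published pages whose links need removing.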
-
Hello,
No problem with these URLs: Drupal correctly returns a 403 because crawlers are trying to fetch draft content. Google and other search engines will not penalize you for that; think of the cart or account URLs on e-commerce websites that may return a 403 in the same way.