Error 403
-
Hi SEOmoz community,
Today I checked the Google Webmaster Tools account of one of my clients, and there are 18 "403" errors. How do I fix those? It's the first time I've come across these errors. How can I avoid them in the future?
Thank you,
-
Those are "access denied" errors. In other words, you are displaying links on your site to pages that can only be accessed by logged-in users. Don't worry, this isn't a big deal.
To fix it you would need to use a server-side language, like PHP or ASP, to check whether a user is logged in before outputting the links. One really elegant way to do this is to change the destination of the link depending on whether the user is logged in. If they're not, send them to a registration page explaining the benefits of signing up; this could really boost your conversion rates. Of course, if the link points to an admin section this wouldn't apply.
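A minimal PHP sketch of the idea above. The function, session key, and paths are hypothetical (in WordPress you would use `is_user_logged_in()` instead); adapt them to your own auth system:

```php
<?php
// Hypothetical auth check; replace with your own logic
// (e.g. is_user_logged_in() in WordPress).
function user_is_logged_in(): bool {
    return !empty($_SESSION['user_id']);
}

// Point the link at the members area for logged-in users,
// and at a registration page for everyone else, so anonymous
// visitors never click through to a 403.
$target = '/members/dashboard';
$href = user_is_logged_in()
    ? $target
    : '/register?next=' . urlencode($target);

echo '<a href="' . htmlspecialchars($href) . '">Members area</a>';
```

Since the registration URL carries the original destination in a query parameter, you can send the user on to the members page right after they sign up.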
Related Questions
-
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexing in Google. Help!
Hi, this site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be giving a 503 error to the Google bot, although I can see the site fine. I have checked the source code, checked robots.txt, and did have a sitemap parameter but removed it for testing. GWMT is showing 'unreachable' if I submit a sitemap or fetch. Any ideas on how to remove this error? Many thanks in advance.
Intermediate & Advanced SEO | SolveWebMedia
-
What is the best way to correct 403 access denied errors?
One of the domains I manage is seeing a growing number of 403 errors. For SEO purposes, would it be ideal to just 301 redirect them? I am plenty familiar with 404 error issues, but not 403s.
Intermediate & Advanced SEO | RosemaryB
-
Rankings gone, no WMT errors, help!
Hi, a client's Google rankings have been seriously hit. We have done everything we know of to find out why, and there is no obvious explanation. The client dominated their search terms, and they are now down on page 7/8 for those terms. There are no errors in WMT, so we cannot submit a reconsideration request. This is a genuine client and their business has been seriously affected. Can anybody offer help? Thanks in advance!
Intermediate & Advanced SEO | roadjan
-
Duplicate errors from WordPress login redirects
I have some duplicate issues showing up in Moz Analytics which are due to a Q&A plugin used on a WordPress website that prompts the user to log in. There are a number of links, like the one shown below, which lead to the login page: www.website.com/wp-login.php?redirect_to=http%3A%2F%2Fwww.website.com%question%2.... What's the best way to deal with this? Extra info: this is only showing up in Moz Analytics; Google Webmaster Tools reports no duplicates. I'm guessing this may be down to the 'redirect_to' parameter being effective in grouping the URLs for Googlebot. Currently wp-login and the consequent redirects are 'noindex, follow'; I cannot see where this is being generated in wp-login.php to change it to nofollow (if that would solve it).
Intermediate & Advanced SEO | GregDixson
-
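One common way to handle login-URL noise like the wp-login case above (not necessarily what Moz Analytics keys on, and separate from the noindex already in place) is to keep crawlers away from the login page and all of its redirect_to variants via robots.txt:

```
# robots.txt - keep crawlers out of the WordPress login page.
# Disallow matches by URL prefix, so this also covers every
# /wp-login.php?redirect_to=... variant.
User-agent: *
Disallow: /wp-login.php
```

Note that robots.txt stops crawling rather than indexing, so this complements, not replaces, the 'noindex' meta on those pages.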
Robots.txt error message in Google Webmaster from a later date than the page was cached, how is that?
I have error messages in Google Webmaster that state that Googlebot encountered errors while attempting to access the robots.txt. The last date that this was reported was on December 25, 2012 (Merry Christmas), but the last cache date was November 16, 2012 (http://webcache.googleusercontent.com/search?q=cache%3Awww.etundra.com/robots.txt&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a). How could I get this error if the page hasn't been cached since November 16, 2012?
Intermediate & Advanced SEO | eTundra
-
How do I best deal with pages returning 404 errors when they have inbound links from other sites?
I have over 750 URLs returning 404 errors. The majority of these pages have backlinks from other sites; however, the credibility of those linking pages, from what I can see, is somewhat dubious: mainly forums and sites with low DA and PA. It has been suggested to place 301 redirects on these pages, a nice easy solution, but I am concerned that we could do more harm than good to our site's credibility and link-building strategy going into 2013. I don't want to redirect these pages if it's going to cause a Panda/Penguin problem. Could I request manual removal or something of that nature? Thoughts appreciated.
Intermediate & Advanced SEO | Towelsrus
-
External links point to 403 page - how to 301 redirect if no file extension?
Hi guys, after moving from an old static .htm site to WordPress, I 301'd all the old .htm URLs to the new trailing-slash folder-style /wordpress-urls/ in .htaccess with no problem. But Google Webmaster Tools tells me I still have hundreds of external links pointing to a similar version of the old URLs (but without the .htm), giving lots of not-founds and 403s. Example of a URL linked to that 403s: http://www.mydomain.com/filename. So I'm wondering how I do a 301 redirect from a non-existing URL that also has no file extension and isn't a folder either. This seems like a lot of possible external link juice to lose. Thanks!
Intermediate & Advanced SEO | emerald
-
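One way to sketch the extensionless-URL case above in .htaccess (Apache mod_rewrite), assuming each old /filename maps one-to-one onto the /filename.htm URL that is already being redirected; the pattern is illustrative:

```
# .htaccess sketch - send extensionless requests like /filename
# through the same mapping as the old /filename.htm URLs.
RewriteEngine On
# Only rewrite if no real file or directory matches the request.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Match a single path segment with no dot (so real files and
# folders are untouched) and 301 it to the .htm version.
RewriteRule ^([^/.]+)$ /$1.htm [R=301,L]
```

The redirected request then falls into the existing .htm to /wordpress-urls/ rules, so each old external link resolves through a chain of two 301 hops, which still passes link equity.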
Rich Snippets Publisher errors
Hi all. I happened to be doing a bit of testing with some of our microformat and microdata markup when I noticed our linked Google+ publisher markup has stopped working. It definitely was working, and nothing's changed, but now we are flagging errors, and I've noticed some of our competitors have the same problem. Publisher linked Google+ page = https://plus.google.com/103929635387487847550
Error: This page does not include verified publisher markup. Learn more.

If I actually add a duplicate rel="publisher" then I get the following results:

Extracted Author/Publisher for this page:
publisher linked Google+ page = https://plus.google.com/103929635387487847550
Error: This page does not include verified publisher markup. Learn more.
publisher linked Google+ page = https://plus.google.com/103929635387487847550/

The second line doesn't seem to flag an error? I know this is still all pretty new, so is anyone else having problems or odd results, or is Google having some problems? All our other rich snippets, such as reviews, are working fine; it just seems to be the publisher bit. Cheers, Steve
Intermediate & Advanced SEO | sjr4x4
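For reference, the publisher markup Google's testing tool looked for at the time was a single link element in the page head, pointing at the Google+ page (URL taken from the post above), with the Google+ profile linking back to the site's domain to complete verification:

```
<head>
  <!-- One rel="publisher" link per page; duplicates can confuse
       the rich snippets testing tool, as described above. -->
  <link rel="publisher" href="https://plus.google.com/103929635387487847550" />
</head>
```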