Error 403
-
Hi SEOmoz community,
Today I checked the Google Webmaster Tools account of one of my clients, and there are 18 403 errors. This is the first time I've come across these errors, so I was wondering how to fix them. How can I avoid them in the future?
Thank you,
-
Those are "access denied" errors. In other words, your site is displaying links to pages that can only be accessed if you're logged in. Don't worry, this isn't a really big deal.
To fix it, you would need to use a server-side language, like PHP or ASP, to check whether a user is logged in before outputting the links. One really elegant way to do this is to change the destination of the link depending on whether the user is logged in. If they're not logged in, send them to a registration page telling them the benefits of signing up. This could really boost your conversion rates. Of course, if the link is to an admin section this wouldn't apply.
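A minimal sketch of that idea, shown in Python for illustration (the same conditional logic applies in PHP or ASP); the page paths here are hypothetical placeholders, not anything from your site:

```python
def link_target(is_logged_in: bool) -> str:
    """Return the destination for a members-only link.

    Logged-in users go to the members area; everyone else is sent
    to a registration page pitching the benefits of signing up,
    so visitors never click through to a 403 page.
    """
    # Hypothetical paths -- substitute your site's real URLs.
    return "/members/dashboard" if is_logged_in else "/register"

print(link_target(True))   # /members/dashboard
print(link_target(False))  # /register
```

The key point is that the decision happens on the server before the link is rendered, so crawlers and logged-out visitors are never shown a URL that would return 403.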
Related Questions
-
WordPress tags error in Moz
Hi, We are getting an enormous number of "missing meta description" errors, but only on our tag archives. When we post, we fill in a description and are using the Yoast plugin (getting green lights). Now we are finding that the tags are creating a separate URL for each tag, and each of those URLs has a missing description even though the post itself has a full description. What is the best thing to do? 1. Block spiders from crawling tags? 2. Stop using tags? 3. What do you suggest? Thank you
Intermediate & Advanced SEO | WalterHalicki -
Thousands of 503 errors in GSC for pages not important to organic search - Is this a problem?
Hi, folks A client of mine now has roughly 30,000 503 errors (found in the crawl error section of GSC). These are mostly pages with limited offers and deals. The 503 error seems to occur when the offers expire and the page is of no use anymore. These pages are not important for organic search; they get traffic mostly from direct visits and newsletters. My question: does having a high number of 503 pages reported in GSC constitute a problem for the organic ranking of the domain and of the category and product pages (the pages that I want to rank for organically)? If it does, what is the best course of action to mitigate the problem? Looking forward to your answers 🙂 Sigurd
Intermediate & Advanced SEO | Inevo -
URL errors in Webmaster Tools for pages that don't exist?
Hello, for some time now we have had URLs showing up in Google Webmaster Tools as 404 errors, but they don't exist on our website, and never have. Here's an example: cosmetic-dentistry/28yearold-southport-dentist-wins-best-young-dentist-award/801530293 (the root being goo.gl/vi4N4F). Really confused about this. We have recently moved our website to WordPress. Thanks Ade
Intermediate & Advanced SEO | popcreativeltd -
VisitSweden indexing error
Hi all Just got a new site up about weekend travel for VisitSweden, the official tourism office of Sweden. Everything went just fine except some issues with indexing. The site can be found at weekend.visitsweden.com/no/ For some weird reason the "frontpage" of the site does not get indexed. What I have done myself to find the issue: added sitemaps.xml, configured and added the site to Webmaster Tools, and checked the 301s so they are not faulty. By doing a simple site:weekend.visitsweden.com/no/ you can see that the frontpage is simply not in the index. Also, by doing a cache:weekend.visitsweden.com/no/ I see that Google tries to index the page without the trailing /no/ for some reason: http://webcache.googleusercontent.com/search?q=cache:http://weekend.visitsweden.com/no/ Any smart ideas on how to get this fixed, or where to start looking? All help greatly appreciated Kind regards Fredrik
Intermediate & Advanced SEO | Resultify -
How do I get rid of my errors for Schema.org?
I put the Schema.org data on my item pages and it works great. However, when an item closes, the price is removed, which left an empty price element and caused an error. About two weeks ago the site was changed so that when an item closes the price component is removed entirely, but it is still showing a lot of errors. Any ideas?
Intermediate & Advanced SEO | EcommerceSite -
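One common approach (a sketch, not a confirmed fix for this particular setup) is to keep the Offer markup for closed items but mark them as sold out via Schema.org's availability property, instead of dropping the price element. Shown here as JSON-LD for brevity; the values are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "Offer",
  "price": "19.99",
  "priceCurrency": "USD",
  "availability": "https://schema.org/SoldOut"
}
```

This keeps the markup structurally complete, so validators don't see an Offer with an empty or missing price.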
What to do when a sidebar is causing "Too Many On-Page Links" errors
I have been going through all the errors and warnings from my weekly SEOmoz scans. One thing I'm seeing a bit of is "Too Many On-Page Links". I've only seen a few, but in the case of this one, http://blog.mexpro.com/5-kid-friendly-cancun-mexico-resorts, there are only 2 links in the post itself (the image and the read more), so I think the sidebar links are causing the error. I feel my tags are important to help readers find information they may be looking for. Is there a better way to present tags than the WordPress tag cloud? Should I exclude the tags, at the risk of making things more difficult for my users? Thanks for your help.
Intermediate & Advanced SEO | RoxBrock -
How important is it to fix Server Errors?
I know it is important to fix server errors. We are trying to figure out how important, because after our last build we have over 19,646 of them, and since Google only gives us 1,000 at a time, the fastest way to tell them we have fixed them all is to use the API, which will take time. We are trying to decide: is it more important to fix all these errors right now, or to focus on other issues and fix them when we have time? They are mostly AJAX errors. Could this hurt our rankings? Any thoughts would be great!
Intermediate & Advanced SEO | DoRM -
Does Google penalize for having a bunch of Error 404s?
If a site removes thousands of pages in one day, without any redirects, is there reason to think Google will penalize the site for this? I have thousands of subcategory index pages. I've figured out a way to reduce the number, but it won't be easy to put redirects in place for the ones I'm deleting; they will just disappear. There's no link juice issue: these pages are only linked internally and indexed in Google, and nobody else links to them. Does anyone think it would be better to remove the pages gradually over time instead of all at once? Thanks!
Intermediate & Advanced SEO | Interesting.com
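If the pages are gone for good, one option worth knowing about is serving 410 Gone instead of the default 404, which signals that the removal is deliberate. A minimal sketch, assuming an Apache server and a hypothetical path prefix:

```apache
# .htaccess -- mark a removed subcategory tree as permanently gone.
# "/subcategories/" is a placeholder; use your site's real path prefix.
Redirect gone /subcategories/
```

This uses mod_alias's `Redirect gone` directive; on other servers the equivalent is whatever mechanism returns a 410 status for the removed URL pattern.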