Better to 301 or de-index 403 pages
-
Google WMT recently flagged a large number of old, unpublished pages as "access denied" (403) errors. The pages are tagged "noindex, follow," and these old pages are still in Google's index.
At this point, would it be better to 301 all these pages, submit an index removal request, or something else? Thanks... Darcy
-
Sounds solid. Thanks, Dirk!
-
The main reason errors are listed is so that you can fix them (if necessary). If these are old pages with no existing links pointing to them from your site, you can simply ignore the warnings. However, if the warnings appear because live pages are linking to non-existent pages, that degrades the user experience, and user experience is a factor which counts for SEO.
If you look at the 403 errors, WMT normally lists how the bot got to these pages. If the pages that are linking to these 403 pages are still on your site, you have to remove those links.
If you have seen a drop in traffic, you could do a full crawl of your site using Screaming Frog or Xenu for a quick check-up of the technical health of your site.
If you still have an old sitemap, or a list of the most popular pages in Google Analytics from the period before the migration, you could also use those URLs as input for Screaming Frog and check whether all pages were properly redirected. If errors pop up, those are the ones I would redirect. I understood from your initial question that the 403s were coming from very old pages which were never meant to be accessible.
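If you'd rather script that check than run a full crawler, a minimal sketch could look like the following. This is not Dirk's exact workflow, just one way to do it: the input file name is a placeholder, and the "anything ending in 4xx/5xx is a redirect candidate" rule is a simplifying assumption.

```python
import urllib.request
import urllib.error


def final_status(url: str, timeout: float = 10.0) -> int:
    """Follow any redirect chain and return the final HTTP status code."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx responses raise and land here


def needs_redirect(status: int) -> bool:
    """Old URLs that end on 403/404/5xx after migration are 301 candidates."""
    return status >= 400


# Example usage (requires network; "old_sitemap_urls.txt" is a placeholder
# for URLs pulled from an old sitemap or a Google Analytics export):
#   for line in open("old_sitemap_urls.txt"):
#       url = line.strip()
#       if url and needs_redirect(final_status(url)):
#           print("redirect candidate:", url)
```

Anything the script flags would map to Dirk's "errors that pop up" — those are the URLs worth redirecting.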
rgds
Dirk
-
Hi Dirk,
Thanks for the message. You may be right. The thing is, GWT's discovery of this large number of now-blocked pages (previously indexed) seems to have coincided with a big drop in search traffic overall.
The part I wonder about is: if these now-blocked 403 pages are no problem and Google will just figure it out, why does it bother to list them as errors? Is it just in case you didn't know, while Google doesn't in fact care one way or the other search-wise and it won't affect your other pages? Just wondering. Thanks... Darcy
-
It's not really necessary to 301 these pages - a 403 status code informs Google that access is denied (literally: "The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.").
Normally these pages will disappear from WMT after a while. If you find these 403s annoying in your WMT reports, you can always 301 them - but this isn't strictly necessary.
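If you do decide to 301 them anyway, a minimal .htaccess sketch on Apache might look like this. The paths and domain are placeholders - map each old URL to the closest relevant live page rather than blindly sending everything to the homepage.

```apache
# Permanently redirect a retired page to its closest live equivalent.
Redirect 301 /old-unpublished-page/ https://www.example.com/current-page/

# Or send a whole retired section to one destination:
RedirectMatch 301 ^/old-section/.* https://www.example.com/new-section/
```

You can verify the result with `curl -I https://www.example.com/old-unpublished-page/` and check that the response is `301` with the expected `Location` header.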
Removal tool - Google's advice is not to use the tool "to clean up cruft, like old pages that 404" (source: https://support.google.com/webmasters/answer/1269119?hl=en).
rgds
Dirk