Better to 301 or de-index 403 pages?
-
Google WMT recently found and flagged a large number of old, unpublished pages as access-denied errors. The pages are tagged "noindex, follow," and these old pages are still in Google's index.
At this point, would it be better to 301 all these pages, submit an index removal request, or something else? Thanks... Darcy
-
Sounds solid. Thanks, Dirk!
-
The main reason errors are listed is so that you can fix them (if necessary). If these are old pages with no remaining links from your live pages, you can simply ignore the warnings. However, if the warnings appear because actual pages are linking to non-existent pages, that degrades the user experience, and user experience is a factor that counts for SEO.
If you look at the 403 errors, WMT normally lists how the bot got to these pages. If the pages linking to these 403 pages are still on your site, you should remove those links.
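If you want to double-check this outside of WMT, a quick script can do the same job. Here's a minimal Python sketch (assuming the requests and beautifulsoup4 packages are installed; the URL is a hypothetical placeholder) that lists which links on a given page respond with a 403:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def find_403_links(page_url):
    """Print every link on page_url that responds with a 403."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(page_url, a["href"])  # resolve relative links
        status = requests.head(link, allow_redirects=False, timeout=10).status_code
        if status == 403:
            print(f"{page_url} links to {link} (403)")

find_403_links("https://www.example.com/some-page")  # hypothetical URL
```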
If your traffic has dropped, you could run a full crawl of your site using Screaming Frog or Xenu for a quick check-up of its technical health.
If you still have an old sitemap, or the most popular pages in Google Analytics from the period before the migration, you could also use those URLs as input for Screaming Frog and check whether all pages were properly redirected. If errors pop up, those are the ones I would redirect. I understood from your initial question that the 403s were coming from very old pages that were never meant to be accessible.
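For the redirect check specifically, you don't even need a full crawler. A minimal Python sketch (assuming the requests package; the URLs are hypothetical placeholders) that takes the old URLs and reports how each one responds:

```python
import requests

# Old URLs pulled from the pre-migration sitemap or Google Analytics
old_urls = [
    "https://www.example.com/old-page-1",  # hypothetical URLs
    "https://www.example.com/old-page-2",
]

for url in old_urls:
    r = requests.head(url, allow_redirects=False, timeout=10)
    if r.status_code in (301, 302):
        print(f"{url} -> {r.headers.get('Location')} ({r.status_code})")
    else:
        print(f"{url} returned {r.status_code} - worth a closer look")
```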
rgds
Dirk
-
Hi Dirk,
Thanks for the message. You may be right. The thing is, GWT's discovery of this large number of now-blocked pages (previously indexed) seems to have coincided with a big drop in search traffic overall.
I guess the part I wonder about is this: if these now-blocked 403 pages are no problem and Google will just figure it out, why does it bother to list them as errors? Is it just in case you didn't know, while Google in fact doesn't care one way or the other search-wise and it won't affect your other pages? Just wondering. Thanks... Darcy
-
It's not really necessary to 301 these pages - a 403 status code informs Google that access is denied (literally: "The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.").
Normally these pages will disappear from WMT after a while. If you find these 403s annoying in your WMT reports, you can always 301 them, but it isn't strictly necessary.
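If you do decide to 301 them anyway, it only takes a small mapping of old paths to new destinations. A purely illustrative Flask sketch (the paths are hypothetical; a server-level rewrite rule would normally handle this instead):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping: old path -> new destination
OLD_PATHS = {
    "/old-unpublished-page": "/",
    "/another-retired-page": "/archive",
}

@app.route("/<path:path>")
def old_or_forbidden(path):
    target = OLD_PATHS.get("/" + path)
    if target:
        return redirect(target, code=301)  # permanent redirect
    return "Forbidden", 403                # otherwise keep serving the 403
```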
Removal tool - Google's advice is not to use the tool "to clean up cruft, like old pages that 404" (source: https://support.google.com/webmasters/answer/1269119?hl=en).
rgds
Dirk