Send noindex, noarchive with 410?
-
My classifieds site returns a 410 along with an X-Robots-Tag HTTP header set to "noindex,noarchive" for vehicles that are no longer for sale. Google, however, apparently refuses to drop these vehicles from their index (at least as reported in GWT). By returning a "noindex,noarchive" directive, am I effectively telling the bots "yeah, this is a 410 but don't record the fact that this is a 410", thus effectively canceling out the intended effect of the 410?
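For reference, here is a minimal sketch of the setup described in the question, assuming a Node/Express backend; the route shape and the findListing lookup are hypothetical stand-ins for whatever the site actually uses:

```typescript
import express from "express";

const app = express();

// Hypothetical lookup; expired or unknown listings return nothing.
function findListing(id: string): { sold: boolean } | undefined {
  return undefined;
}

app.get("/vehicles/:id", (req, res) => {
  const listing = findListing(req.params.id);
  if (!listing || listing.sold) {
    // 410 Gone plus the robots directives from the question.
    res
      .status(410)
      .set("X-Robots-Tag", "noindex, noarchive")
      .send("This vehicle is no longer for sale.");
    return;
  }
  res.send("...render the live listing...");
});

app.listen(3000);
```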
-
That sounds good. Let me know if you have further questions; I'm always glad to help!
-
Thanks for the info, mememax. I don't relish the thought of using the removal tool, but I suppose I can actually 301-redirect many of those 410s to category pages and then use the GWT removal tool for the rest.
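A sketch of that split, assuming (hypothetically) that each expired listing can be mapped to a category page; listings with a sensible category get the 301, everything else keeps the 410:

```typescript
import express from "express";

const app = express();

// Hypothetical: returns the category URL for an expired listing, if any.
function categoryUrlFor(id: string): string | undefined {
  return undefined;
}

app.get("/vehicles/:id", (req, res) => {
  const categoryUrl = categoryUrlFor(req.params.id);
  if (categoryUrl) {
    // Expired listing with a sensible category: send visitors (and equity) there.
    res.redirect(301, categoryUrl);
    return;
  }
  // No good redirect target: signal permanent removal, then use the GWT
  // removal tool for these as described above.
  res.status(410).send("This vehicle is no longer for sale.");
});
```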
-
Hey Tony, you did it the right way: you returned the error code plus the noindex. However, Google won't drop a page from its index until it has crawled it several times.
You can do this: first of all, be sure that you have no links pointing to that page, then either:
- watch GWT to see the page reported as a 404 and wait for it to disappear from the crawl errors report, or
- go to GWT and ask Google to remove it from the index. This is the fastest way, and since Google requires you to add a noindex or return a 404 before it will honor the request, you're more than fine to do that. Depending on the volume of 404s you have, however, this may be a huge and repetitive task.
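To sanity-check what crawlers actually receive from a batch of expired URLs before (or while) waiting on GWT, a small script like the following can help. It uses the fetch built into Node 18+; the URLs are placeholders:

```typescript
// Check the status code and robots header each expired URL actually returns.
const expiredUrls = [
  "https://example.com/vehicles/12345",
  "https://example.com/vehicles/67890",
];

async function main(): Promise<void> {
  for (const url of expiredUrls) {
    // redirect: "manual" shows the raw 301/410 instead of following it.
    const res = await fetch(url, { redirect: "manual" });
    const robots = res.headers.get("x-robots-tag") ?? "(no robots header)";
    console.log(`${res.status}  ${robots}  ${url}`);
  }
}

main();
```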
Related Questions
-
Should I noindex/nofollow a faceted navigation page?
I have an ecommerce website with four departments that share the same categories. For example, a bicycle shop would have different products for mountain biking and road cycling, but they would both share the same 'tyres' category. I get around this by having the department as a filter that changes the products on show and adds a URL parameter of ?department=1. When this filter is applied, I have a canonical link set up pointing to the non-filtered category. Any filter links are nofollowed. My top menu has four different sections, one for each department, and links to these URLs with the department parameter already applied; these links are set to allow robots to follow. As I am actively pointing Google at these pages, and it is my main navigation, should the pages they go to be noindexed? It's the canonical I want to rank. Hopefully this makes sense. Cheers
Technical SEO | SEOhmygod
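A note on the canonical mechanics in the question above: the canonical URL for a filtered page can be computed by simply dropping the filter parameter, leaving the unfiltered category. A sketch, with a hypothetical URL structure:

```typescript
// Build the canonical URL for a filtered category page by removing the
// department filter, so the unfiltered category remains the canonical.
function canonicalFor(requestUrl: string): string {
  const url = new URL(requestUrl);
  url.searchParams.delete("department");
  return url.toString();
}

// e.g. the tyres category filtered to department 1:
const href = canonicalFor("https://example.com/categories/tyres?department=1");
console.log(`<link rel="canonical" href="${href}" />`);
// -> <link rel="canonical" href="https://example.com/categories/tyres" />
```
-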
Does adding a noindex tag reduce duplicate content?
I've been working under the assumption for some time that if I have two (or more) pages which are very similar, I can add a noindex tag to the pages I don't need and that will reduce duplicate content. As far as I know, this removes the tagged pages from Google's index and stops any potential issues with duplicate content. It's the second part of that assumption that I'm now questioning. Despite having the noindex tag, pages continue to appear in Google Search Console as duplicate content, soft 404s, etc. That is, new pages that I know to have the noindex tag are appearing regularly. My thinking so far is that Google can still crawl these pages (although it won't index them), so it shows them in GSC due to a crude issue-flagging process. I mainly want to know: a) Is the actual Google algorithm sophisticated enough to ignore these pages even though GSC doesn't? b) How do I explain this to a client?
Technical SEO | ChrisJFoster
-
Noindex large numbers of product pages on a webshop to counter Panda
A Dutch webshop with 10,000 product pages is experiencing lower rankings and indexation. Problems started last October, a little while after the Panda and Penguin updates. One of the problems diagnosed is a lack of unique content: many of the product pages lack a description, and some are variants of each other (color, size, etc.). So a solution could be to write unique descriptions and use rel=canonical to consolidate color/size variations onto one product page. There is, however, no capacity to do this on short notice. So now I'm wondering if the following is effective. Exclude all product pages via noindex and robots.txt, in the same way you can with search pages. The only pages left for indexation are the homepage and 200-300 category pages. We then write unique content and work on the rankings of the category pages. When this works, the product pages are rewritten and slowly re-included, category by category. My first worry is the loss of rankings for the product pages, although their rankings are minimal currently. My second worry is the high number of links on category pages that lead to product pages excluded from Google. Thirdly, I am wondering if this works at all: using noindex on 10,000 product pages consumes crawl budget and dilutes the internal link structure. What do you think?
Technical SEO | oeroek
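If the noindex route is taken for the question above, one low-effort way to cover 10,000 product pages at once is an HTTP-header middleware keyed on the URL pattern rather than editing templates. A hedged sketch, assuming Express and a /products/ path convention:

```typescript
import express from "express";

const app = express();

// Hypothetical convention: all product pages live under /products/.
const PRODUCT_PATH = /^\/products\//;

app.use((req, res, next) => {
  if (PRODUCT_PATH.test(req.path)) {
    // noindex, follow: drop the page from the index while still letting
    // crawlers follow its links, preserving the internal link structure.
    res.set("X-Robots-Tag", "noindex, follow");
  }
  next();
});
```

One caveat on the robots.txt half of the plan: a URL blocked in robots.txt cannot be crawled, so a noindex on that page will never be seen; applying both mechanisms to the same URLs works against the goal.
-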
Why are pages still showing in SERPs, despite being NOINDEXed for months?
We have thousands of pages we've been trying to have de-indexed in Google for months now. They've all got the noindex meta tag, but they simply will not go away in the SERPs. Here is just one example: http://bitly.com/VutCFi If you search this URL in Google, you will see that it is indexed, yet it's had the tag for many months. This is just one example out of thousands of pages that will not get de-indexed. Am I missing something here? Does it have to do with using content="none" instead of content="noindex, follow"? Any help is very much appreciated.
Technical SEO | MadeLoud
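One factual note on the last point in the question above: content="none" is shorthand for "noindex, nofollow", so it also stops Google from following the page's links; it is not a stronger form of noindex. A small illustration, as a hypothetical template helper:

```typescript
// The two robots meta variants discussed above, as emitted into <head>.
// "none" is equivalent to "noindex, nofollow".
function robotsMeta(directives: "none" | "noindex, follow"): string {
  return `<meta name="robots" content="${directives}" />`;
}

console.log(robotsMeta("none"));            // also stops Google following links
console.log(robotsMeta("noindex, follow")); // de-indexes but keeps links crawlable
```
-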
Just noindexed and redirected junk from my site. Now what?
Hi to all! I've just finished 301-redirecting some pages and adding "noindex, follow" to some others. I still have a lot of canonicals to add, but this is a start. Now, how do I check the results? Should I wait a week or so to see if something improves? How long does it take for Google to remove the pages I've just redirected and noindexed? My site is crawled every day (as all sites are, I guess). Thanks!
Technical SEO | enriquef
-
Thoughts about stub pages - 200 & noindex ok, or 404?
With large database/template-driven websites it is often possible to end up with a lot of pages with no content on them. What are the current thoughts regarding these pages with no content? Options:
- Return a 200 status code with a noindex meta tag
- Return a 404 page and status code
- Something else?
Thanks
Technical SEO | slingshot
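A sketch of how the two options above might look in a template-driven handler; Express is assumed and the emptiness check is hypothetical:

```typescript
import express from "express";

const app = express();

// Hypothetical: fetch whatever rows would populate the template.
function rowsFor(slug: string): string[] {
  return [];
}

app.get("/pages/:slug", (req, res) => {
  const rows = rowsFor(req.params.slug);
  if (rows.length === 0) {
    // Option 2: a true 404 for a stub with nothing on it...
    res.status(404).send("Not found");
    // ...or option 1: serve it, but keep it out of the index:
    // res.set("X-Robots-Tag", "noindex").send(renderStubPage());
    return;
  }
  res.send(rows.join("\n"));
});
```
-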
How to noindex lots of content properly: bluntly or progressively?
Hello Mozers! I'm quite in doubt, so I thought, why not ask for help? Here's my problem: I need to improve the SEO of a website that consists of a lot (1 million+ pages) of poor content. Basically it's like a catalog, where you select a brand, then a product series, then the product, and finally fill out a request form (sorry for the cryptic description, I can't be more precise). Besides the classic SEO work, part of what (I think) I need to do is noindex some useless pages and rewrite important ones with great content, but for the noindexing part I'm quite hesitant about the how. There are around 200,000 pages with no visits in the past year, so I guess they're pretty much useless junk that would be better off noindexed. But the webmaster is afraid that noindexing that many pages will hurt the site's long tail (in case of future visits), so he wants to check the SERP position of every one of them and only eliminate those that are in the top 3 (for these, he thinks, there's no hope of improvement). I think that would waste a lot of time and resources for nothing, and I'd advise noindexing them regardless of their position. The problem is I lack the experience to be sure of it, and of how to do it: is it wise to noindex 200,000 pages bluntly in one go (isn't that a bad signal for Google?), or should we do it progressively over a few months? Thanks a lot for your help! Johann.
Technical SEO | JohannCR
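On the blunt-versus-progressive point in the question above, the progressive option is mechanically just batching. A sketch of a week-by-week rollout, with placeholder data:

```typescript
// Split the ~200,000 no-visit URLs into weekly noindex batches so the
// change rolls out progressively instead of all at once.
function toBatches<T>(items: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

const noVisitUrls: string[] = []; // placeholder: load the 200,000 URLs here
const weekly = toBatches(noVisitUrls, 20_000); // ~10 weeks at 20k per week

weekly.forEach((batch, week) => {
  console.log(`week ${week + 1}: flag ${batch.length} URLs as noindex`);
});
```
-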
Mask links with JS that point to noindex'ed pages
Hi, in an effort to prepare our site for Panda, we dramatically reduced the number of pages that can be indexed (from 100k down to 4k). All the remaining pages are being equipped with unique and valuable content. We still have the other pages around, since they represent searches with filter combinations which we deem less interesting to the majority of users (hence they are not indexed). So I am wondering if we should mask links to these non-indexed pages with JS, so that link juice doesn't get lost to them. Currently the targeted pages carry "noindex, follow"; we might exclude them via robots.txt instead, though, if the "site:" query doesn't show improvements. Thanks, Sebastian
Technical SEO | derderko
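For completeness, a sketch of the masking idea in the question above: render the targets as plain elements with a JS click handler instead of <a href> links, so crawlers parsing the HTML see no link to follow. Whether this is advisable is exactly the open question, and Google may still execute the JS:

```typescript
// Client-side: turn marked elements into JS-only "links" so the HTML
// contains no <a href> pointing at the non-indexed filter pages.
document.querySelectorAll<HTMLElement>("[data-masked-href]").forEach((el) => {
  const target = el.dataset.maskedHref;
  if (!target) return;
  el.style.cursor = "pointer";
  el.addEventListener("click", () => {
    window.location.href = target;
  });
});

// Markup side, for illustration:
// <span data-masked-href="/search?color=red&size=xl">Red, XL</span>
```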