Send noindex, noarchive with 410?
-
My classifieds site returns a 410 along with an X-Robots-Tag HTTP header set to "noindex,noarchive" for vehicles that are no longer for sale. Google, however, apparently refuses to drop these vehicles from their index (at least as reported in GWT). By returning a "noindex,noarchive" directive, am I effectively telling the bots "yeah, this is a 410 but don't record the fact that this is a 410", thus effectively canceling out the intended effect of the 410?
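The setup described can be sketched as follows — a minimal WSGI handler (names and body text hypothetical) that returns 410 Gone together with the X-Robots-Tag header. Note the header doesn't "cancel out" the 410; both signals tell Google to drop the URL.

```python
# Minimal WSGI sketch of the setup described above: a sold-vehicle page
# that returns 410 Gone plus an X-Robots-Tag header. The handler name
# and response body are hypothetical.
def sold_vehicle_app(environ, start_response):
    status = "410 Gone"
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        ("X-Robots-Tag", "noindex, noarchive"),
    ]
    start_response(status, headers)
    return [b"<h1>This vehicle is no longer for sale.</h1>"]
```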
-
That sounds good. Let me know if you have any further questions; I'm always glad to help!
-
Thanks for the info, mememax. I don't relish the thought of using the removal tool, but I suppose I can actually 301-redirect many of those 410s to category pages and then use GWT for the rest.
-
Hey Tony, you did it the right way: you returned the error code plus the noindex. However, Google won't drop your page from the index until it has crawled it several times.
You can do this: first of all, make sure there are no links pointing to that page, then either:
- check in GWT whether the page shows up as a 404, and wait for it to disappear from the GWT crawl errors; or
- go to GWT and ask Google to remove it from the index. This is the fastest way, and Google requires you to add a noindex or return a 404 before it will process the request, so you're more than fine there. However, depending on the volume of 404s you have, this may be a huge and repetitive task.
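The precondition mememax describes — that a removal request only sticks if the page 404s/410s or is noindexed — can be sketched as a small helper (hypothetical function, not an official API):

```python
# Hypothetical helper mirroring the rule described above: a URL removal
# request is only honored if the page returns 404/410 or carries a
# noindex directive in its X-Robots-Tag header.
def eligible_for_removal(status_code, headers):
    robots = headers.get("X-Robots-Tag", "").lower()
    return status_code in (404, 410) or "noindex" in robots
```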
Related Questions
-
Noindex large product pages on webshop to counter Panda
A Dutch webshop with 10,000 product pages is experiencing lower rankings and indexation. The problems started last October, a little while after the Panda and Penguin updates. One of the problems diagnosed is a lack of unique content: many of the product pages lack a description, and some are variants of each other (color, size, etc.). So a solution could be to write unique descriptions and use rel=canonical to consolidate color/size variations onto one product page. There is, however, no capacity to do this on short notice. So now I'm wondering whether the following is effective: exclude all product pages via noindex or robots.txt, in the same way as you can with search pages. The only pages left for indexation would be the homepage and 200-300 category pages. We would then write unique content and work on the rankings of the category pages. When this works, the product pages are rewritten and slowly re-included, category by category. My first worry is the loss of rankings for the product pages, although their rankings are minimal currently. My second worry is the high number of links on category pages leading to product pages that will be excluded from Google. Thirdly, I am wondering whether this works at all: noindexing 10,000 product pages consumes crawl budget and dilutes the internal link structure. What do you think?
Technical SEO | oeroek
-
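The rel=canonical consolidation suggested in the question could look like this (URLs are hypothetical): each color/size variant points Google at the main product page.

```html
<!-- In the <head> of a variant page such as /product/chair-blue
     (hypothetical URL), pointing search engines at the main product page: -->
<link rel="canonical" href="https://www.example.nl/product/chair">
```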
Many "spin-off" sites - 301 or 404/410?
Hi there, I've just started a new job with a rental car company with locations all over New Zealand and Australia. I've discovered that we have several websites along the lines of "rentalcarsnewzealand", "bigsaverentals" etc that are all essentially clones of our primary site. I'm assuming that these were set up as some sort of "interesting" SEO attempt. I want to get rid of them, as they create customer experience issues and they're not getting a hell of a lot of traffic (or driving bookings) anyway. I was going to just 301 them all to our homepage - is this the right approach? Several of the sites are indexed by Google and they've been linked up to a number of sites - the 301 move wouldn't be to try to derive any linkjuice or anything of that nature, but simply to get people to our main site if they do find themselves clicking a link to one of those sites. Thanks very much for your advice! Nicole
Technical SEO | AceRentalCars
-
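The blanket 301 Nicole describes could be sketched as an Apache .htaccess on one of the clone domains (domain names hypothetical): every request is permanently redirected to the primary site's homepage.

```apache
# .htaccess on a clone domain, e.g. bigsaverentals (hypothetical):
# permanently redirect every URL to the primary site's homepage.
RewriteEngine On
RewriteRule ^ https://www.primary-rental-site.example/ [R=301,L]
```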
Noindex Success?
Has anyone had success implementing noindex,follow on pages of a site that has been hit by a Panda penalty? Our site has a lot of duplicate content in its product descriptions, which we had permission to use from our distributor (who is also online). We went ahead and applied noindex,follow to those pages in the hope that Google will focus on the products we carry that do have original descriptions (about 1/3 of our products). We didn't want to just remove those products, since they are actually beneficial to our customers. Most of the duplicated content is in the form of ingredient lists.
Technical SEO | dustyabe
-
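The noindex,follow approach described in the question is a single meta tag on each duplicated page:

```html
<!-- In the <head> of each duplicated product page: keep the page out of
     the index, but still let crawlers follow its links. -->
<meta name="robots" content="noindex, follow">
```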
Page has Noindex, nofollow, still ranks #1
Hi there, I have a question about a few pages on our site that have a noindex,nofollow meta tag but are still indexed and even rank number one in our market for the term. How is that possible? Or does Google just ignore the tags when it thinks they are an error on our side? The URL is www.drogisterij.net/kilo_killer and the keyword is kilo killer. We rank number 1 if you search from Google.nl. Has anyone seen this before and know why this might be? Thanks in advance.
Technical SEO | JaapWillemDrogisterij
-
Why use noindex, follow vs rel next/prev
Look at what www.shutterstock.com/cat-26p3-Abstract.html does with its search results page 3 for 'Abstract' (the same goes for pages 2-N in the paginated series): name="robots" content="NOINDEX, FOLLOW". Why is this a better alternative than using rel=next/prev, per Google's official statement on pagination (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663744), which doesn't even mention this as an option? Any ideas? Does this improve the odds of the first page in the paginated series ranking for the target term? There can't be a 'view all' page because there are simply too many items. Jeff
Technical SEO | jrjames83
-
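For comparison, the rel=next/prev alternative from Google's pagination guidance would look like this on page 3 (the p2/p4 URLs are inferred from the pattern in the question, not verified):

```html
<!-- In the <head> of .../cat-26p3-Abstract.html: -->
<link rel="prev" href="http://www.shutterstock.com/cat-26p2-Abstract.html">
<link rel="next" href="http://www.shutterstock.com/cat-26p4-Abstract.html">
```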
Hotel affiliate website - noindex pages with little unique content?
We are well into development of a hotel affiliate website (using the Expedia Affiliate Network), and I know there are many challenges to SEO when using an affiliate system, one of the biggest being how to handle duplicate content. Outside of blog posts and static marketing pages, the majority of the textual content is contained in hotel descriptions. We will be creating unique descriptions over time, but we are a small team and this will be a lengthy process. My question for you mozzers is whether or not it's advisable, for ranking purposes, to noindex any page with mostly 'stock' content and only allow Google to index hotel pages with unique descriptions? Thanks for any input!
Technical SEO | CassisGroup
-
Feedburner - Why Is It Sending My Blog Posts A Day After I Post Them?
I have my feed set up through Feedburner for my wife's blog, ktlouise.com. Whenever she posts a new blog post, it doesn't get emailed to her subscribers until the next day. Does anyone know how to change this so that the updates go out the same day? Thanks for the help! REF
Technical SEO | FergusonSEO
-
Why does SEOmoz Pro include noindex pages?
I'm new to SEOmoz. I've been digesting the crawl data and have a ton of action items that we'll be executing on fairly soon. Love it! One thing I noticed is that some of the crawl warnings include pages that expressly have the robots meta tag with the "noindex" value. For example, many of my noindex pages don't include meta descriptions. Is it therefore safe to ignore warnings of this nature for these pages?
Technical SEO | ChatterBlock