Benefit of using 410 Gone over 404?
-
It seems like it takes Google Webmaster Tools forever to realize that some pages are, well, just gone.
Truth is, the 30k-plus pages showing 404 errors were due to a big change in the site's URL architecture.
I wonder, is there any benefit to using 410 Gone as a temporary measure to speed things up in this case?
Or, when would you use a 410 Gone?
Thanks
-
I had the (mis)fortune of trying to deindex nearly 2 million URLs across a couple of domains recently, so I had plenty of time to play with this.
Like CleverPhD, I was not able to measure any real difference in the time it took to remove a page that had been 410'd versus one that had been 404'd.
The biggest factor governing the removal of the URLs was getting all the pages recrawled. Don't underestimate how long that can take. We ended up creating crawlable routes back to that content to help Google keep visiting those pages and updating the results.
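For illustration, here's a rough sketch of that "crawlable route back" idea: publish a plain HTML index page that links to the removed URLs so Googlebot keeps revisiting them and seeing the 410s. The input file and output filename here are hypothetical.

```python
# Sketch only: build a simple HTML index linking to removed URLs so the
# crawler keeps revisiting them. "removed_urls.txt" (one URL per line)
# and the output filename are hypothetical.
from pathlib import Path

removed_urls = Path("removed_urls.txt").read_text().splitlines()

items = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in removed_urls)
page = f"<html><body><h1>Removed pages</h1><ul>\n{items}\n</ul></body></html>"

Path("removed-index.html").write_text(page)
```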
-
The 410 is supposed to be more definitive:
http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
404 is "Not Found" vs. 410 is "Gone":
10.4.5 404 Not Found
The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent. The 410 (Gone) status code SHOULD be used if the server knows, through some internally configurable mechanism, that an old resource is permanently unavailable and has no forwarding address. This status code is commonly used when the server does not wish to reveal exactly why the request has been refused, or when no other response is applicable.
10.4.11 410 Gone
The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 (Not Found) SHOULD be used instead. This response is cacheable unless indicated otherwise.
The 410 response is primarily intended to assist the task of web maintenance by notifying the recipient that the resource is intentionally unavailable and that the server owners desire that remote links to that resource be removed. Such an event is common for limited-time, promotional services and for resources belonging to individuals no longer working at the server's site. It is not necessary to mark all permanently unavailable resources as "gone" or to keep the mark for any length of time -- that is left to the discretion of the server owner.
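To make the distinction concrete, here's a minimal sketch (Python standard library only) of a server that returns 410 for URLs it knows are permanently gone and 404 for everything else; the paths are hypothetical:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of paths we know are permanently removed
GONE = {"/old-section/page-1", "/old-section/page-2"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in GONE:
            self.send_response(410)  # intentionally, permanently removed
        else:
            self.send_response(404)  # not found; no claim about permanence
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"This page is not available.\n")

if __name__ == "__main__":
    HTTPServer(("", 8000), Handler).serve_forever()
```

In practice you'd usually do this in the web server config (e.g., Apache's "Redirect gone" directive or an nginx "return 410;") rather than in application code, but the logic is the same.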
That said, I had a similar issue on a site with a couple thousand pages and went with the 410. I can't say it made things disappear any faster than a 404 would have, at least not that I noticed.
I just found a post from John Mueller of Google:
https://productforums.google.com/forum/#!topic/webmasters/qv49s4mTwNM/discussion
"In the meantime, we do treat 410s slightly differently than 404s. In particular, when we see a 404 HTTP result code, we'll want to confirm that before dropping the URL out of our search results. Using a 410 HTTP result code can help to speed that up. In practice, the time difference is just a matter of a few days, so it's not critical to return a 410 HTTP result code for URLs that are permanently removed from your website, returning a 404 is fine for that. "
So, use the 410; even if the difference is just a matter of a few days, you may notice it with 30k pages.
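If you do flip 30k URLs over to 410, it's worth spot-checking that the server really returns that status code before waiting on Google. A quick standard-library sketch (the URLs are hypothetical):

```python
import urllib.error
import urllib.request

# Hypothetical sample of removed URLs to spot-check
urls = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

for url in urls:
    try:
        with urllib.request.urlopen(url) as resp:
            print(url, resp.status)  # unexpected: URL still resolves
    except urllib.error.HTTPError as e:
        print(url, e.code)  # expect 410 (or 404) here
```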
All of that said, are you sure that with a site this big you would not need to 301 some of those pages? If you have a bunch of old news items or blog posts, would you not want to redirect them to the new URLs for those same assets? It seems like you should be able to recover some of them - at least your top-traffic pages.
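For whatever you can map to a new home, a 301 is the better answer, and only the truly dead URLs should get the 410. A small sketch of that decision (the URL mapping is hypothetical):

```python
# Hypothetical mapping of retired URLs to their new homes
MOVED = {
    "/old-news/story-1": "/news/story-1",
    "/old-blog/post-2": "/blog/post-2",
}

def response_for(path):
    """Return (status_code, location) for a retired URL."""
    if path in MOVED:
        return 301, MOVED[path]  # permanent redirect to the new URL
    return 410, None  # truly gone: let it drop out of the index

print(response_for("/old-news/story-1"))  # (301, '/news/story-1')
print(response_for("/old-news/story-9"))  # (410, None)
```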
Cheers