Benefit of using 410 Gone over 404?
-
It seems like it takes Google Webmaster Tools forever to realize that some pages are, well, just gone.
Truth is, the 30k+ pages showing 404 errors were due to a big change in the site's URL architecture.
I wonder: is there any benefit to using 410 Gone as a temporary measure to speed things up in this case?
Or, when would you use a 410 Gone?
Thanks
-
I had the (mis)fortune of trying to deindex nearly 2 million URLs across a couple of domains recently, so had plenty of time to play with this.
Like CleverPhD, I was not able to measure any real difference in the time it took to remove a page that had been 410'd vs. one that had been 404'd.
The biggest factor governing the removal of the URLs was getting all the pages recrawled. Don't underestimate how long that can take. We ended up creating crawlable routes back to that content to help Google keep visiting those pages and updating the results.
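To make the mechanics concrete, here is a minimal sketch (using Python's stdlib HTTP server; the paths in GONE_PATHS are hypothetical) of a server that answers 410 for URLs it knows were deliberately removed and 404 for everything else:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of paths removed in the URL restructure.
GONE_PATHS = {"/old-category/widget-1", "/old-category/widget-2"}

class GoneAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in GONE_PATHS:
            # 410: intentionally removed, no forwarding address.
            self.send_response(410)
        else:
            # 404: this URL is simply unknown to us.
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):
        pass  # suppress per-request logging in this sketch

# To run locally (port is arbitrary):
# HTTPServer(("localhost", 8000), GoneAwareHandler).serve_forever()
```

In practice you would do the same thing in your web server or framework config rather than a standalone script; the point is just that the "gone" list has to be something the server can consult per request.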
-
The 410 is supposed to be more definitive.
http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
404 is "not found" vs. 410 is "gone":
10.4.5 404 Not Found
The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent. The 410 (Gone) status code SHOULD be used if the server knows, through some internally configurable mechanism, that an old resource is permanently unavailable and has no forwarding address. This status code is commonly used when the server does not wish to reveal exactly why the request has been refused, or when no other response is applicable.
10.4.11 410 Gone
The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 (Not Found) SHOULD be used instead. This response is cacheable unless indicated otherwise.
The 410 response is primarily intended to assist the task of web maintenance by notifying the recipient that the resource is intentionally unavailable and that the server owners desire that remote links to that resource be removed. Such an event is common for limited-time, promotional services and for resources belonging to individuals no longer working at the server's site. It is not necessary to mark all permanently unavailable resources as "gone" or to keep the mark for any length of time -- that is left to the discretion of the server owner.
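The "clients with link editing capabilities SHOULD delete references" clause above is the behavioural difference in a nutshell: a 410 is a signal to drop the reference immediately, while a 404 gives no permanence signal and warrants re-confirmation. A rough sketch of that logic (the retry threshold here is an arbitrary assumption, not anything the RFC or Google specifies):

```python
def should_drop_link(status_code: int, consecutive_failures: int,
                     retry_threshold: int = 3) -> bool:
    """Decide whether a link checker should delete a reference.

    410 means the server declared the resource permanently gone, so the
    reference can be dropped right away. 404 says nothing about
    permanence, so we only drop it after repeated confirmations
    (the threshold of 3 is a made-up default for illustration).
    """
    if status_code == 410:
        return True
    if status_code == 404:
        return consecutive_failures >= retry_threshold
    return False
```

Google's confirm-before-dropping behaviour for 404s, described in the John Mueller quote below, is essentially the second branch.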
That said, I had a similar issue on a site with a couple thousand pages and went with the 410; I'm not sure it really made things disappear any faster than a 404 would have, as far as I noticed.
I just found a post from John Mueller of Google:
https://productforums.google.com/forum/#!topic/webmasters/qv49s4mTwNM/discussion
"In the meantime, we do treat 410s slightly differently than 404s. In particular, when we see a 404 HTTP result code, we'll want to confirm that before dropping the URL out of our search results. Using a 410 HTTP result code can help to speed that up. In practice, the time difference is just a matter of a few days, so it's not critical to return a 410 HTTP result code for URLs that are permanently removed from your website, returning a 404 is fine for that. "
So use the 410; with 30k pages, even a head start of a few days may make a noticeable difference.
All of that said, are you sure that with a site that big you would not need to 301 some of those pages? If you have a bunch of old news items or blog posts, wouldn't you want to redirect them to the new URLs for those same assets? It seems like you should be able to recover some of them, at least your top-traffic pages.
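Mechanically, that triage can be as simple as a lookup table built during the migration: 301 where the content has a new home, 410 where it was deliberately dropped. A sketch (the mapping and paths are hypothetical):

```python
# Hypothetical old-to-new URL mapping recovered from the migration.
REDIRECT_MAP = {
    "/news/2013/big-launch": "/blog/big-launch",
    "/products/widget": "/shop/widget",
}

def response_for(old_path: str):
    """Return (status, location) for a request to a pre-migration URL.

    301 when we know where the content lives now; 410 when it was
    deliberately removed with no replacement.
    """
    new_path = REDIRECT_MAP.get(old_path)
    if new_path is not None:
        return 301, new_path
    return 410, None
```

The win with the 301s is that they pass along the old pages' link equity instead of throwing it away, which matters most for the top-traffic URLs.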
Cheers