Removing Content: 301 vs. 410 Question
-
Hello,
I was hoping to get the SEOmoz community’s advice on how to remove content most effectively from a large website.
I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda. (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware).
Kerry22 mentioned a process in which 410s would be totally visible to googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give google the ability to recrawl that content to indeed confirm the content was removed (as opposed to just recrawling the site and not finding the content anywhere).
This really made lots of sense to me and also struck a personal chord… Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience.
When we cut pages, though, we used a different approach, doing all of the below steps:
1. We cut the pages
2. We set up permanent 301 redirects for all of them immediately.
3. And at the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).
When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 403, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way…
I see that this is basically the exact opposite of Dr. Pete's advice and of what Kerry22 did in order to get a recovery, and meanwhile here we are, still trying to help our site recover. We feel that our site should no longer be under the shadow of Panda.
So here is what I'm wondering, and I'd be very appreciative of advice or answers for the following questions:
1. Is it possible that Google still thinks we have this content on our site, and that we continue to suffer from Panda because of it? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric -
Thanks, Dr. Peter! I agree with you! Just wanted to feel sure about it.
Yes, Gary, you can also personalize a 410 page.
-
You should be able to customize a 410 just like you do a 404. The problem is that most platforms don't do that, by default, so you get the old-school status code page. That should be configurable, though, on almost all modern platforms.
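To make that concrete, here is a minimal sketch of what such a configuration boils down to, written against Python's stdlib WSGI interface (the paths and page copy are hypothetical, not from any particular platform). The body is a fully custom HTML template; only the status line distinguishes it from a custom 404 page.

```python
# Minimal WSGI app: serve a custom, user-friendly page with a real
# 410 status. Crawlers read the status line; users see the HTML.

REMOVED = {"/old-product"}  # hypothetical set of retired URLs

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in REMOVED:
        start_response("410 Gone", [("Content-Type", "text/html")])
        return [b"<h1>This product has been retired</h1>"
                b"<p>Try searching the store for an alternative.</p>"]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>Storefront</h1>"]
```

Any WSGI server (or a platform's error-document hook) could serve this; the point is only that a 410 body can be just as rich as a custom 404 template.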
-
From a commerce perspective, the biggest problem I have with the 410 is the user experience. If I tag a URL with a 410, then when someone requests the page they get a white page that says GONE. They never even get the chance to see the store and maybe search for a similar product.
Would it work if I built a landing page that returns a 410 and then used a 301 to redirect the bad URL to that landing page? It would make the customer happy; they would be in the store with a message to search for something else. But would Google really associate the 410 with the redirected URL?
-
Hi Sandra, don't worry about the volume of 404s; they won't hurt your rankings.
On your issue: I understand that you want to be really clear with your users and not hurt their experience on the site. So create a custom 404 page that changes its content depending on which page returns it. If it's one of your old product pages, you can return a message or an article about why you decided to remove them and propose some alternatives. For all other errors, you can just return a search box or products related to the one you lost.
301s, IMHO, are not the way to go: if a URL is gone, it has not been redirected anywhere, so a 301 will result in a bad UX 99% of the time.
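A minimal sketch of that path-aware custom 404, with hypothetical paths and copy (in practice this logic would live in your CMS's error template):

```python
# Pick a custom 404 body based on which URL is missing: retired
# product pages get an explanation plus alternatives, everything
# else gets a generic search prompt. Paths here are hypothetical.

RETIRED_PRODUCTS = {"/products/widget-a", "/products/widget-b"}

def not_found_body(path):
    """Return the HTML body for a 404 response at the given path."""
    if path in RETIRED_PRODUCTS:
        return ("<h1>We no longer carry this product</h1>"
                "<p>Why we retired it, and some alternatives...</p>")
    return ("<h1>Page not found</h1>"
            "<p>Try the search box or browse related products.</p>")
```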
-
Hello,
I have a related question about 301 vs 410.
I have a client who wants to delete a whole category of product from one site. It's a large amount of product, and therefore a large number of URLs, but the product isn't performing very well. So the decision is a business one rather than SEO-related. It's not about Panda.
If we think about the communication with the user, the best option would be to have a landing page explaining that we decided to remove that product.
Then the question is: do we 301-redirect all of those URLs to this landing page? I'm afraid that a big redirect like this, going from many URLs to a single one (even if that page isn't created to rank on Google), could look dodgy to Google. Am I right?
Or do I return a 410 for those pages and personalize the 410 landing page just for these URLs in order to communicate with the user (is that even possible?). But I'm worried about that too, because we'll have many 4xx errors in WMT, and that may influence our rankings!
So I don't know what to do! It's a must that we delete this content and that we communicate it well to our users.
Thanks for your help,
-
100% agreed - 403 isn't really an appropriate alternative to 404. I know SEOs who claim that 410s are stronger/faster, but I haven't seen great evidence in the past couple of years. It's harmless to try 410s, but I wouldn't expect miracles.
-
Hi Eric, I'll try to answer your further question, even if I'm not an oracle like Pete.
First of all, thanks Pete for underlining that you need to give Google just one response, since you can't give them both a 301 and a 404. I was assuming that, and I didn't focus on that part of Eric's question.
Second: Eric, if your purpose is to give Google the ability to recrawl the old content and let them see it has disappeared, you want to give them a 404 or a 410, which mean "not found" and "gone" respectively. There used to be a difference between them, but now they have almost the same value in Google's eyes (further reading). That way, Google can access your pages and see that the content is now gone.
In the case of a 403, access is denied to everyone, both Google and humans, so Google won't be able to access and recrawl the page. If your theory is based on the idea that Google needs to recrawl your content and see it has really gone (and I think you're on the right track), a 403 is not the response you should give it.
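The distinction being drawn here can be captured in a tiny illustrative helper (the mapping is my own summary, not an official taxonomy): only statuses that let Googlebot in and say "gone" actually confirm removal.

```python
# Which HTTP statuses let a crawler confirm that content was removed?
# 404 and 410 both say "it's gone"; 403 denies access, so nothing can
# be verified on a recrawl; 301 says "it moved", not "it's gone".

REMOVAL_STATUSES = {404: "Not Found", 410: "Gone"}

def confirms_removal(status_code):
    """True if recrawling and seeing this status proves the content is gone."""
    return status_code in REMOVAL_STATUSES
```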
-
Hey there mememax - thank you for the reply! Reading your post and thinking back to our methodology, yes I think in hindsight we were a bit too afraid about generating errors when we removed content - we should have considered the underlying meaning of the different statuses more carefully. I appreciate your advice.
Eric
-
Hello Dr. Pete – thank you for the great info and advice!
I do have one follow-up question if that's ok – as we move forward cutting undesirable content and generating 4xx statuses for those pages, is there a difference in impact/effectiveness between a 403 and a 404? We use a CMS, and un-publishing a page creates a 403 "Access denied" message, while deleting a page generates a 404. I would love to hear your opinion about any practical differences from a Googlebot standpoint… does a 404 carry more weight when it comes to content removal, or are they the same to Googlebot? If there's a difference and the 404 is better, we'll go the 404 route moving forward.
Thanks again for all your help,
Eric
-
Let me jump in and clarify one small detail. If you delete a page, which would naturally result in a 404, but then 301-redirect that page/URL, there is no 404. I understand the confusion, but ultimately you can only have one HTTP status code. So, if the page properly 301s, it will never return a 404, even if it's technically deleted.
If the page 301s to a page that looks like a "not found" sort of page (content-wise), Google could consider that a "soft 404". Typically, though, once the 301 is in place, the 404 is moot.
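A sketch of the "only one status code" point, with a made-up redirect map: if a 301 rule matches, the server answers with it, and the deleted page's 404 never surfaces.

```python
# One URL, one status: a matching redirect rule wins, so a deleted
# page that is also 301'd never returns a 404. Paths are hypothetical.

REDIRECTS = {"/deleted-page": "/category"}  # 301 rules
LIVE_PAGES = {"/category", "/"}             # pages that still resolve

def respond(path):
    """Return (status_code, redirect_target_or_None) for a request."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # redirect wins, even if page was deleted
    if path in LIVE_PAGES:
        return 200, None
    return 404, None
```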
For any change in status, the removal of crawl paths could slow Google re-processing those pages. Even if you delete a page, Google has to re-crawl it to see the 404. Now, if it's a high-authority page or has inbound (external) links, it could get re-crawled even if you cut the internal links. If it's a deep, low-value page, though, it may take Google a long time to get back and see those new signals. So, sometimes we recommend keeping the paths open.
There are other ways to kick Google to re-crawl, such as having an XML sitemap open with those pages in them (but removing the internal links). These signals aren't as powerful, but they can help the process along.
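One way to keep that crawl path open is a dedicated sitemap listing the removed URLs. A hedged sketch using only the stdlib (the URLs are hypothetical):

```python
# Build a small XML sitemap listing removed URLs, so Google has a
# reason to revisit them and observe the new 404/410 status even
# after internal links to them are gone.

from xml.sax.saxutils import escape

def removal_sitemap(urls):
    """Return sitemap XML containing one <url> entry per removed URL."""
    entries = "".join(f"  <url><loc>{escape(u)}</loc></url>\n" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + "</urlset>")
```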
As to your specific questions:
(1) It's very tricky, in practice, especially at large-scale. I think step 1 is to dig into your index/cache (slice and dice with the site: operator) and see if Google has removed these pages. There are cases where massive 301s, etc. can look fishy to Google, but usually, once a page is gone, it's gone. If Google has redirected/removed these pages, and you're still penalized, then you may be fixing the wrong problem or possibly haven't gone far enough.
(2) It really depends on the issue. If you cut too deep and somehow cut off crawl paths or stranded inbound links, then you may need to re-establish some links/pages. If you 301'ed a lot of low-value content (and possibly bad links), you may actually need to cut some of those 301s and let those pages die off. I agree with @mememax that sometimes a healthy combination of 301s/404s is a better bet - pages go away, and 404s are normal if there's really no good alternative to the page that's gone.
-
Hi Eric, in my experience I've always found 4xx better than 301 for solving this kind of issue.
Many people overuse 301s just because they want to show Google that their site doesn't have any 404s.
Just think about it a little: a 301 is a permanent redirect, for content that has moved from one place to another. If you have content you want to get rid of, you don't want to give Google the message "hey, that low-quality content is not where you found it, it's over here now." You want to give Google the message that the low-quality content has been improved or removed, and a 404 is the right message to send if you deleted that content.
It's perfectly normal to have 404s on a website, and many 404s won't hurt your rankings. Only if those pages were already ranking (so users would now receive a 404 instead) or if external sites were linking to them should you consider a 301.
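That rule of thumb reduces to a small decision sketch (the inputs are hypothetical per-URL judgments; the relevant-target condition also reflects the earlier point that a 301 to nowhere useful is bad UX):

```python
# Decide between 301 and 404 for a removed URL: redirect only when
# the page was ranking or has external links pointing at it, AND a
# genuinely relevant target exists; otherwise a plain 404 is fine.

def removal_status(was_ranking, has_external_links, has_relevant_target):
    if (was_ranking or has_external_links) and has_relevant_target:
        return 301  # preserve users and link equity with a redirect
    return 404      # normal "gone" signal; nothing worth preserving
```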
While I think Google has a sort of blacklist (and a whitelist too), I don't think it keeps a memory of bad sites it encounters; if you fix your issues, you'll start to rank again.
The issue may not be that your site is tainted, but that you still have some issues here and there that you didn't fix. Googlers have said that Panda is now part of the algorithm, so if you fix your issues you won't need to wait for an update to start re-ranking.
Hope this helps! Good luck!