Removing content: 301 vs. 410 question
-
Hello,
I was hoping to get the SEOmoz community’s advice on how to remove content most effectively from a large website.
I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda. (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware).
Kerry22 mentioned a process in which 410s would be fully visible to Googlebot so that it could easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl those URLs and confirm the content is indeed gone (as opposed to just recrawling the site and not finding the content anywhere).
This really made lots of sense to me and also struck a personal chord… Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience.
When we cut pages, though, we used a different approach, doing all of the below steps:
1. We cut the pages
2. We set up permanent 301 redirects for all of them immediately.
3. And at the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).
When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 403, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way…
I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 did to get a recovery, and meanwhile here we are still trying to help our site recover. We feel that our site should no longer be under the shadow of Panda.
So here is what I'm wondering, and I'd be very appreciative of advice or answers for the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this?
Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric -
Thanks, Dr. Pete! I agree with you! Just wanted to feel sure about it.
Yes, Gary, you can also personalize a 410 page.
-
You should be able to customize a 410 just like you do a 404. The problem is that most platforms don't do that by default, so you get the old-school status code page. That should be configurable, though, on almost all modern platforms.
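For example, here's a rough sketch of a customized 410 on a Python/Flask stack (purely illustrative – the route and copy are made up, and most CMSs expose this as a template or setting instead):

```python
from flask import Flask, abort

app = Flask(__name__)

@app.errorhandler(410)
def gone(error):
    # Branded "gone" page: friendly content for users, while the 410 status
    # code is preserved so crawlers still see the URL is permanently removed.
    html = (
        "<h1>This page has been removed</h1>"
        "<p>Try searching the store for a similar product.</p>"
    )
    return html, 410

@app.route("/discontinued/<path:slug>")
def discontinued(slug):
    # Any URL under this (hypothetical) retired section triggers the custom 410 page
    abort(410)
```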
-
From a commerce perspective, the biggest problem I have with the 410 is the user experience. If I tag a URL with a 410, then when someone requests the page they get a white page that says GONE. They never even get the chance to see the store and maybe search for a similar product.
Would it work if I built a landing page that returns a 410 and then used a 301 to redirect the bad URL to that landing page? It would make the customer happy: they would be in the store with a message to search for something else. But would Google really associate the 410 with the redirected URL?
-
Hi Sandra, don't worry about the volume of 404s, because they won't hurt your rankings.
As for your issue, I understand that you want to be really clear with your users and not hurt their experience on the site. So create a custom 404 page that changes its content depending on which page is returning it. If it's one of your old products, you can return a message or an article explaining why you decided to remove them and propose some alternatives. For all other errors you can just return a search box or products related to the one you lost.
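Something along these lines, for instance (a rough Python/Flask sketch just to show the idea – the path prefix and copy are hypothetical, so adapt it to whatever platform you run):

```python
from flask import Flask, request

app = Flask(__name__)

@app.errorhandler(404)
def custom_not_found(error):
    # Vary the message based on which URL was requested, but always keep the 404 status
    if request.path.startswith("/products/discontinued-"):  # hypothetical old-product URLs
        body = (
            "<h1>We decided to stop selling this product</h1>"
            "<p>Read why we removed it, and take a look at these alternatives.</p>"
        )
    else:
        body = "<h1>Page not found</h1><p>Try the search box or browse related products.</p>"
    return body, 404
```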
301s, IMHO, are not the way to go: if a URL is gone, it hasn't been redirected anywhere, so a 301 will result in bad UX 99% of the time.
-
Hello,
I have a related question about 301 vs 410.
I have a client who wants to delete a whole category of products from one site. It's a large number of products, and so a large number of URLs, but this product line is not performing very well. So the decision is not SEO-related but rather a business decision. It's not about Panda.
If we think about the communication with the user, the best option would be to have a landing page explaining that we decided to remove that product.
Then the question is, do we 301 redirect all those URLs to this landing page? I am afraid that a big redirect like this, going from many URLs to a single one (even if that page is not created to rank on Google), can be seen as dodgy by Google. Am I right?
Or do we return a 410 for those pages, and personalize the 410 landing page only for these URLs in order to communicate with the user (is that even possible?). But I am afraid, because we'll have a lot of 4xx errors in WMT, and this may have an influence on rankings!
So I don't know what to do! It's a must that we delete this content and communicate it well to our users.
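Just to illustrate what I'm imagining for the second option, something like this rough Python sketch (our real site runs on a different platform, and the path below is made up):

```python
from flask import Flask, request

app = Flask(__name__)

RETIRED_PREFIX = "/old-category/"  # hypothetical path of the deleted product category

@app.before_request
def retired_category_gone():
    if request.path.startswith(RETIRED_PREFIX):
        # Friendly explanation for users, 410 status for crawlers --
        # no mass 301 to a single landing page needed
        body = (
            "<h1>We no longer sell this product line</h1>"
            "<p>Here's why we removed it, and some alternatives you might like.</p>"
        )
        return body, 410
```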
Thanks for your help,
-
100% agreed - 403 isn't really an appropriate alternative to 404. I know SEOs who claim that 410s are stronger/faster, but I haven't seen great evidence in the past couple of years. It's harmless to try 410s, but I wouldn't expect miracles.
-
Hi Eric, I'll try to answer your follow-up question even if I'm not an oracle like Pete.
First of all, thanks Pete for underlining that you need to give Google just one response, since you can't give them both a 301 and a 404. I was assuming that, and I didn't focus on that part of Eric's post.
Second: Eric, if your purpose is to give Google the ability to recrawl the old content and see that it has disappeared, you want to give them a 404 or a 410, which mean, respectively, "not found" and "permanently gone". There used to be a difference between them, but now they have almost the same value in Google's eyes (further reading). That way Google can access your page and see that the content is now gone.
In the case of a 403, access is denied to everyone, both Google and humans, so Google won't be able to access and recrawl the page. If your theory is based (and I think you're on the right track) on the idea that Google needs to recrawl your content and see it has really gone, a 403 is not the response you should give it.
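If you want to double-check what Googlebot will actually see when it comes back, a quick script like this works (a sketch using Python's requests library; the URLs are just placeholders):

```python
import requests

# Placeholder URLs -- substitute the pages you removed from your site
removed_urls = [
    "https://www.example.com/removed-page-1",
    "https://www.example.com/removed-page-2",
]

for url in removed_urls:
    # allow_redirects=False shows the first status code the URL returns (e.g. a 301),
    # which is the only response it gives -- a properly redirected URL can't also 404
    response = requests.head(url, allow_redirects=False)
    print(response.status_code, url)
```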
-
Hey there mememax - thank you for the reply! Reading your post and thinking back to our methodology, yes, I think in hindsight we were a bit too afraid of generating errors when we removed content - we should have considered the underlying meaning of the different statuses more carefully. I appreciate your advice.
Eric
-
Hello Dr. Pete – thank you for the great info and advice!
I do have one follow-up question if that's ok – as we move forward cutting undesirable content and generating 4xx statuses for those pages, is there a difference in impact/effectiveness between a 403 and a 404? We use a CMS, and un-publishing a page creates a 403 "Access denied" message, while deleting a page generates a 404. I would love to hear your opinion on any practical differences from a Googlebot standpoint… does a 404 carry more weight when it comes to content removal, or are they the same to Googlebot? If there's a difference and the 404 is better, we'll go the 404 route moving forward.
Thanks again for all your help,
Eric
-
Let me jump in and clarify one small detail. If you delete a page, which would naturally result in a 404, but then 301-redirect that page/URL, there is no 404. I understand the confusion, but ultimately you can only have one HTTP status code. So, if the page properly 301s, it will never return a 404, even if it's technically deleted.
If the page 301s to a page that looks like a "not found" sort of page (content-wise), Google could consider that a "soft 404". Typically, though, once the 301 is in place, the 404 is moot.
For any change in status, the removal of crawl paths could slow Google re-processing those pages. Even if you delete a page, Google has to re-crawl it to see the 404. Now, if it's a high-authority page or has inbound (external) links, it could get re-crawled even if you cut the internal links. If it's a deep, low-value page, though, it may take Google a long time to get back and see those new signals. So, sometimes we recommend keeping the paths open.
There are other ways to prompt Google to re-crawl, such as keeping an XML sitemap live with those pages in it (while removing the internal links). These signals aren't as powerful, but they can help the process along.
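As a sketch of that approach (plain Python with placeholder URLs and file name, just to show the shape of it), you could generate a one-off sitemap of the affected URLs and submit it in Webmaster Tools:

```python
# Build a simple sitemap listing the removed/redirected URLs so Google is
# nudged to recrawl them and pick up the new status codes sooner.
changed_urls = [
    "https://www.example.com/removed-page-1",  # placeholder URLs
    "https://www.example.com/removed-page-2",
]

lines = [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
]
for url in changed_urls:
    lines.append("  <url><loc>{}</loc></url>".format(url))
lines.append("</urlset>")

with open("removed-pages-sitemap.xml", "w") as sitemap_file:
    sitemap_file.write("\n".join(lines))
```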
As to your specific questions:
(1) It's very tricky, in practice, especially at large scale. I think step 1 is to dig into your index/cache (slice and dice with the site: operator) and see if Google has removed these pages. There are cases where massive 301s, etc. can look fishy to Google, but usually, once a page is gone, it's gone. If Google has redirected/removed these pages and you're still penalized, then you may be fixing the wrong problem or possibly haven't gone far enough.
(2) It really depends on the issue. If you cut too deep and somehow cut off crawl paths or stranded inbound links, then you may need to re-establish some links/pages. If you 301'ed a lot of low-value content (and possibly bad links), you may actually need to cut some of those 301s and let those pages die off. I agree with @mememax that sometimes a healthy combination of 301s/404s is a better bet - pages go away, and 404s are normal if there's really no good alternative to the page that's gone.
-
Hi Eric, in my experience I've always found 4xx better than 301 for solving this kind of issue.
Many people use 301s too much just because they want to show Google that their site doesn't have any 404s.
Just think about it a little: a 301 is a permanent redirect, meaning content that has simply moved from one place to another. If you have content you want to get rid of, do you want to give Google the message "hey, that low-quality content is not where you found it, it's over here now"? No. You want to give Google the message that the low-quality content has been improved or removed. And a 404 is the right message to send if you deleted that content.
It's perfectly normal to have 404s on a website, and lots of 404s won't hurt your rankings. The exceptions are pages that were already ranking (so users would now receive a 404 instead) and pages that external sites were linking to; in those cases you may consider a 301.
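If you want to spot which removed pages actually deserve a 301, one quick way is to scan your server logs for URLs that 404 but still get visits from external referrers (a rough sketch assuming a standard combined-format access log; the file name and domain are placeholders):

```python
import re
from collections import Counter

# Pulls the request path, status code and referrer out of a "combined" log line
log_pattern = re.compile(
    r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)"'
)

own_domain = "www.example.com"  # placeholder: your own hostname, to skip internal referrers
hits = Counter()

with open("access.log") as log:  # placeholder file name
    for line in log:
        match = log_pattern.search(line)
        if (match and match.group("status") == "404"
                and match.group("referrer") not in ("-", "")
                and own_domain not in match.group("referrer")):
            hits[(match.group("path"), match.group("referrer"))] += 1

# URLs that 404 but still receive visits from external links are good 301 candidates
for (path, referrer), count in hits.most_common(20):
    print(count, path, "<-", referrer)
```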
While I think Google has a sort of blacklist (and a whitelist too), I don't think it keeps a memory of bad sites it encounters; if you fix your issues you'll start to rank again.
The issue may not be that your site is tainted, but that you still have some problems here and there which you didn't fix. Googlers have apparently said that Panda is now part of the algorithm, so if you fix your issues you won't need to wait for another update to start ranking again.
Hope this helps!! Good luck!