301, 302, 404 or 410
-
Hello,
We have an ecommerce site and it's very normal for products to be discontinued by the manufacturer. We used to leave the pages up to keep the content and link equity. Now we feel this is misleading for the customer and we have started to remove pages for discontinued items. Customers trying to reach these pages get a nice Product Not Found page with an explanation and links to categories. The shopping cart sends a 302 code.
Google Webmaster Tools was complaining about "soft 404s" and apparently didn't like this approach. We tried returning a 404 status code instead, but couldn't get the nice Product Not Found page to display. Plus, GWT and SEOmoz started to complain about 404 errors.
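For what it's worth, a custom page and a genuine 404/410 status aren't mutually exclusive — a minimal Apache sketch, assuming a hypothetical /product-not-found.html page and product path:

```apache
# Serve the friendly page for any 404 or 410, while keeping the real status code
ErrorDocument 404 /product-not-found.html
ErrorDocument 410 /product-not-found.html

# Mark a specific discontinued product URL as 410 Gone (hypothetical path)
Redirect gone /products/discontinued-widget
```

With this in place, the visitor still sees the helpful page, but crawlers receive the honest status code, which is what resolves the "soft 404" complaint.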
I think we've reached a solution where we can send a 301 and still display the desired Product Not Found page. This might be the best solution; we'll see if we get errors from SEOmoz or GWT. However, a 410 return code would probably be most correct. We'd like to salvage any link equity we can, but we really want to be "good citizens" and do things right.
Should we really be sending a 410 in this case, even if we lose SEO equity, or are we OK with the 301 and the nice information page?
Thanks,
Tom
-
I agree with David and Moosa.
Redirect to the category page if it is genuinely useful for your visitor. So in some cases you might want to put user experience above retaining link juice within that category and either redirect elsewhere or create a custom page.
-
The best idea is to 301 to the removed product's category page, or to a custom page that explains why the page no longer exists and points out the ideal places on the website to go and find similar products.
-
I agree with David
-
410 will kill the link equity. In cases similar to yours, I have implemented a 301 to the category page of the product removed. This gives users products similar to the one they were looking for, and retains link equity.
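As a concrete sketch of that pattern (Python, with hypothetical paths — in practice the mapping would come from the product database):

```python
# Map discontinued product URLs to their category pages (hypothetical paths).
DISCONTINUED = {
    "/products/acme-globe-2000": "/category/globes",
    "/products/acme-stand-x": "/category/stands",
}

def respond(path):
    """Return (status, location) for a request path.

    Discontinued products 301 to their category page, retaining
    link equity; unknown paths fall through to a plain 404.
    """
    if path in DISCONTINUED:
        return 301, DISCONTINUED[path]
    return 404, None
```

The key design point is that each discontinued URL redirects to its own category, not to a single generic page, so the destination stays relevant to what the visitor was looking for.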
Related Questions
-
Remedies, Cures, and Precautions for 302 Redirect Hijacking
Hi Moz Guys, I hope all of you are good out there. I am here to discuss remedies, cures, and precautions for 302 redirect hijacking. Although it is quite old, and whenever I searched in Google it looked like a long-gone glitch of the Google SERPs, it just happened to one of my customers' sites. The site in question is www(dot)solidswiss(dot)cd. If you check the cache (cache:site) then you can see a hijacked site in the URLs of the cached page. As a result, all my customer's listings in the SERPs are replaced with this site. This hijacked site then redirects to a competitor's site. I did many things to cope with the problem, and the site came back in the SERPs, but the hackers are doing this from lots of domains, so when it recovered from one site, another site caught it. I am doing lots of reporting via the spam-report form and lots of feedback on the SERPs page. I have switched to HTTPS. But it seems like nothing is working. This community is full of experts and technical people. I am wondering what your views and suggestions are for handling the problem permanently?
White Hat / Black Hat SEO | adqas0
-
Will implementing 301s on an existing domain massively impact rankings?
Hi Guys, I have a new SEO client who only has the non-www domain set up in GWT, and I am wondering if implementing a 301 to www will have a massive negative impact on rankings. I know a percentage of link juice and PageRank will be affected. So my questions are: If I implement the 301, should I brace myself for a fall in rankings? Should I use a 301 to maintain link juice and PageRank? Is it good practice to forward to www? Or could I leave the non-www in place and have the www redirect to it to maintain the data? Dave
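For reference, the www canonicalisation itself is a one-rule 301 on Apache — a sketch, assuming mod_rewrite is available:

```apache
RewriteEngine On
# If the request host has no www prefix, 301 to the www host, preserving the path
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
```

Because it is a 301, link equity pointing at the non-www URLs is passed to the www versions rather than split between the two hosts.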
White Hat / Black Hat SEO | icanseeu0
-
Site architecture change: +30,000 404s in GWT
So recently we decided to change the URL structure of our online e-commerce catalogue, to make it easier to maintain in the future. But since the change, we have (partially expected) +30K 404s in GWT. When we did the change, I was building 301 redirects from our Apache server logs, but it's just escalated. Should I be concerned with "plugging" these 404s, by either removing them via the URL removal tool or carrying on doing 301 redirections? It's quite labour-intensive, and there are no incoming links to most of these URLs, so is there any point? Thanks, Ben
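If the 301s do need doing in bulk, generating the directives from an old-to-new mapping is far less labour-intensive than hand-editing — a minimal sketch (Python, hypothetical URLs; the mapping would come from the old catalogue's URL list):

```python
def redirect_lines(url_map):
    """Turn an {old_path: new_path} mapping into Apache Redirect directives,
    sorted so the generated config is stable between runs."""
    return ["Redirect 301 %s %s" % (old, new)
            for old, new in sorted(url_map.items())]

# Example: two renamed catalogue URLs (hypothetical)
lines = redirect_lines({
    "/catalogue/old-widget.html": "/products/widget",
    "/catalogue/old-gadget.html": "/products/gadget",
})
```

The output lines can be pasted into an .htaccess or vhost config, which scales to tens of thousands of URLs without manual work per URL.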
White Hat / Black Hat SEO | bjs20100
-
'Stealing' link juice from 404s
As you all know, it's valuable but hard to get competitors to link to your website. I'm wondering if the following could work: Sometimes I spot that a competitor is linking to a certain external page, but he made a typo in the URL (e.g. the competitor meant to link to awesomeglobes.com/info-page/ but the link says aewsomeglobes.com/info-page/). Could I then register the typo domain and 301 it to my own domain (i.e. aewsomeglobes.com/info-page/ to mydomain.com/info-page/) and collect the link juice? Does it also work if the link is a root domain?
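Mechanically, the typo-domain redirect is straightforward — a hedged Apache sketch using the question's example domains, forwarding every path one-to-one so the deep link lands on the matching page:

```apache
# On the typo domain (aewsomeglobes.com): 301 every request
# to the same path on your own domain
RewriteEngine On
RewriteRule ^(.*)$ https://mydomain.com/$1 [R=301,L]
```

Whether Google actually credits link equity passed through a freshly registered typo domain is a separate question from whether the redirect works.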
White Hat / Black Hat SEO | RBenedict0
-
Link Farms and The Relationship between 2 domain with a 301 Redirect
I have an interesting scenario: Domain A was worked on by a disreputable offshore SEO company. The owner of Domain A came to me for my assistance and an evaluation of how the offshore company was doing. I concluded that he should terminate the relationship immediately. One of the bad things they did was register Domain A with a LOT of link farms. I started working on a new site, and eventually we decided to go with Domain B (a better but closely related domain name to Domain A). I added a nice new site and had my client write clean, relevant information for it. We've done all legitimate, above-board, by-Google's-recommendations SEO for Domain B. I have a series of 301 redirects from Domain A to Domain B. Since April 24th, organic search results have plummeted. I see many incoming links via Webmaster Tools from the massive link farms, but those link farms have Domain A in their databases, not Domain B. My question: is Domain B inheriting the link juice from Domain A insofar as the incoming links are showing up in Webmaster Tools as directly related to Domain A? Should I sever the ties with Domain A altogether? Thanks.
White Hat / Black Hat SEO | KateZDCA1
-
Should I 301 Redirect a Site with an 'Unnatural Link' Warning?
Hey Fellow Mozzers, I have recently been approached by a new client that has been issued with an 'Unnatural Link' warning and lost almost all of their rankings. Open Site Explorer shows a ton of spammy links, all using main-keyword anchor text, and there are way too many of them to even consider manually getting them removed. There are two glimmers of hope for the client: the first is that the spammy links are dropping off at a rate of about 25 per week; the second is that they own both the .com and the .co.uk domain for their business. I would really appreciate some advice on the best way to handle this. Should I:
1. Wait it out for some of the spammy links to drop off, whilst at the same time pushing social media and building some good clean links using the URL and brand as anchor text, then submit a reconsideration request?
2. Switch the website over from the .com domain to the .co.uk domain and carry out a 301 redirect?
3. Switch the website over from the .com to the .co.uk without doing a redirect and start again for the client with a clean slate? I would still register an address change via Webmaster Tools.
4. Add a duplicate site on the .co.uk domain, leave the .com site in place, but rel="canonical" the entire domain over to the .co.uk?
Any advice would be very much appreciated. Thanks
Ade.
White Hat / Black Hat SEO | AdeLewis0
-
301 Redirect ASP code
Hi, I have a script, detailed below, that 301 redirects based upon different queries:

<% If (Request("offset") = "") Then %>
<% If Request("keywords") = "" AND Request("s") <> "" AND Request("j") <> "" Then 'Sector and Location NOT NULL %>
<% If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
     Response.Status = "301 Moved Permanently"
     Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
     Response.End
   End If %>
<% End If %>
<% If Request("keywords") = "" AND Request("s") <> "" AND Request("j") = "" Then 'Sector NOT NULL and Location NULL %>
<% If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion"))) Then
     Response.Status = "301 Moved Permanently"
     Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion")))
     Response.End
   End If %>
<% End If %>
<% If Request("keywords") = "" AND Request("s") = "" AND Request("j") <> "" Then 'Sector NULL and Location NOT NULL %>
<% If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
     Response.Status = "301 Moved Permanently"
     Response.AddHeader "Location", "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
     Response.End
   End If %>
<% End If %>
<% End If %>

But this still allows both the www and non-www versions of these pages to render in the browser, which is resulting in duplicate content. On my home page I use:

<% If InStr(Request.ServerVariables("SERVER_NAME"), "www") = 0 Then
     Response.Status = "301 Moved Permanently"
     Response.AddHeader "Location", "http://www." & Request.ServerVariables("HTTP_HOST") & "/"
     Response.End
   End If %>

Is there a good way to combine these, so that I still get all of the rules of the first script whilst also redirecting any non-www versions to the www version? In other words,

domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))

would redirect to

www.domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))

Thanks in advance
White Hat / Black Hat SEO | TwoPints0