301, 302, 404 or 410
-
Hello,
We run an ecommerce site, and it's very common for products to be discontinued by the manufacturer. We used to leave the pages up to keep the content and link equity, but now we feel this is misleading for customers, so we have started removing pages for discontinued items. Customers trying to reach these pages get a friendly Product Not Found page with an explanation and links to categories. The shopping cart sends a 302 status code.
Google Webmaster Tools complained about "soft 404s" and apparently didn't like this. We tried changing to a 404 return code but couldn't get the friendly Product Not Found page to display. On top of that, GWT and SEOmoz started to complain about the 404 errors.
I think we've reached a solution where we can send a 301 and still display the desired Product Not Found page, which might be the best option; we'll see whether we get errors from SEOmoz or GWT. However, a 410 return code would probably be most correct. We'd like to salvage any link equity we can, but we also really want to be "good citizens" and do things right.
Should we really be sending a 410 in this case, even if we lose SEO equity, or are we OK with the 301 and the friendly information page?
Thanks,
Tom
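For what it's worth, the combination the question describes (a friendly page plus a real 404/410 status) is technically possible: a 4xx response can carry any HTML body. Here is a minimal, hypothetical WSGI sketch; the URL and handler names are illustrative, not from the site in question:

```python
# Hypothetical set of discontinued product URLs.
DISCONTINUED = {"/products/widget-2000"}

NOT_FOUND_HTML = b"""<html><body>
<h1>Product Not Found</h1>
<p>This product has been discontinued. Browse our categories instead.</p>
</body></html>"""

def app(environ, start_response):
    """WSGI sketch: serve a friendly error page with a true 410 status."""
    path = environ.get("PATH_INFO", "/")
    if path in DISCONTINUED:
        # 410 Gone: the page existed and was removed on purpose.
        # A custom HTML body is perfectly legal alongside a 4xx status,
        # and it avoids the "soft 404" complaint (a 200 with error text).
        start_response("410 Gone", [("Content-Type", "text/html")])
        return [NOT_FOUND_HTML]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html><body>Catalog home</body></html>"]
```

The same idea works with a 404 instead of a 410; the key point is that the friendly page and the correct status code are not mutually exclusive.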
-
I agree with David and Moosa.
Redirect to the category page only if it is genuinely useful for your visitors. In some cases you might want to put user experience above retaining link juice within that category, and either redirect elsewhere or create a custom page.
-
The best idea is to 301 to the removed product's category page, or to a custom page that explains why the page no longer exists and points out the best places on the site to find similar products.
-
I agree with David.
-
410 will kill the link equity. In cases similar to yours, I have implemented a 301 to the category page of the removed product. This gives users products similar to the one they were looking for, and retains link equity.
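Sketching the approach above as a simple lookup (the product and category URLs here are hypothetical), each removed product URL issues one permanent redirect to its category page:

```python
# Hypothetical map of removed product URLs to their category pages.
REMOVED_PRODUCT_TO_CATEGORY = {
    "/products/widget-2000": "/category/widgets",
    "/products/gizmo-pro": "/category/gizmos",
}

def redirect_app(environ, start_response):
    """WSGI sketch: 301 removed products to their category page."""
    path = environ.get("PATH_INFO", "/")
    target = REMOVED_PRODUCT_TO_CATEGORY.get(path)
    if target is not None:
        # 301 signals a permanent move, so link equity should follow
        # and users land on similar products instead of an error page.
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html><body>Catalog page</body></html>"]
```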
Related Questions
-
Deleting 301 Redirect URLs from the CMS
Hi everyone, would there be a negative SEO effect from deleting pages with 301 redirects in your CMS? Does anyone know the average time it takes for authority to transfer through a redirect? Thanks, Jon
White Hat / Black Hat SEO | JMSCC
-
Buy exact match domain and 301 worth it?
So there is this exact-match domain that gets about 500 visitors a day. It has a Trust Flow of 17 and a Citation Flow of 23, just a little lower than our own website's. The site covers one of our keywords and ranks on the second page of the SERPs. I am not interested in buying and running that website, but rather just in 301-redirecting all of its pages into our existing domain, onto relevant pages. So the 301s would point to relevant pages. The question is: would this strategy be worth it in today's SEO world, with current Google updates?
White Hat / Black Hat SEO | TVape
-
301 redirects for 3 top level domains using WP SEO Yoast
Hey guys, I have a custom-built website with a WP blog attached. The problem is that there are three top-level domains: zenory.co.nz, zenory.com and zenory.com.au. When I set up a 301 redirect I can only enter one domain; usually I redirect from zenory.com/blog/oldpage to zenory.com/newpage. For example, I have just moved Phone Psychic Readings from the blog over to the main site. However, there's still an issue I'm trying to clean up: I'm finding backlinks between my three domains that cross-link to each other, which I was told can look spammy to Google. For example, co.nz links many pages to com.au. While cleaning this up, I keep running into the same question when creating 301 redirects from the blog. Say I'm on zenory.co.nz/blog/oldblogpost and click on a blog post: it redirects me to zenory.com/newarticlepost, because I redirected it to .com. How can I make sure each redirect goes back to the right domain, to save myself from showing these cross-domain backlinks? Would gratefully appreciate any assistance on this tricky situation. Cheers, Just
White Hat / Black Hat SEO | edward-may
-
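One way to keep each redirect on its own TLD, sketched here as a plain Python function rather than a Yoast rule (the blog paths are hypothetical): build the redirect target from the host the request arrived on, instead of hard-coding one domain.

```python
# Hypothetical old-blog-path -> new-site-path map; the path part is the
# same on every TLD, so one table serves all three domains.
BLOG_MOVES = {
    "/blog/phone-psychic-readings": "/phone-psychic-readings",
}

def blog_redirect(host, path):
    """Return (status, location) for a request.

    Preserves the requesting domain, so a visitor on zenory.co.nz/blog/...
    is redirected within zenory.co.nz rather than over to zenory.com.
    """
    new_path = BLOG_MOVES.get(path)
    if new_path is not None:
        return "301 Moved Permanently", "https://" + host + new_path
    return "200 OK", None
```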
Sudden influx of 404s affecting SERPs?
Hi Mozzers, We've recently updated a site of ours that really should be doing much better than it currently is. It has a good backlink profile (some spammy links were recently removed), has age on its side, and has had a tremendous amount of SEO work (deep-level schema.org markup, site speed and much, much more). Because of this, we assumed thin, spammy content was the issue; we removed those pages and created new, content-rich pages in the meantime. For example, we removed a link-wheel page, https://www.google.co.uk/search?q=site%3Asuperted.com%2Fpopular-searches, which as you can see had a lot of results (circa 138,000), and added relevant pages for each of our entertainment 'categories':
http://www.superted.com/category.php/bands-musicians - this page has some historical value, so the Mozbar shows some Page Authority here.
http://www.superted.com/profiles.php/wedding-bands - an example of a page linked from the page above.
These are brand-new URLs designed to provide relevant content. The old link-wheel pages contained pure links (usually 50+ on every page) and no textual content, yet were still driving small amounts of traffic to our site. The new pages contain quality, relevant content (i.e. our list of Wedding Bands - what else would a searcher be looking for?), but some haven't been indexed or ranked yet. With this in mind, I have a few questions:
How do we drive traffic to these new pages? We've started to create industry-relevant links through our own members to the top-level pages (http://www.superted.com/category.php/bands-musicians). The link profile here should flow to some degree to the lower-level pages, right? We have almost 500 'sub-categories', and getting quality links to all of these is unrealistic in the short term.
How long until we should be indexed? We've seen an 800% drop in organic search traffic since removing our spammy link-wheel pages. This is to be expected to a degree, as these were the only real pages driving traffic. However, we saw this drop (and removed the pages) almost exactly a month ago; surely we should be re-indexed and re-algo'ed by now?
Are we still being algorithmically penalised? The old spammy pages are still indexed in Google (138,000 of them!) despite returning 404s for a month. When will they drop out of the rankings? If Google believes they still exist and we were indeed being punished for them, that would explain why we're still not ranking, but how do we get rid of them? I've tried submitting a manual URL removal via WMT, but to no avail. Should I 410 the pages?
Have I been too hasty? I removed the spammy pages in case they were causing a penalty. There was also some potential for duplicate content between the old and new pages; popular-searches.php/event-services/videographer may have clashed with profiles.php/videographer, for example. Should I have kept those pages while we waited for the new pages to be indexed?
Any help would be extremely appreciated; I'm pulling my hair out that after following 'guidelines', we seem to have been punished for it. I assumed we just needed to give Google time to re-index, but a month should surely be enough for a site with historical SEO value such as ours? If anyone has any clues about what might be happening here, I'd be more than happy to pay for a genuine expert to take a look, and I'd love to reward any potential ideas with a 'good answer'. Many, many thanks in advance. Ryan.
White Hat / Black Hat SEO | ChimplyWebGroup
-
301, 404 or 410? What is the best practice?
Hi, I'm currently working on a project to correct some really bad practices from years of different SEOs. Basically, they had made around 1,500 pages for delivery counties and towns, changing only three words on every page. Apart from the duplicate content issues, this has really hammered the site in the latest round of Panda updates. I've pulled the pages, but I'm in several minds about how best to fix this. The pages won't ever be used again, so I'm thinking a 410 code would be best, but after reading another post (http://moz.com/community/q/server-redirect-query) I'm not sure whether I should just let them go to 404s if anyone ever finds them. Incidentally, I'm disavowing over 1,100 root domains, so it's extremely unlikely anyone will find links to them out there.
White Hat / Black Hat SEO | eminent
-
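If the pages are gone for good, the 404-vs-410 decision can live in one place. A rough sketch of that logic (the `/delivery/` prefix is hypothetical, standing in for the retired county/town pages):

```python
# Hypothetical URL prefix shared by the ~1,500 retired thin pages.
RETIRED_PREFIX = "/delivery/"

def status_for(path, known_paths):
    """410 for deliberately retired pages, 404 for anything else unknown.

    410 tells crawlers the removal is intentional and permanent, which
    tends to get the URLs dropped faster than a generic 404.
    """
    if path.startswith(RETIRED_PREFIX):
        return "410 Gone"        # intentionally removed, never returning
    if path not in known_paths:
        return "404 Not Found"   # ordinary missing page
    return "200 OK"
```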
Link Farms and the Relationship Between Two Domains with a 301 Redirect
I have an interesting scenario: Domain A was worked on by a disreputable offshore SEO company. The owner of Domain A came to me for my assistance and an evaluation of how the offshore company was doing. I concluded that he should terminate the relationship immediately. One of the bad things they did was register Domain A with a LOT of link farms. I started working on a new site, and eventually we decided to go with Domain B (a better, but closely related, domain name to Domain A). I added a nice new site and had my client write clean, relevant information for it. We've done only legitimate, above-board, by-Google's-recommendations SEO for Domain B, and I have a series of 301 redirects from Domain A to Domain B. Since April 24th, organic search results have plummeted. In Webmaster Tools I see many incoming links from the massive link farms, but those link farms have Domain A in their databases, not Domain B. My question: is Domain B inheriting the link juice from Domain A, insofar as the incoming links show up in Webmaster Tools as directly related to Domain A? Should I sever ties with Domain A altogether? Thanks.
White Hat / Black Hat SEO | KateZDCA
-
301 Redirect ASP code
Hi, I have a script, detailed below, that 301-redirects based upon different queries:

<% If (Request("offset") = "") Then %>

  <% If Request("keywords") = "" AND Request("s") <> "" AND Request("j") <> "" Then 'Sector and Location NOT NULL %>
  <%
  If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
    Response.End
  End If
  %>
  <% End If %>

  <% If Request("keywords") = "" AND Request("s") <> "" AND Request("j") = "" Then 'Sector NOT NULL and Location NULL %>
  <%
  If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion"))) Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion")))
    Response.End
  End If
  %>
  <% End If %>

  <% If Request("keywords") = "" AND Request("s") = "" AND Request("j") <> "" Then 'Sector NULL and Location NOT NULL %>
  <%
  If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
    Response.End
  End If
  %>
  <% End If %>

<% End If %>

But this still allows both the www and non-www versions of these pages to render in the browser, which results in duplicate content. On my home page I use:

<%
If InStr(Request.ServerVariables("SERVER_NAME"), "www") = 0 Then
  Response.Status = "301 Moved Permanently"
  Response.AddHeader "Location", "http://www." & Request.ServerVariables("HTTP_HOST") & "/"
  Response.End
End If
%>

Is there a good way to combine these, so that I still get all of the rules of the first script whilst also redirecting any non-www versions to the www version? In other words, "domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) would redirect to "www.domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))). Thanks in advance.
White Hat / Black Hat SEO | TwoPints