What is the best strategy to SEO Discontinued Products on Ecommerce Sites?
-
RebelsMarket.com is a marketplace for alternative fashion. We have hundreds of sellers who have listed thousands of products. Over 90% of the items do not generate any sales, and about 40% of the products have been on the website for more than three years.
We want to clean up the catalog and remove all listings older than two years that have not generated any sales. What is the best practice for removing thousands of listings from an ecommerce site? Do we 404 these products and show similar items?
Your help and thoughts are much appreciated.
-
James, I would still mark these as out of stock.
If these products don't get any organic search traffic anyway, it is OK to redirect them.
The message above was for established products that have been indexed by Google over a long period of time.
Please let me know if you have any questions. Also, if someone answers the question to your satisfaction, you should mark their comment as a good answer.
-
Hi Cole,
These are not out of stock products. These are items that don't sell and have not sold in years. We have listings older than five years with no sales at all.
You would mark them as out of stock?
-
I have countless clients that get HUGE traffic from products they have "discontinued".
You worked so hard to get those products ranking on Google, so why throw away all of that traffic with a 301 redirect to a different product, causing high bounce rates, or worse, sending your visitors to a discontinued-product page?
I would simply put an "Out of Stock" notice on the product page and list related products below it to direct customers to similar items, or perhaps offer an "add to waitlist" option, so if you decide to bring the product back you have immediate customers.
Amazon is a perfect example. For the most part, they do not delete or remove products. If you look up a product that is no longer in stock at Amazon, it will say out of stock while still letting you see the many reviews on that product, or see other sellers offering similar products.
-
Hey,
If a product is out-of-stock temporarily, best practice is to link to alternative products, for example:
- Newer models or versions.
- Similar products from other brands.
- Other products in the same category that match in quality and price.
- The same product in different colours.
This provides a good service to customers and helps search engines find and understand related pages more easily.
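The alternative-selection logic described above can be sketched in code. This is a minimal, hypothetical example (the `Product` model and `pick_alternatives` helper are assumptions, not anyone's actual platform code): it prefers in-stock products in the same category, ranking the same brand and a similar price first.

```python
# Hypothetical sketch of choosing alternatives for an out-of-stock product.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    category: str
    brand: str
    price: float
    in_stock: bool

def pick_alternatives(item, catalog, max_results=4, price_tolerance=0.25):
    """Rank in-stock products in the same category, preferring the same
    brand and a similar price, per the list above."""
    candidates = [
        p for p in catalog
        if p.in_stock and p is not item and p.category == item.category
    ]
    def score(p):
        same_brand = 0 if p.brand == item.brand else 1
        price_gap = abs(p.price - item.price) / max(item.price, 1)
        return (same_brand, price_gap)
    ranked = sorted(candidates, key=score)
    # Prefer products in a similar price band; fall back to the full ranking
    # if nothing is close enough.
    close = [p for p in ranked
             if abs(p.price - item.price) / max(item.price, 1) <= price_tolerance]
    return (close or ranked)[:max_results]
```

The exact ranking signals (brand, price band, colour) are a merchandising decision; the point is that the alternatives shown should genuinely match the product the visitor came for.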
If a product is out-of-stock permanently, there are three main options:
1. Return a 410 Gone (or 404 Not Found) status for the product.
Google understands that 410 and 404 pages are inevitable, but the problem with creating too many of them is that it reduces the time search engine crawlers will spend on the pages that actually should rank. If this option is implemented, there should ideally be signposts to related products on the Not Found page.
2. 301 (permanently) redirect the old product to an existing product (e.g. a newer version or close alternative).
A dynamically generated message should clearly display on the page, e.g. “Product X is no longer available. This is a similar product/the replacement product.” This option is only recommended if redirect chains can be minimised; if product turnover is high, the following could happen in a short timeframe:
- Product 1 no longer exists and gets 301 redirected to Product 2.
- Product 2 no longer exists and gets 301 redirected to Product 3.
- Now a redirect chain exists: Product 1 redirects to Product 2, which then redirects to Product 3. Product 1 would need to be updated to redirect directly to Product 3, skipping the intermediate redirect through Product 2.
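Collapsing chains like the one above can be automated if redirects are kept in a simple old-URL-to-new-URL map. A minimal sketch (the `flatten_redirects` helper is hypothetical, not a specific platform's API):

```python
# Sketch: collapse redirect chains so every old URL points straight at its
# final destination (Product 1 -> Product 3, not via Product 2).
def flatten_redirects(redirects):
    """redirects: dict mapping old URL -> new URL.
    Returns a copy where every entry points at its final target."""
    def resolve(url, seen):
        target = redirects.get(url)
        if target is None or target in seen:  # end of chain, or a loop
            return url
        seen.add(target)
        return resolve(target, seen)
    return {old: resolve(new, {old, new}) for old, new in redirects.items()}
```

Running a pass like this whenever a product is retired keeps every redirect a single hop, which is what you want for both crawlers and visitors.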
3. 301 (permanently) redirect the old product to its parent category.
A dynamically generated message should clearly display on the page, e.g. “Product X is no longer available. Please see similar products below.”
As categories are likely to change less often than products, this is potentially easier to implement than option 2.
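The three options above amount to a per-product decision at request time. A hedged sketch of that dispatch (names and signature are hypothetical, not a real framework's API):

```python
# Sketch: choose the HTTP response for a retired product URL.
def handle_retired_product(replacement_url=None, category_url=None):
    """Return (status, location) for a permanently unavailable product.
    Which strategy applies is a business decision made per product."""
    if replacement_url:    # option 2: 301 to a close alternative
        return 301, replacement_url
    if category_url:       # option 3: 301 to the parent category
        return 301, category_url
    return 410, None       # option 1: 410 Gone (show related products on the error page)
```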
-
I'd set up 301 redirects from the discontinued lines to the main section pages, so
https://www.domain.com/product-type/a-red-sweater
would redirect to
https://www.domain.com/product-type/
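If product URLs consistently follow a `/product-type/product-slug` pattern like the example above, the redirect target can be derived by trimming the last path segment. A minimal sketch (the `category_url` helper is hypothetical):

```python
# Sketch: map a discontinued product URL to its parent category URL,
# assuming URLs follow the /product-type/product-slug pattern shown above.
from urllib.parse import urlsplit, urlunsplit

def category_url(product_url):
    parts = urlsplit(product_url)
    segments = [s for s in parts.path.split("/") if s]
    if len(segments) < 2:
        return product_url  # nothing to trim; leave unchanged
    parent_path = "/" + "/".join(segments[:-1]) + "/"
    return urlunsplit((parts.scheme, parts.netloc, parent_path, "", ""))
```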
-
Can't speak for everyone, but I had this same thing come up with our ecommerce website. We added a feature to our store that allowed us to "discontinue" a product, meaning we removed it from on-site search and category listings. However, if you visited the page by direct URL, the product page would still load, say discontinued, and display a list of related products in the hope that the customer would not bounce.
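The "discontinue" flag described above boils down to two behaviours: hide the product from listings, but keep its page resolving with a notice and related items. A hedged sketch (all names are hypothetical, not a specific platform's code):

```python
# Sketch: a "discontinued" flag that hides a product from search/listings
# while its page still loads by direct URL with related products.
def visible_in_listings(product):
    """Discontinued products are excluded from site search and category pages."""
    return product["in_stock"] and not product["discontinued"]

def render_product_page(product, related):
    """The page still returns 200 by direct URL, with a notice and alternatives."""
    if product["discontinued"]:
        note = f"{product['name']} has been discontinued."
        suggestions = ", ".join(p["name"] for p in related)
        return 200, f"{note} You may also like: {suggestions}"
    return 200, product["name"]
```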