Should You Keep Out-of-Stock Items Active on Your Site?
-
If you have sold-out products that will never come back in stock, should you remove those items and URLs from your sitemap and site, or keep them active with a sold-out image? The purpose would be that search engines might see your site as larger due to the extra products and URLs.
-
Also, it's important to remember that a person who bought the product in the past may want to view their purchase history for some reason (to see descriptions and such).
-
Before you keep these types of pages, you should determine whether they have any links or traffic. If they are pulling traffic that you would not otherwise receive, you can use the page to tell the history of the item (such as an antique or other one-of-a-kind piece).
If it is a standard product, such as a pair of running shoes that has been replaced by a different model, you can make the page informative: explain that the item was replaced by a new model and describe the improvements that were made.
Both of the above showcase your helpfulness and knowledge.
However, if the page has no links and pulls no traffic, delete it and 301-redirect its URL. You don't need useless pages on your site.
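The keep-or-redirect decision above can be sketched as a simple rule. This is a hypothetical helper, not anyone's actual implementation; the function name, fields, and the "any links or traffic" threshold are illustrative assumptions:

```python
from typing import Optional, Tuple

def handle_discontinued_page(inbound_links: int, monthly_visits: int,
                             replacement_url: Optional[str],
                             category_url: str) -> Tuple[str, Optional[str]]:
    """Return (action, redirect_target) for a permanently sold-out product page."""
    if inbound_links > 0 or monthly_visits > 0:
        # The page earns links or traffic: keep it and repurpose the content
        # (item history, or "replaced by model X" plus the improvements made).
        return ("keep_and_repurpose", None)
    # No links, no traffic: remove the page and 301 the URL so any
    # stray visitors land somewhere useful instead of a 404.
    target = replacement_url if replacement_url else category_url
    return ("delete_and_redirect", target)
```

A direct replacement product is the best redirect target; failing that, the category page keeps the visitor on relevant inventory.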
-
I would keep the sold-out items on my site and add a "we also recommend" section below the item on the product page to show the client that you have items similar to the one that is sold out. I wouldn't overlay a sold-out image; instead, I would code it so that if an item is sold out, the purchase button is replaced by a sold-out button/image. Depending on the products you carry, some of your sold-out items may carry SEO weight and show up for relevant keyword searches; you don't want to lose that.
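The button swap described here is easy to sketch as a single template decision. This is an illustrative example only; the function name and CSS classes are made-up assumptions, not part of any real platform:

```python
def render_cta(in_stock: bool, product_id: str) -> str:
    """Return the HTML call-to-action for a product page."""
    if in_stock:
        return ('<button class="btn-buy" data-product="{0}">'
                'Add to Cart</button>').format(product_id)
    # Sold out: swap the purchase button itself rather than stamping
    # a "sold out" image over the product photo.
    return '<span class="btn-sold-out">Sold Out</span>'
```

The rest of the page (description, images, "we also recommend" block) renders the same either way, so the URL keeps its content and any rankings it has earned.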
-
It depends on the product. If the item is unique, such as a book that is now out of print, we keep it on the site with a pop-up pointing people to either a new version or the related category.
If it is a t-shirt design, we remove the item and redirect it to the category page for the item unless there is a new item that is a direct replacement.
You have to consider your audience. People will probably be grateful to know that a particular book they were looking for is out of print, but they probably don't care that a specific color and cut of shirt is unavailable.
-
I agree with Virage, as long as it makes business sense and the volume of your OOS (out-of-stock) products doesn't exceed that of your in-stock products. I would not want a site with 500 OOS products and only 50 in-stock products.
Stop linking to that page from your navigation (obviously); then, if somebody does navigate to your OOS product page from elsewhere on the web, they can see it's OOS, see related products, and so on. That's a much better user experience, in my opinion, than a 404.
-
I do keep out-of-stock product pages active for the very reason you mentioned: it's more unique content for the search engines to read. Also, if someone is searching for my out-of-stock item, I would still want them to find my site, because it's very likely we would have a similar alternative they could purchase instead.
If anything, our product pages always include a ton of information, including PDFs and pictures, that just seem helpful from a consumer's POV. Even if they do not end up purchasing said product from us, they can still research it with us!
So it really depends on the nature of your website and your products, but there is great value in retaining unique content. If your product page is filled with useful product information, I'd say definitely keep it available for the search engines, and consider linking to alternative in-stock options for your visitors to pursue as well!
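If you do keep sold-out pages live for search engines, one optional, concrete step is to declare the item's availability in the page's structured data: schema.org defines `OutOfStock` as an `ItemAvailability` value on an `Offer`. A minimal sketch, with placeholder product details:

```python
import json

def product_jsonld(name: str, description: str, in_stock: bool) -> str:
    """Build a JSON-LD Product snippet with explicit availability."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "description": description,
        "offers": {
            "@type": "Offer",
            "availability": ("https://schema.org/InStock" if in_stock
                             else "https://schema.org/OutOfStock"),
        },
    }
    return json.dumps(data, indent=2)
```

Embedding this in a `<script type="application/ld+json">` tag tells search engines the page is intentionally live but the item is unavailable, rather than leaving them to infer it.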