Should You Keep Out Of Stock Items Active On Your Site?
-
If you have sold-out products that will never come back in stock, should you remove those items and URLs from your sitemap and site, or should you keep them active with a sold-out image? The idea would be that search engines will think your site is larger because of the number of products and URLs you have.
-
Also, it's important to remember that a person who bought the product in the past may want to view their purchase history for some reason (to see descriptions and such).
-
Before you keep these types of pages, you should determine whether they have any links or traffic. If they are pulling traffic that you would not otherwise receive, you can use the page to tell the history of the item (such as an antique or other one-of-a-kind piece).
If it is a standard product, such as a pair of running shoes that has been replaced by a different model, you can make the page informative: explain that the item was replaced by a newer model and describe the improvements that were made.
Both of the above showcase your helpfulness and knowledge.
However, if the page has no links and pulls no traffic, delete it and redirect the URL. You don't need useless pages on your site.
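As a rough illustration, the delete-and-redirect step might look like this in a Flask app; the route, the slugs, and the RETIRED_PRODUCTS mapping are all hypothetical placeholders for your own setup:

```python
# Hypothetical sketch: 301-redirect retired product URLs (Flask assumed).
from flask import Flask, abort, redirect

app = Flask(__name__)

# Old product slug -> best remaining destination (all entries hypothetical).
RETIRED_PRODUCTS = {
    "acme-runner-2019": "/category/running-shoes",
}

@app.route("/product/<slug>")
def product_page(slug):
    if slug in RETIRED_PRODUCTS:
        # A 301 signals a permanent move, so any links or traffic the old
        # URL earned are passed along to the destination page.
        return redirect(RETIRED_PRODUCTS[slug], code=301)
    abort(404)  # placeholder; a live product would be rendered here instead
```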
-
I would keep the sold-out items on my site and add a "we also recommend" section below the item on the product page to show the customer that you have items similar to the one that is sold out. I wouldn't add a sold-out image; instead, I would code it so that if an item is sold out, the purchase button is replaced by a sold-out button/image. Depending on the products you carry, some of your sold-out items may carry SEO weight and show up for relevant keyword searches; you don't want to lose that.
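A minimal sketch of that button swap in Python; the product fields and markup are hypothetical, so treat this as the shape of the logic rather than drop-in code:

```python
# Hypothetical sketch: swap the purchase button for a sold-out notice.
def purchase_button_html(product: dict) -> str:
    if product.get("stock", 0) > 0:
        return f'<button class="buy" data-sku="{product["sku"]}">Add to Cart</button>'
    # The page itself stays live and indexable; only the call to action
    # changes, and the "we also recommend" section renders right below it.
    return '<button class="sold-out" disabled>Sold Out</button>'
```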
-
It depends on the product. If the item is unique, such as a book that is now out of print, we keep it on the site with a pop-up pointing people to either a new version or the related category.
If it is a t-shirt design, we remove the item and redirect it to the category page for the item unless there is a new item that is a direct replacement.
You have to consider your audience. People will probably be grateful to know that a particular book they were looking for is out of print, but they probably don't care that a specific color and cut of shirt is unavailable.
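The decision rule described above could be sketched as follows; the field names are hypothetical:

```python
# Hypothetical sketch: pick a redirect target for a discontinued item.
def redirect_target(product: dict) -> str | None:
    if not product.get("discontinued"):
        return None  # still relevant (e.g., out-of-print book): keep the page live
    if product.get("replacement_url"):
        return product["replacement_url"]  # a direct successor exists
    return product["category_url"]  # otherwise fall back to the category page
```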
-
I agree with Virage, as long as it makes business sense and the volume of your OOS products is not greater than that of your in-stock products. I would not want to have 500 OOS products and 50 in-stock products on a site.
Stop linking to that page from your navigation (obviously); then, if somebody does navigate to your OOS product page from elsewhere on the web, they can see it's OOS, see related products, and so on. In my opinion, that's a much better user experience than a 404.
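As a rough sketch of that behaviour, again assuming Flask and a Jinja2 template; load_product and find_related are hypothetical stand-ins for your own catalog lookups:

```python
# Hypothetical sketch: answer 200 with alternatives for OOS items, and
# reserve 404 for URLs that never existed (Flask and product.html assumed).
from flask import Flask, abort, render_template

app = Flask(__name__)

def load_product(slug):
    """Hypothetical catalog lookup; returns None for unknown slugs."""
    catalog = {"acme-runner-2020": {"name": "Acme Runner 2020", "stock": 0}}
    return catalog.get(slug)

def find_related(product):
    """Hypothetical similar-item query for the related-products block."""
    return []

@app.route("/product/<slug>")
def show_product(slug):
    product = load_product(slug)
    if product is None:
        abort(404)  # the URL never existed, so a 404 is the right answer
    # Out of stock is not an error: return 200 and surface alternatives.
    return render_template("product.html", product=product,
                           in_stock=product["stock"] > 0,
                           related=find_related(product))
```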
-
I do keep out-of-stock product pages active for the very reason you mentioned: it's more unique content for the search engines to read. Also, if someone is searching for my out-of-stock item, I would still want them to find my site, because it is very likely we would have a similar alternative they may purchase instead.
If anything, our product pages always include a ton of information, including PDFs and pictures, that is simply helpful from a consumer's POV. Even if they do not end up purchasing said product from us, they can still research it with us!
It thus depends on the nature of your website and your products, but there is great value in retaining unique content. If your product page is filled with useful product information, I'd say definitely keep it available for the search engines, and consider linking to alternative in-stock options for your visitors to pursue as well!