ECommerce: Best Practice for expired product pages
-
I'm optimizing a pet supplies site (http://www.qualipet.ch/) and have a question about the best practice for expired product pages.
We have thousands of products and hundreds of our offers just exist for a few months. Currently, when a product is no longer available, the site just returns a 404. Now I'm wondering what a better solution could be:
1. When a product disappears, a 301 redirect is set up to the category page it belongs to (e.g. a leash would redirect to dog accessories).
2. After a product disappears, a custom 404 page appears, listing similar products (but the server still returns a 404).
I prefer solution 1, but am afraid that having hundreds of new redirects each month might look strange. But then again, returning lots of 404s to search engines is also not the best option.
Do you know the best practice for large ecommerce sites where they have hundreds or even thousands of products that appear/disappear on a frequent basis? What should be done with those obsolete URLs?
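For illustration, here is a minimal sketch of what option 1 could look like at the application level (the URL paths and product names are invented, not our real structure): any discontinued product that has a mapping gets a 301 to its category page, and everything else keeps the normal 404 behaviour.

# Minimal sketch of option 1; paths are hypothetical examples only.
DISCONTINUED = {
    "/hunde/leine-basic": "/hunde/zubehoer",   # leash -> dog accessories
    "/katzen/spielmaus":  "/katzen/spielzeug",
}

def resolve(path):
    """Return (status, location) for a requested product URL."""
    target = DISCONTINUED.get(path)
    if target is not None:
        return 301, target   # option 1: permanent redirect to the category page
    return 404, None         # default: fall through to the (custom) 404 page

print(resolve("/hunde/leine-basic"))   # (301, '/hunde/zubehoer')
print(resolve("/hunde/irgendwas"))     # (404, None)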
-
Unfortunately, manually.
-
Yep, on two different sites we did thousands of redirects at a time with no issues. In one case it was annual and in the other it was quarterly, but I don't see any reason monthly would be any different.
Definitely post your findings after implementation or maybe even write a YouMoz post about what you find out!
-
Good luck
-
Thanks for your thoughts guys.
@Igal@Incapsula: I like your 302 idea! That might actually make a lot of sense for some products that are short-lived.
@Matthew: Good to know that lots of 301s were not an issue on your sites. Are you talking about thousands of those, though?
Most importantly, I will have to find something that can be automated and doesn't require much extra work. I will probably go for 301s and remove them after a few months.
Remind me to post my learnings here after implementation :)
-
(+1) For the redirect-to-main-category-page option. I did this several times, including for a very large tourism site that had a LOT of "inventory" changes (we are talking dozens to hundreds per day), and had great results.
One thing I would suggest is to look into using 302s and removing the redirects after 2-3 months.
The reason for this is purely practical. In our case, after just a few months, we were looking at many thousands of redirects, and that is not something you want to "carry around".
My suggestion allows you to still make use of link juice for removed pages and, at the same time, keep a manageable redirect profile. As a safety net you can have a generic "404 >>> 301 >>> Homepage" rule underneath.
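As a rough sketch of that idea (the paths and dates are made up), each redirect carries an expiry so it drops out of the map after 2-3 months, and anything that would otherwise 404 falls through to the homepage rule:

from datetime import date

# Hypothetical entries: removed path -> (category target, date the 302 expires)
TEMP_REDIRECTS = {
    "/tours/rome-weekend":  ("/tours/rome",  date(2013, 9, 1)),
    "/tours/paris-daytrip": ("/tours/paris", date(2013, 10, 1)),
}

def resolve_missing(path, today=None):
    """Called only for URLs that no longer exist on the site."""
    today = today or date.today()
    rule = TEMP_REDIRECTS.get(path)
    if rule and today <= rule[1]:
        return 302, rule[0]   # still inside the 2-3 month window
    return 301, "/"           # safety net: 404 >>> 301 >>> homepage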
-
Hey,
In general, I would opt for option 1 as that would be the most scalable solution. Whenever I've done this, I've not seen any issues with having lots of 301s appear. Given the shorter lifespan of those product pages, you probably won't have many links (or social signals, etc.) pointing to them, and I think that helps explain why I've not seen issues redirecting this many pages.
That being said, if you do have lots of links or social signals referencing a certain product page, that is when I'd opt for the custom page listing similar products. I've had success doing this for high-traffic product pages that have been removed, as it can help maintain the sale. In terms of the status code, it really depends. If you are still offering unique content relevant to the search queries and links referencing that page, I'd deliver a status 200 (it is still a good page worthy of attention). If the content isn't all that unique, and the page is more for people (to maintain the sale) than for search, I would have it deliver a status 410 (saying it is gone).
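To make that decision logic concrete, here is a toy helper (the threshold is invented; in practice you'd base it on your own link and social data):

# Toy decision helper mirroring the reasoning above; the threshold is made up.
def status_for_removed_product(linking_domains, content_is_unique):
    if linking_domains < 5:
        return "301 to category page"                    # low-value page: consolidate into the category
    if content_is_unique:
        return "200, custom page with similar products"  # still worth ranking on its own
    return "410, custom page with similar products"      # keep the sale, tell search it's gone

print(status_for_removed_product(0, False))   # 301 to category page
print(status_for_removed_product(40, True))   # 200, custom page with similar products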
I hope that helps!
Matthew
-
Thanks Kevin, so you're also going with option 1.
Do you make those redirects manually, or is it automated?
I should add that it's a Magento webshop and we definitely need some automation, since I am talking about hundreds of product pages.
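One way to automate this without touching Magento internals (just a sketch, and the CSV column names are invented): export the discontinued products with their old URL and target category URL, then generate the redirect rules from that export, e.g. as Apache RedirectPermanent lines or whatever format your rewrite setup expects.

import csv

def build_redirects(csv_path):
    """Turn a CSV export of discontinued products into redirect rules."""
    rules = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            old_path = row["old_url_path"]        # e.g. /hunde/leine-basic
            category = row["category_url_path"]   # e.g. /hunde/zubehoer
            rules.append(f"RedirectPermanent {old_path} {category}")
    return rules

if __name__ == "__main__":
    for rule in build_redirects("discontinued_products.csv"):
        print(rule)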
-
We have a custom search page for each category. When a product has been discontinued, we 301 redirect its page to the category search page.
We used to 301 redirect to a list of similar products (by doing a search and capturing the URL with the search term), but it proved too time-consuming, as these products traditionally did not sell that well and did not bring in much traffic.
Not saying it's the best way, but this is what we do.