Ecommerce: Best practice for expired product pages
-
I'm optimizing a pet supplies site (http://www.qualipet.ch/) and have a question about the best practice for expired product pages.
We have thousands of products, and hundreds of our offers exist for just a few months. Currently, when a product is no longer available, the site just returns a 404. Now I'm wondering what a better solution could be:
1. When a product disappears, a 301 redirect is set up to the category page it belonged to (e.g., a leash would redirect to dog accessories; see the sketch below).
2. After a product disappears, a customized 404 page appears, listing similar products (but the server still returns a 404 status).
I prefer solution 1, but am afraid that having hundreds of new redirects each month might look strange. But then again, returning lots of 404s to search engines is also not the best option.
Do you know the best practice for large ecommerce sites where they have hundreds or even thousands of products that appear/disappear on a frequent basis? What should be done with those obsolete URLs?
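To make option 1 concrete, here is a minimal sketch of the routing logic I have in mind (Python/Flask; the URL structure, slugs, and the expired-product mapping are made up for illustration, not our actual setup):

    from flask import Flask, abort, redirect

    app = Flask(__name__)

    # Hypothetical mapping maintained by the shop system:
    # expired product slug -> the category page it belonged to.
    EXPIRED_PRODUCTS = {
        "dog-leash-classic": "/dog-accessories/",
    }

    @app.route("/products/<slug>")
    def product(slug):
        category = EXPIRED_PRODUCTS.get(slug)
        if category:
            # Permanent redirect: sends visitors and link equity
            # to the category page.
            return redirect(category, code=301)
        # Fall through to normal product rendering; 404 as a placeholder.
        abort(404)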
-
Unfortunately, manually.
-
Yep, on two different sites we did thousands of redirects at a time with no issues. In one case it was annual and in the other quarterly, but I don't see any reason monthly would be any different.
Definitely post your findings after implementation or maybe even write a YouMoz post about what you find out!
-
Good luck
-
Thanks for your thoughts, guys.
@Igal@Incapsula: I like your 302 idea! That might actually make a lot of sense for some products that are short-lived.
@Matthew: Good to know that lots of 301s were not an issue on your sites. Are you talking about thousands of those, though?
Most importantly, I will have to find something that can be automated and doesn't require much extra work. I will probably go for 301s and remove them after a few months.
Remind me to post my findings here after implementation :)
-
(+1) for the redirect-to-main-category-page option. I did this several times, including for a very large tourism site that had a LOT of "inventory" changes (we're talking dozens to hundreds per day), and had great results.
One thing I would suggest is to look into doing 302s and removing the redirects after 2-3 months.
The reason for this is purely practical. In our case, after just a few months, we were looking at many thousands of redirects, and that is not something you want to "carry around".
My suggestion lets you still make use of link juice for removed pages and, at the same time, keep a manageable redirect profile. As a safety net, you can have a generic "404 >>> 301 >>> homepage" rule underneath.
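Roughly, the logic I mean looks like this (a Python/Flask sketch; the 90-day TTL, slugs, dates, and paths are placeholders, not from a real deployment):

    from datetime import date, timedelta
    from flask import Flask, redirect

    app = Flask(__name__)

    REDIRECT_TTL = timedelta(days=90)  # drop redirects after ~3 months

    # Removed product slug -> (target category page, date it was removed).
    REMOVED = {
        "cat-tree-deluxe": ("/cat-furniture/", date(2013, 5, 1)),
    }

    @app.route("/products/<slug>")
    def removed_product(slug):
        entry = REMOVED.get(slug)
        if entry is not None:
            target, removed_on = entry
            if date.today() - removed_on <= REDIRECT_TTL:
                # Temporary redirect while the rule is still fresh.
                return redirect(target, code=302)
        # Safety net: the "404 >>> 301 >>> homepage" rule for everything else.
        return redirect("/", code=301)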
-
Hey,
In general, I would opt for option 1, as that is the most scalable solution. Whenever I've done this, I've not seen any issues with having lots of 301s appear. Given the shorter life span of those product pages, you probably won't have lots of links (or social signals, etc.) going to them, and I think that helps explain why I've not seen issues redirecting this many pages.
That being said, if you do have lots of links or social signals referencing a certain product page, that is when I'd opt for the custom page listing similar products. I've had success doing this for high-traffic product pages that have been removed, as it can help maintain the sale. In terms of the status code, it really depends. If the page still offers unique content relevant to the search queries and links referencing it, I'd deliver a status 200 (it is still a good page worthy of attention). If the content isn't all that unique, and it is more for people (to maintain the sale) than for search, I would have that page deliver a status 410 (saying it is gone).
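As a rough sketch of that decision in code (Python/Flask; the link-count threshold and the link-data lookup are stand-ins for whatever link data you actually have):

    from flask import Flask

    app = Flask(__name__)

    LINK_THRESHOLD = 5  # hypothetical cut-off for "lots of links"

    def external_link_count(slug):
        # Stub: wire this up to your own link data export.
        return 0

    @app.route("/products/<slug>")
    def retired_product(slug):
        body = f"This product is gone - here are similar products to {slug}."
        if external_link_count(slug) >= LINK_THRESHOLD:
            # Unique, link-worthy content: keep it a normal page.
            return body, 200
        # Mostly for people (to maintain the sale), not search: mark it gone.
        return body, 410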
I hope that helps!
Matthew
-
Thanks, Kevin. So you're also going with option 1.
Do you make those redirects manually, or is it automated?
I should add that it's a Magento webshop, and we definitely need some automation, since we're talking about hundreds of product pages.
-
We have a customized search page for each category. When a product has been discontinued, we 301 redirect its page to the category search page.
We used to 301 redirect to a list of similar products (by running a search and capturing the URL with the search term), but it proved too time-consuming, as these products traditionally did not sell that well and did not bring in much traffic.
Not saying it's the best way, but this is what we do.
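For what it's worth, the generation side can be a small script run whenever the catalog updates. A sketch (Python; the CSV columns and the nginx-style output are assumptions, not our exact setup):

    import csv

    def build_rules(export_path):
        """Read a discontinued-products export and emit one 301 rule each."""
        rules = []
        with open(export_path, newline="") as f:
            # Expected columns (hypothetical): product_path, category_search_path
            for row in csv.DictReader(f):
                rules.append(
                    f"rewrite ^{row['product_path']}$ "
                    f"{row['category_search_path']} permanent;"
                )
        return rules

    if __name__ == "__main__":
        for rule in build_rules("discontinued.csv"):
            print(rule)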