Old product pages - eComm Site
-
Hello,
Geeks.com currently has approx. 194k pages in Google's index (approx. 30k supplemental).
We have many thousands of old product URLs that have gone out of stock, never to "see the light of day" again. 14 years' worth!
Should we be 301'ing all old product pages that go out of stock if we know for certain we will never carry that SKU again?
If we were to do a "mass" 301 of 30k+ URLs, how would Google or other search engines react to that?
Could there be any negative implications to doing so?
What is considered best practice for eComm sites? I imagine we are not alone in this type of situation.
Thank you in advance.
Michael B.
-
Mike,
I agree with Alan that it is a serious issue that warrants some attention and planning. Worst case scenario, the expired pages return a 404 and you're missing a big opportunity to boost the rest of the site. Best case scenario, you 301 or link to category or cross-promotional pages to pass PR and visitors to the next most relevant page/category.
The 301 would accomplish this, but like Alan said you run the risk of inadvertently creating redirect loops if there's no long-term planning for potentially thousands of pages and/or categories.
-
If we're talking about thousands of pages falling off, then yes, to me that's a high priority. If you go the 301 route, they should go to the highest page in the chain the product would be associated with that's still relevant to the topical intent and is the closest match.
So if it's a laser mouse, I wouldn't redirect to the top "desktop computers" or even the "laser mouse" category, but I would 301 it to the mouse optical/trackball category page.
The reason for this is two-fold: it's low enough in the food chain to be highly related, but not so specific that you'd end up in a bad loop of redirects if the current laser mouse sub-category disappears altogether.
That, then, maintains at least some of the original page authority and boosts the parent category.
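The "walk up to the nearest still-relevant category" rule above can be sketched as a small resolver. This is a minimal illustration, assuming a hypothetical category tree with made-up names (the Geeks.com site's real categories and URLs are not known here); the point is that recomputing the 301 target from whatever category is still live avoids the redirect loops mentioned above.

```typescript
type CategoryId = string;

interface Category {
  url: string;
  parent?: CategoryId; // next step up the chain, if any
  active: boolean;     // false once a category is retired
}

// Hypothetical category tree for illustration only.
const categories: Record<CategoryId, Category> = {
  "laser-mouse": { url: "/c/laser-mouse", parent: "mice-optical-trackball", active: false },
  "mice-optical-trackball": { url: "/c/mice-optical-trackball", parent: "input-devices", active: true },
  "input-devices": { url: "/c/input-devices", parent: "desktop-computers", active: true },
  "desktop-computers": { url: "/c/desktop-computers", active: true },
};

// Walk up the chain until we hit an active category. Because the 301
// target is recomputed from whatever is still live, retiring a
// sub-category later never creates a loop of stale redirects.
function redirectTarget(start: CategoryId, fallback = "/search"): string {
  const seen = new Set<CategoryId>();
  let id: CategoryId | undefined = start;
  while (id !== undefined && categories[id] && !seen.has(id)) {
    seen.add(id);
    const cat = categories[id];
    if (cat.active) return cat.url;
    id = cat.parent;
  }
  return fallback; // nothing relevant survives: uniform landing page
}
```

So a dead laser-mouse page 301s to the mouse/trackball category rather than all the way up to "desktop computers", exactly as described above.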
-
Replied above Andrew - thanks again!
-
Alan and Andrew - thank you for the thought out replies.
1) How serious an issue would you consider this in the first place? Meaning, if this were a site you were maintaining, would this be high on your list of priorities?
2) If we just did a simple 301 to the highest category the product lived within, would that accomplish what you are saying above, Alan?
This product below will be out of stock within the next couple of days. There is value here, as it's garnered a few root domain links, etc.
Is it accurate that once a product is out of stock and removed from site navigation, these pages are no longer crawled and no longer part of the site's architecture, and therefore the rest of the site will no longer benefit from the links they have accumulated over time?
Wouldn't it then be best to preserve the page's authority and 301 it to boost the parent category?
ex: http://www.geeks.com/details.asp?InvtId=M261VP-R
Hopefully I am not confusing you too much! lol
I look forward to your response.
Thank You,
MB
-
Oh Hey Andrew - great link to Rand's Whiteboard Friday on that. I hadn't seen that one. Looks like he covered both our concepts.
-
Andrew's got one path to consider. I've got another. My own most recent example is with a real estate site that has 100,000 property pages that all currently result in a 404 not found. Yes, that's 100k dead pages. So I too feel your pain.
What I recommend to clients is to 301 based on category-level criteria. So, for example, whatever the highest-level category a product had been in, that old page should 301 to the current category page, if one exists. The 301 should append a unique identifier to the new URL for this situation - something like #NLC (for "no longer carried") - the # sign being the key, because the fragment never reaches the server. You can then have a bit of script at the top of the content area of those pages that checks the URL for #NLC and, when it's present, shows visitors a box communicating that the product is no longer carried and inviting them to browse your current inventory in that category.
Doing this would also require having a canonical URL tag on each category page, just to cover the bases. While anything after the # sign should be ignored as far as causing duplicate content conflicts, it's still best practice to have the canonical URL there in the header.
When no current category exists, then I'd send visitors by 301 to a uniform page (either a product search page or otherwise) yet with the same #NLC string and message.
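A rough sketch of that #NLC scheme, split into its two halves. Everything here is illustrative: the #NLC marker comes from the suggestion above, but the function names and paths are invented for the example. The server side only builds the 301 target (the fragment is never sent back in requests); the client side decides whether to show the "no longer carried" box, which in a real page would read `window.location.hash`.

```typescript
// Server side: compute the 301 Location for a discontinued SKU.
// categoryUrl is the current category page if one exists, else null.
function nlcRedirect(categoryUrl: string | null, searchUrl = "/search"): string {
  // Appending #NLC costs nothing on the backend, because browsers
  // never send the fragment to the server; only client code sees it.
  return `${categoryUrl ?? searchUrl}#NLC`;
}

// Client side: decide whether to show the "no longer carried" notice.
// In a real page, pass window.location.hash.
function shouldShowNlcNotice(hash: string): boolean {
  return hash === "#NLC";
}
```

Both halves are pure functions, so the same logic covers the normal case (redirect to the category) and the fallback case from the paragraph above (redirect to a uniform search page when no category survives).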
Of course, getting either Andrew's suggestion or mine implemented will be up to the skills of the programmers doing the implementation. That's a lot of coding that has to be done accurately and thoroughly tested.
-
I've had several large (100k+ page) clients with similar issues, and despite the usual "each case is different" disclaimer, I've seen a lot of success with keeping the out-of-stock items' URLs active but replacing the content with a cross-promotional message along the lines of "We're sorry, but this item is out of stock. Might we interest you in product X or content Y?" Depending on your e-commerce platform's ability to dynamically generate different merchandising options, this may be easy or difficult.
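This keep-the-URL-live alternative can be sketched as a simple render decision: the page keeps returning 200 (so it keeps its accumulated links), and only the body swaps to a merchandising template. The product shape and field names below are made up for illustration; any real platform would have its own.

```typescript
interface Product {
  sku: string;
  inStock: boolean;
}

interface PageResponse {
  status: number;
  template: "product" | "out_of_stock";
  suggestions: string[]; // URLs of cross-promoted items, if any
}

// The out-of-stock URL stays live with HTTP 200, so the link equity it
// has accumulated stays put; only the content becomes a "sorry, out of
// stock - might we interest you in..." cross-promotion page.
function renderProductPage(p: Product, related: string[]): PageResponse {
  if (p.inStock) {
    return { status: 200, template: "product", suggestions: [] };
  }
  return { status: 200, template: "out_of_stock", suggestions: related.slice(0, 3) };
}
```

The contrast with the 301 approach is just that first field: here the status stays 200 and the visitor never leaves the old URL.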
You can choose whether or not to keep these pages in your navigation structure, although I'd recommend removing them from your internal search results pages.
That's just my experience, but there's a great discussion thread on this Whiteboard Friday post (which I refer people to all the time): http://www.seomoz.org/blog/whiteboard-friday-expired-content