Hundreds of thousands of 404s on expired listings
-
Hey guys,
We have a conundrum with a large e-commerce site we operate. Classified listings older than 45 days are throwing up 404s - hundreds of thousands, maybe millions (note that Webmaster Tools caps its count at 100,000).
Many of these listings receive links.
Classified listings that are less than 45 days old show other possible products to buy, based on an algorithm.
It is not possible for Google to crawl expired listing pages from within our site. They are indexed because they were crawled before they expired, which means that many of them still show in search results.
-> My thought at this stage, for usability reasons, is to replace the 404s with content - other product suggestions - and add a meta noindex to preserve our crawl equity and get the pages we really want indexed prioritised.
-> Another consideration is to 301 each expired listing to the category hierarchy to pass possible link juice. But since many of these listings are findable in Google, we feel it would not be a great user experience.
-> Or, shall we just leave them as 404s? Google sort of says that's OK.
Very curious on your opinions, and how you would handle this.
Cheers,
Croozie.
P.S. I have read other Q&As regarding this, but given our large volumes and situation, I thought it was worth asking, as I'm not satisfied that the solutions offered would match our needs.
-
Wow! Thanks Ryan.
I'm sure it won't surprise you to know that I'm always reading eagerly when I see you respond to a question as well.
-
Thanks Ian, good to know. Again, good confirmation.
-
Hi Sha,
Spot on. Yes, that was my original thinking; then I switched to the school of 200s with meta noindex. But having you guys confirm this makes me realise that doing 301s to the parent category is most certainly the way to go.
Permanently redirecting will have the added benefit of effectively 'de-indexing' the original classifieds and, of course, throwing a ton of link juice over to the category levels.
What a wonderful, helpful community!
Many thanks,
Croozie.
-
Sha, your responses consistently offer outstanding, actionable advice with real value - full of great ideas and clearly backed by a lot of experience.
-
Hi Croozie,
Awesome work once again from Ryan!
Since your question feels like a request for suggestions on "how" to create a solution, just wanted to add the following.
When you say "classified listings" I hear "one-off, here for a while, gone in 45 days" content.
If that is the case, then no individual expired listing will ever be matched identically with another (unless it happens to be a complete duplicate of the original listing).
This would mean it would certainly be relevant to send any expired listing to a higher-order category page. If your site structure has a clear hierarchy, this is very easy to do.
For example:
If your listing URL were something like http://www.mysite.com/listings/home/furniture/couches/couch-i-hate.php, you could use URL rewrites to strip out the file name and 301 the listing to http://www.mysite.com/listings/home/furniture/couches/, which in most cases will offer a perfectly suitable alternative for the user.
There is another alternative you could consider if you have a search program built in - you could send the traffic to a relevant search. In the above example, mysite.com/search.php?s=couch.
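A rough sketch of both targets in Python, for clarity (the URLs and the `search.php` endpoint are just the hypothetical examples from this thread - the slug-to-keyword logic would need adapting to your own URL pattern):

```python
from urllib.parse import urlsplit, urlunsplit, urlencode
import posixpath

def category_url(listing_url):
    """301 target #1: strip the file name, leaving the parent category page."""
    parts = urlsplit(listing_url)
    parent = posixpath.dirname(parts.path)  # drops "couch-i-hate.php"
    if not parent.endswith("/"):
        parent += "/"
    return urlunsplit((parts.scheme, parts.netloc, parent, "", ""))

def search_url(listing_url, search_path="/search.php"):
    """301 target #2: an on-site search for the listing slug's first keyword."""
    parts = urlsplit(listing_url)
    slug = posixpath.basename(parts.path).rsplit(".", 1)[0]  # "couch-i-hate"
    keyword = slug.split("-")[0]                             # "couch"
    return urlunsplit((parts.scheme, parts.netloc, search_path,
                       urlencode({"s": keyword}), ""))
```

With the example listing above, `category_url` gives the couches category page and `search_url` gives mysite.com/search.php?s=couch.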
Hope that helps,
Sha
-
We are now doing something similar with our site. We have several thousand products that have been discontinued and didn't think about how much link juice we were throwing away until we got Panda pounded. It's amazing how many things you find to fix when times get tough.
We started with our most popular discontinued products and are 301 redirecting them to either a new equivalent or the main category if no exact match can be found.
We are also going to be reusing the same product pages for annual products instead of creating new pages each year. Why waste all that link juice from past years?
-
If you perform a redirect, I recommend you return a 301 header response, not a 200. The 301 response tells Google and others that the URL should be updated in their index; Google will then show the new URL in search results, and any link value can be properly forwarded to the new page.
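To make the difference concrete, here is a minimal WSGI sketch (the redirect map and paths are hypothetical) that answers an expired listing with a 301 status line and a Location header rather than a 200 page:

```python
# Hypothetical mapping of expired listing paths to their category pages.
REDIRECTS = {
    "/listings/home/furniture/couches/couch-i-hate.php":
        "/listings/home/furniture/couches/",
}

def app(environ, start_response):
    """Minimal WSGI app: expired listings get a 301 to the mapped
    category page; anything unknown gets a plain 404."""
    path = environ.get("PATH_INFO", "/")
    target = REDIRECTS.get(path)
    if target is not None:
        # The 301 status line is what tells crawlers to update the
        # indexed URL and forward link value; a 200 page would not.
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Listing not found"]
```

The same app can be served for testing with the standard library's `wsgiref.simple_server`.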
-
Thanks Ryan,
Massive response! Awesome!
It's interesting that you talk a lot about the 301's.
Are you suggesting this would be preferable to simply producing a 200 status code page listing product choices based on an algorithm - which we currently offer our customers for listings expired less than 45 days?
To clarify: I'm worried that if we produced 200 status code pages, our crawl equity would suffer - we would be wasting a lot of Google's bandwidth on 200 pages when it could be better spent crawling and indexing our more recent pages.
Whereas with 301's to relevant products as you suggest, we solve that issue.
BTW, our 404 pages offer the usual navigation and search options.
Cheers,
Croozie.
-
Hi Croozie.
The challenge with your site is the volume of pages. Most large sites with 100k+ pages have huge SEO opportunities. Ideally you need a team which can manually review every page of your site to ensure it is optimized correctly. Such a team would be a large expense which many site owners choose to avoid. The problem is that your site quality and SEO are negatively impacted.
Whenever a page is removed from your site or otherwise becomes unavailable, a plan should be in place PRIOR to removing the page. The plan should address a simple question: how will we handle traffic to the page, whether it comes from a search engine, a bookmark, or a link? The suggested answer is the same whether your site has 10 pages or a million:
- if the product is being replaced with a very similar product, then you can choose to 301 the page to the new product. If the products are truly similar, the 301 redirect is a win for everyone.
Example A: You offer a Casio watch model X1000. You stop carrying this watch and replace it with Casio watch model X1001. It is the same watch design but the new model has a slight variation such as a larger dial. Most users who were interested in the old page would be interested in the new page.
Example B: You offered the 2011 version of the Miami Dolphins T-shirt. It is now 2012 and you have the 2012 version of the shirt which is a different design. You can use a 301 to direct users to the latest design. Some users may be unhappy and want the old design, but it is still probably the right call for most users.
Example C: You discontinue the Casio X1000 and do not have a very close replacement. You could 301 the page to the Casio category page, or you could let it 404.
The best thing to do in each case is to put on your user hat and ask yourself what would be the most helpful thing you can do to assist a person seeking the old content. There is absolutely nothing wrong with allowing a page to 404. It is a natural part of the internet.
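The three examples boil down to a simple decision flow, which could be sketched like this (the SKUs and paths in the usage below are made up for illustration):

```python
def handle_discontinued(sku, replacements, categories):
    """Decide what to serve for a discontinued product URL.

    replacements: sku -> URL of a near-identical successor (Examples A/B)
    categories:   sku -> URL of the product's category page (Example C)
    """
    if sku in replacements:            # a close substitute exists: 301 to it
        return 301, replacements[sku]
    if sku in categories:              # no substitute: 301 to the category page
        return 301, categories[sku]
    return 404, None                   # no sensible target: letting it 404 is fine
```

So a discontinued Casio X1000 with an X1001 successor gets a 301 to the successor; one with only a category mapping gets a 301 to the category; anything else is left to 404.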
One last point. Be sure your 404 page is optimized, especially considering how many 404s you present. The page should have the normal site navigation along with a search function. Help users find the content they seek.