Hundreds of thousands of 404s on expired listings - issue.
-
Hey guys,
We have a conundrum with a large e-commerce site we operate. Classified listings older than 45 days are throwing up 404s - hundreds of thousands, maybe millions. Note that Webmaster Tools caps its count at 100,000.
Many of these listings receive links.
Classified listings that expired less than 45 days ago show other possible products to buy, based on an algorithm.
It is not possible for Google to crawl expired listing pages from within our site. They are indexed because they were crawled before they expired, which means that many of them still show in search results.
-> My thought at this stage, for usability reasons, is to replace the 404s with content (other product suggestions) and add a meta noindex, in order to help our crawl equity and get the pages we really want indexed prioritised.
-> Another consideration is to 301 each expired listing to its place in the category hierarchy to pass possible link juice. But since many of these listings are findable in Google, we feel it is not a great user experience.
-> Or shall we just leave them as 404s? Google sort of says that's OK.
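For what it's worth, the first option (a 200 page with suggestions plus noindex) could be sketched roughly like this. This is a hypothetical Python handler purely for illustration; the function name and HTML are made up, and the `X-Robots-Tag` response header is simply the HTTP equivalent of a `<meta name="robots">` tag:

```python
# Hypothetical sketch of option 1: serve the expired listing as a 200
# with product suggestions, but tell crawlers not to index it.
# The X-Robots-Tag response header is the HTTP equivalent of
# <meta name="robots" content="noindex, follow">.

def expired_listing_response(suggestions_html):
    """Return (status, headers, body) for an expired listing page."""
    headers = {
        "Content-Type": "text/html; charset=utf-8",
        # noindex drops the page from the index over time; follow lets
        # crawlers still pass through the suggested-product links.
        "X-Robots-Tag": "noindex, follow",
    }
    body = (
        "<html><head><title>Listing expired</title></head><body>"
        "<h1>This listing has expired</h1>"
        + suggestions_html
        + "</body></html>"
    )
    return 200, headers, body
```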
Very curious on your opinions, and how you would handle this.
Cheers,
Croozie.
P.S. I have read other Q&As regarding this, but given our large volumes and situation, I thought it was worth asking, as I'm not satisfied that the solutions offered would match our needs.
-
Wow! Thanks Ryan.
I'm sure it won't surprise you to know that I'm always reading eagerly when I see you respond to a question as well.
-
Thanks Ian, good to know. Again, good confirmation.
-
Hi Sha,
Spot on. Yes, that was my original thinking; then I switched to the school of 200s with meta noindex. But having you guys confirm this makes me realise that doing 301s to the parent category is most certainly the way to go.
Permanently redirecting will have the added benefit of effectively 'de-indexing' the original classifieds and, of course, throwing a ton of link juice over to the category levels.
What a wonderful, helpful community!
Many thanks,
Croozie.
-
Sha, your responses consistently offer outstanding, actionable advice. I love them because they offer such great ideas and demonstrate a lot of experience.
-
Hi Croozie,
Awesome work once again from Ryan!
Since your question feels like a request for suggestions on "how" to create a solution, just wanted to add the following.
When you say "classified listings" I hear "one-off, here for a while, gone in 45 days" content.
If that is the case, then no individual expired listing will ever be matched identically with another (unless it happens to be a complete duplicate of the original listing).
This would mean it would certainly be relevant to send any expired listing to a higher-order category page. If your site structure has a clear hierarchy, this is very easy to do.
For example:
If your listing URL were something like http://www.mysite.com/listings/home/furniture/couches/couch-i-hate.php, then you could use URL rewrites to strip out the file name and 301 the listing to http://www.mysite.com/listings/home/furniture/couches/, which in most cases will offer a perfectly suitable alternative for the user.
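To make that concrete, here is a minimal sketch in Python of computing the 301 target by stripping the file name. In practice the rewrite would normally live in your web server configuration; this just illustrates the transformation:

```python
from urllib.parse import urlsplit

def parent_category_url(listing_url):
    """Strip the listing's file name and return the parent category URL,
    i.e. the Location target for the 301 redirect."""
    parts = urlsplit(listing_url)
    # Drop the final path segment (the listing file), keep the category path.
    category_path = parts.path.rsplit("/", 1)[0] + "/"
    return f"{parts.scheme}://{parts.netloc}{category_path}"
```

So the couch listing above would redirect to its /couches/ category page.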
There is another alternative you could consider if you have a search program built in - you could send the traffic to a relevant search. In the above example, mysite.com/search.php?s=couch.
Hope that helps,
Sha
-
We are now doing something similar with our site. We have several thousand products that have been discontinued and didn't think about how much link juice we were throwing away until we got Panda pounded. It's amazing how many things you find to fix when times get tough.
We started with our most popular discontinued products and are 301 redirecting them to either a new equivalent or the main category if no exact match can be found.
We are also going to be reusing the same product pages for annual products instead of creating new pages each year. Why waste all that link juice from past years?
-
If you perform a redirect, I recommend you offer a 301 header response, not a 200. The 301 response will let Google and others know the URL should be updated in their database. Google would then offer the new URL in search results. Additionally any link value can be properly forwarded to the new page.
-
Thanks Ryan,
Massive response! Awesome!
It's interesting that you talk a lot about the 301's.
Are you suggesting this would be far preferable to simply producing a 200 status code page listing product choices based on an algorithm, which is what we currently offer our customers for listings that expired less than 45 days ago?
I suppose, to clarify, I'm worried that if we were to do that (produce 200 status code pages), crawl equity would be reduced: we would be wasting a lot of Google's bandwidth on 200 status pages when it could be better spent crawling and indexing more recent pages.
Whereas with 301's to relevant products as you suggest, we solve that issue.
BTW, our 404 pages offer the usual navigation and search options.
Cheers,
Croozie.
-
Hi Croozie.
The challenge with your site is the volume of pages. Most large sites with 100k+ pages have huge SEO opportunities. Ideally you need a team which can manually review every page of your site to ensure it is optimized correctly. Such a team would be a large expense which many site owners choose to avoid. The problem is your site quality and SEO are negatively impacted.
Whenever a page is removed from your site or otherwise becomes unavailable, a plan should be in place PRIOR to removing the page. The plan should address the simple question: how will we handle traffic to the page, whether it comes from a search engine, a bookmark, or a link? The suggested answer is the same whether your site has 10 pages or a million pages:
- if the product is being replaced with a very similar product, or you already carry one, then you can choose to 301 the page to the new product. If the products are truly similar, the 301 redirect is a win for everyone.
Example A: You offer a Casio watch model X1000. You stop carrying this watch and replace it with Casio watch model X1001. It is the same watch design but the new model has a slight variation such as a larger dial. Most users who were interested in the old page would be interested in the new page.
Example B: You offered the 2011 version of the Miami Dolphins T-shirt. It is now 2012 and you have the 2012 version of the shirt which is a different design. You can use a 301 to direct users to the latest design. Some users may be unhappy and want the old design, but it is still probably the right call for most users.
Example C: You discontinue the Casio X1000 and do not have a very close replacement. You could 301 the page to the Casio category page, or you could let it 404.
The best thing to do in each case is to put on your user hat and ask yourself what would be the most helpful thing you can do to assist a person seeking the old content. There is absolutely nothing wrong with allowing a page to 404. It is a natural part of the internet.
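The decision process in Examples A to C could be summarised in a small helper function. This is a sketch only; the lookup tables for replacements and categories are hypothetical, not a real API:

```python
# Sketch of the removed-page decision above: prefer a close replacement
# (Examples A and B), fall back to the category page (Example C),
# otherwise let the URL return a 404. The dictionaries are assumptions.

def removed_page_action(sku, replacements, categories):
    """Return ('301', target_url) or ('404', None) for a removed product."""
    if sku in replacements:       # a very similar successor product exists
        return "301", replacements[sku]
    if sku in categories:         # no close match: the category page helps
        return "301", categories[sku]
    return "404", None            # nothing suitable: a 404 is fine
```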
One last point. Be sure your 404 page is optimized, especially considering how many 404s you present. The page should have the normal site navigation along with a search function. Help users find the content they seek.