What should I do with a large number of 'pages not found'?
-
One of my client sites lists millions of products and 100s or 1000s are de-listed from their inventory each month and removed from the site (no longer for sale). What is the best way to handle these pages/URLs from an SEO perspective? There is no place to use a 301.
1. Should we implement 404s for each one and put up with the growing number of 'pages not found' shown in Webmaster Tools?
2. Should we add them to the Robots.txt file?
3. Should we add 'nofollow' into all these pages?
Or is there a better solution?
Would love some help with this!
-
I would leave the pages up but mark them as "nofollow". When I worked in eCommerce, this was a great tactic. For UX purposes, you could try to steer people to similar products, but keep the originating page as "nofollow" or "noindex".
-
Thanks Jane and Lesley for your responses. Great ideas from you both. I think I'll keep the pages but change the content/buying options, as you've both suggested.
I had considered 410s and might fall back on this for historical URLs in the instance that we can no longer retrieve the content.
-
I always take notes from giants on how to handle things like this. Amazon is the giant in this arena, so what do they do? They do not disable the product; they leave it on the site as unavailable. I would do the same thing personally. What platform are you using, and does it have a suggested-products module/plugin? If so, it can be modified to be more prominent on pages that are disabled from selling. But I would keep the page and keep the authority of the page.
If you 301 it to another product, the search satisfaction level goes down and your bounce rate will rise. I would be careful with this, because Google wants to serve results that are relevant and what people are looking for.
The other option I would give is to return a 410 status code to get them de-indexed.
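If you do go the 410 route, it doesn't have to touch application code. Here's a minimal .htaccess sketch using Apache's mod_alias (the URL paths are made up for illustration; substitute however your de-listed product URLs are actually structured):

```apache
# "Redirect gone" returns a 410 Gone status; no target URL is needed
Redirect gone /products/discontinued-widget

# If de-listed products share a common path, one pattern covers them all
RedirectMatch gone ^/products/discontinued/
```

Google generally treats a 410 much like a 404, but it's a slightly stronger "gone for good" signal, so de-indexing can happen a bit faster.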
-
Hi Claire,
If you really can't 301, consider serving a page providing alternative products, a search function and an explanation of why the page's former content is no longer available. Many real estate websites are quite good at this: some maintain the URLs of properties that regularly go on the market (big city apartments, for example) but grey out the information to show a user that the property is not currently for lease. Other URLs will show properties in the former listing's post code.
Your robots.txt file is going to get out of control if you are having to add millions of pages to it on a regular basis, so I would personally not pursue that route.
-
Why aren't 301s an option?
Related Questions
-
What to do when half of my pages aren't being viewed?
My site is roughly 1,000 pages. I've begun refreshing older content, and I noticed about half of my pages have no incoming traffic. Should I look at combining some of these pages, 301 redirecting the former links to that new "bigger" page, and then having my home page show that new consolidated content? They don't have good backlinks either. Example layout now:
Home Page
- Restaurants [show list of cuisines]
- User clicks on Italian [show list of all Italian restaurants]
  - Choice 1
  - Choice 2
Even though my main page is seen by about 100,000 people a month, it doesn't seem like anyone is interested in going down that path, so none of the restaurants are clicked. How could I improve the user interface/experience and incorporate best Google practices? Thanks, Steve
Technical SEO | recoil
-
Is it good to redirect millions of pages to a single page?
My site has approximately 10 lakh (1 million) genuine URLs. But due to some unidentified bug, the site has created roughly 10 million irrelevant URLs. Since we don't know the origin of these non-relevant links, we want to redirect or remove all these URLs. Please suggest: is it good to redirect such a high number of URLs to the home page, or to throw a 404 for these pages? Or any other suggestions to solve this issue.
Technical SEO | vivekrathore
-
"One Page With Two Links To Same Page; We Counted The First Link" Is this true?
I read this today: http://searchengineland.com/googles-matt-cutts-one-page-two-links-page-counted-first-link-192718 I thought to myself, yep, that's what I've been reading on Moz for years (pity Matt could not confirm that's still the case for 2014). But reading through the comments, Michael Martinez of http://www.seo-theory.com/ pointed out that Matt says "...the last time I checked, was 2009, and back then -- uh, we might, for example, only have selected one of the links from a given page."
Which would imply that it is not always the first link. Michael goes on to say "Back in 2008 when Rand WRONGLY claimed that Google was only counting the first link (I shared results of a test where it passed anchor text from TWO links on the same page)" and then " In practice the search engine sometimes skipped over links and took anchor text from a second or third link down the page." For me this is significant. I know people that have had "SEO experts" recommend that they should have a blog attached to their e-commerce site and post blog posts (with no real interest for readers) with anchor text links to your landing pages. I thought that posting blog posts just for anchor text links was a waste of time if you are already linking to the landing page within the main navigation, as Google would see that link first. But if Michael is correct, then these types of anchor-text-link blog posts would have value. But who is right, Rand or Michael?
Technical SEO | PaddyDisplays
-
Pages with Duplicate Page Content Crawl Diagnostics
I have pages flagged with Duplicate Page Content in my Crawl Diagnostics. Can you tell me how to solve this, or suggest some helpful tools? Thanks
Technical SEO | nomyhot
-
Duplicate Page Title for a Large Listing Website
My company has a popular website that has over 4,000 crawl errors showing in Moz, most of them coming up as Duplicate Page Title. These duplicate page titles come from pages whose title is the main keyword followed by a location, such as:
"main keyword" North Carolina
"main keyword" Texas
... and so forth. These pages are ranked and get a lot of traffic. I was wondering what the best solution is for resolving these types of crawl errors without it affecting our rankings. Thanks!
Technical SEO | StorageUnitAuctionList
-
How do I 301 redirect a number of pages to one page
I want to redirect all pages in /folder_A /folder_B to /folder_A/index.php. Can I just write one or two lines of code to .htaccess to do that?
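For the scenario described, a couple of RedirectMatch lines should do it, assuming the server is Apache with mod_alias enabled. A sketch only, using the folder names from the question; the negative lookahead keeps /folder_A/index.php from redirecting to itself in a loop:

```apache
# Everything under /folder_B goes to /folder_A/index.php
RedirectMatch 301 ^/folder_B/ /folder_A/index.php

# Everything under /folder_A except index.php itself
RedirectMatch 301 ^/folder_A/(?!index\.php$). /folder_A/index.php
```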
Technical SEO | Heydarian
-
Mask links with JS that point to noindex'ed pages
Hi, in an effort to prepare our page for the Panda we dramatically reduced the number of pages that can be indexed (from 100k down to 4k). All the remaining pages are being equipped with unique and valuable content. We still have the other pages around, since they represent searches with filter combination which we deem are less interesting to the majority of users (hence they are not indexed). So I am wondering if we should mask links to these non-indexed pages with JS, such that Link-Juice doesn't get lost to those. Currently the targeted pages are non-index via "noindex, follow" - we might de-index them with robots.txt though, if the "site:" query doesn't show improvements. Thanks, Sebastian
Technical SEO | derderko
-
On Page 301 redirect for html pages
For PHP pages you've got:
<?php
header( "HTTP/1.1 301 Moved Permanently" );
header( "Location: http://www.example.com" );
?>
Is there anything for HTML pages? Or is placing this code in the .htaccess the only way to properly 301 redirect HTML pages?
redirect 301 /old/old.htm http://www.you.com/new.php
Thanks!
Technical SEO | shupester