How do I prevent 404s from hurting my site?
-
I manage a real estate broker's site on which the individual MLS listing pages continually become 404s as properties are sold. So, on a site with 2200 pages indexed, roughly half are 404s at any given time. What can I do to mitigate any potential harm from this?
-
I support Jane's advice here to make a custom 404 that is as beneficial as possible for the user.
I would only worry about 301 redirecting an old property page to its city/neighborhood subcategory if the page shows up in the 404 report in Google Webmaster Tools with an external link pointing at it that is worth saving. That is a process you could run about once per month or quarter.
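As a rough sketch of that triage in Python, assuming a hypothetical CSV export of crawl errors with "url" and "external_links" columns (the real Webmaster Tools export may be laid out differently):

```python
import csv

def urls_worth_redirecting(crawl_errors_csv):
    """Return sold-property URLs that still have external links pointing
    at them; only these are worth a 301 to their city/neighborhood page.

    Assumes a hypothetical export with 'url' and 'external_links'
    columns -- adjust to whatever your Webmaster Tools export contains.
    """
    keep = []
    with open(crawl_errors_csv, newline="") as f:
        for row in csv.DictReader(f):
            if int(row.get("external_links", 0) or 0) > 0:
                keep.append(row["url"])
    return keep

# Run this once a month or quarter, then add 301s for the survivors.
print(urls_worth_redirecting("crawl_errors.csv"))
```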
-
Property sites use a range of techniques to handle this. Whilst browsing a little this morning I saw 404s, 410 Gone responses, 302 redirects and 200 OK responses showing a largely blank page (that last one definitely not recommended).
Others leave the listing live but show that it's no longer on the market, e.g. http://www.rightmove.co.uk/property-to-rent/property-29033160.html
It doesn't sound like you can use this last option, although it would allow you to recycle URLs for properties like rentals that often come back on the market.
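To make those options concrete, here's a minimal sketch of the decision logic; the listing fields and statuses are illustrative, not any particular platform's API:

```python
def listing_response(listing):
    """Pick an HTTP status for a property listing page.

    `listing` is a hypothetical dict; returns (status_code, note).
    """
    if listing["on_market"]:
        return 200, "show the live listing"
    if listing["may_relist"]:
        # Rentals etc. that come back on the market: keep the URL live
        # but flag it as off-market, so the URL can be recycled later.
        return 200, "show listing marked 'no longer on the market'"
    # Permanently gone: 410 tells crawlers the removal is deliberate,
    # which can get the URL dropped from the index a little faster
    # than a plain 404.
    return 410, "gone"

print(listing_response({"on_market": False, "may_relist": True}))
```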
If you must go with a 404, try to make it useful, as Dave says. Can you customise the 404 page, perhaps pulling in information dynamically based upon the listing that was deleted?
-
I'd create a custom 404 page which runs a similar search. Whilst you say you can't avoid the 404s, what you can do is make a 404 page that is useful to both the user and Google.
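Here's a rough sketch of that idea, assuming the neighborhood can be read from the dead URL's first path segment (an invented URL structure); the page stays a true 404 but suggests live alternatives:

```python
from urllib.parse import urlparse

# Hypothetical in-memory index standing in for the MLS search backend.
LISTINGS = [
    {"neighborhood": "midtown", "beds": 3, "url": "/midtown/123-oak-st"},
    {"neighborhood": "midtown", "beds": 2, "url": "/midtown/9-elm-ave"},
]

def custom_404(request_path):
    """Build a helpful 404: suggest live listings from the same
    neighborhood segment of the dead URL, but keep the 404 status so
    search engines still drop the page."""
    segments = urlparse(request_path).path.strip("/").split("/")
    neighborhood = segments[0] if segments else ""
    similar = [item["url"] for item in LISTINGS
               if item["neighborhood"] == neighborhood]
    body = ("Sorry, that property has sold. Similar listings: "
            + ", ".join(similar))
    return 404, body  # status stays 404; only the body is enriched

print(custom_404("/midtown/55-maple-dr"))
```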
Also make sure that your site no longer links to old content.
Run Screaming Frog to check those response codes.
-
I should have mentioned that I don't have that option. The pages are dynamically added to the site via a plugin which pulls MLS data from the local real estate listing board. (The plugin is dsIDXpress by Diverse Solutions.)
-
You could set up 301 redirects from the sold property URLs to another relevant page, such as other properties available in the same neighborhood/town/city. Or possibly even to a search results page that contains very similar properties in terms of square footage, bedrooms, baths, etc.
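As a sketch of that second idea, the 301 target could be built from the sold listing's attributes; the /search endpoint and its parameter names below are placeholders, not anything your plugin necessarily exposes:

```python
from urllib.parse import urlencode

def redirect_target(sold):
    """Map a sold listing to a search-results URL for similar homes.

    `sold` is a hypothetical dict of listing attributes; the /search
    endpoint and its parameters stand in for your own site's.
    """
    params = {
        "city": sold["city"],
        "beds": sold["beds"],
        "baths": sold["baths"],
        "sqft_min": int(sold["sqft"] * 0.8),  # +/-20% size band
        "sqft_max": int(sold["sqft"] * 1.2),
    }
    return "/search?" + urlencode(params)  # serve with a 301 status

print(redirect_target({"city": "Austin", "beds": 3, "baths": 2, "sqft": 1800}))
```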