How do I prevent 404s from hurting my site?
-
I manage a real estate broker's site on which the individual MLS listing pages continually become 404s as properties are sold. So, on a site with 2,200 pages indexed, roughly half are 404s at any given time. What can I do to mitigate any potential harm from this?
-
I support Jane's advice here to make a custom 404 that is as beneficial as possible for the user.
I would only worry about 301-redirecting an old property page to its city/neighborhood subcategory if it shows up in the Google Webmaster Tools 404 report with an external link pointing at it that is worth saving. That's a process you could run about once per month or quarter.
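The monthly triage described above can be sketched as a small script. This is a hypothetical illustration, not anything specific to dsIDXpress: the URL shape and the link-count threshold are assumptions, and the input would come from a Webmaster Tools crawl-errors export.

```python
# Hypothetical monthly triage: of all sold-listing 404s, keep only the ones
# with external links worth saving, and map each to its subcategory page.

def pick_redirect_candidates(crawl_errors, min_external_links=1):
    """crawl_errors is {url: external_link_count}, e.g. exported from the
    Webmaster Tools 404 report. Returns {old_url: 301_target}."""
    redirects = {}
    for url, external_links in crawl_errors.items():
        if external_links < min_external_links:
            continue  # no link equity worth saving; let it 404
        # Assumed URL shape: /listings/<neighborhood>/<listing-slug>
        # -> redirect to the neighborhood subcategory.
        redirects[url] = url.rsplit("/", 1)[0] + "/"
    return redirects

errors = {
    "/listings/downtown/123-main-st": 4,  # has inbound links -> redirect
    "/listings/uptown/9-oak-ave": 0,      # no links -> leave as a 404
}
print(pick_redirect_candidates(errors))
# {'/listings/downtown/123-main-st': '/listings/downtown/'}
```

The output of a run like this could then be turned into 301 rules in whatever form your server takes them.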
-
Property sites use a range of techniques to handle this. Browsing a little this morning, I have seen 404s, 410 Gone responses, 302 redirects, and 200 OK responses showing a largely blank page (definitely not recommended).
Others leave the listing live but show that it's no longer on the market, e.g. http://www.rightmove.co.uk/property-to-rent/property-29033160.html
It doesn't sound like you can use this last option, although it would allow you to recycle URLs for properties like rentals that often come back on the market.
If you must go with a 404, try to make it useful, as Dave says. Can you customise the 404 page, perhaps pulling in information dynamically based upon the listing that was deleted?
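One way to pull in information dynamically is to parse whatever survives in the dead listing's URL and build a "similar listings" search link for the custom 404 page. This is only a sketch under an assumed URL structure (`/listings/<city>/<neighborhood>/<slug>`); the real paths and search parameters on your site will differ.

```python
from urllib.parse import urlencode

def suggest_search_url(requested_path):
    """Build a 'similar listings' search link for the custom 404 page,
    based on the segments left in the dead listing's URL."""
    parts = [p for p in requested_path.strip("/").split("/") if p]
    # Assumed URL shape: /listings/<city>/<neighborhood>/<listing-slug>
    if len(parts) >= 3 and parts[0] == "listings":
        city, neighborhood = parts[1], parts[2]
        return "/search/?" + urlencode({"city": city, "area": neighborhood})
    return "/search/"  # unrecognised path: fall back to the generic search

print(suggest_search_url("/listings/austin/hyde-park/405-elm-st"))
# /search/?city=austin&area=hyde-park
```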
-
I'd create a custom 404 page which runs a similar search. While you say you can't avoid the 404, what you can do is serve a 404 that is useful to both the user and Google. Also make sure that your site no longer links to old content.
Run Screaming Frog to check those response codes.
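Once you have a crawl export, a quick pass over the status codes of your expired-listing URLs can separate deliberate handling from the risky cases mentioned earlier (302s, and 200s on gone pages, i.e. soft 404s). A minimal sketch, assuming the input is a `{url: status_code}` mapping of expired listings only:

```python
def audit_status_codes(expired_listings):
    """Classify the response codes of expired-listing URLs.
    404, 410, and 301 are deliberate handling; anything else
    (200 soft-404s, 302s, 5xx) deserves a look."""
    report = {"ok": [], "review": []}
    for url, status in sorted(expired_listings.items()):
        if status in (301, 404, 410):
            report["ok"].append(url)
        else:
            report["review"].append(url)
    return report

crawl = {"/sold/1": 404, "/sold/2": 302, "/sold/3": 200, "/sold/4": 410}
print(audit_status_codes(crawl))
# {'ok': ['/sold/1', '/sold/4'], 'review': ['/sold/2', '/sold/3']}
```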
-
I should have mentioned that I don't have that option. The pages are dynamically added to the site via a plugin which pulls MLS data from the local real estate listing board. (The plugin is dsIDXpress by Diverse Solutions.)
-
You could set up 301 redirects from the sold property URLs to another relevant page, such as other properties available in the same neighborhood/town/city. Or possibly even to a search results page that contains very similar properties in terms of square footage, bedrooms, baths, etc.
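Building that "similar properties" redirect target could look something like the following. The field names and the search URL format are illustrative assumptions, not the plugin's actual parameters:

```python
from urllib.parse import urlencode

def similar_search_target(listing):
    """301 target for a sold listing: a search-results URL pre-filtered
    to comparable properties. Parameter names are hypothetical."""
    params = {
        "beds": listing["beds"],
        "baths": listing["baths"],
        # widen square footage by +/-10% so the search isn't empty
        "sqft_min": int(listing["sqft"] * 0.9),
        "sqft_max": int(listing["sqft"] * 1.1),
    }
    return "/search/?" + urlencode(params)

sold = {"beds": 3, "baths": 2, "sqft": 1800}
print(similar_search_target(sold))
# /search/?beds=3&baths=2&sqft_min=1620&sqft_max=1980
```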