404 or 410 status code after deleting a real estate listing
-
Hi there,
We manage a website which generates an overview and detail pages of listings for several real estate agents.
When these listings have been sold, they are removed from the overview and their detail pages are removed as well. These listings then appear as not found in the crawl error report in Google Search Console. The pages currently return 404s; would changing this to 410s solve the problem? And if not, what fix could take care of it?
-
Good answer, Dirk.
I like your idea of adding valuable, relevant content to the pages, Dirk - good thinking.
Personally, I'd rather let Google know these pages were removed intentionally and not due to errors, so 410 rather than leaving them as 404.
One thing to be mindful of, though, is how much crawl budget you're willing to give to these pages. If we're talking about a lot of pages in bulk, I'd be worried about how much crawl budget they'd eat up over time. As you point out, they'd likely drop in rank anyway due to the loss of internal links, so it may be that the cost to crawl budget isn't worth it.
Another solution (building on your idea, Dirk) would be to automate the process: when a listing is marked as sold, it is removed from the overview, other properties in the same area are added to the page (as you suggest), and then some time later (a month or two?) a 410 header is set.
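To make that concrete, here's a rough sketch of what that automation could look like on the application side, using Flask purely as an illustration - the route, the field names, the in-memory data and the 60-day grace period are all assumptions, not anything from the original question:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

from flask import Flask

app = Flask(__name__)
GRACE_PERIOD = timedelta(days=60)   # assumed "month or two" window before the 410


@dataclass
class Property:                     # stand-in for however listings are really stored
    title: str
    area: str
    sold_at: Optional[datetime] = None


# Hypothetical in-memory data; a real site would query its database here.
LISTINGS = {
    1: Property("Example house", "Springfield", sold_at=datetime(2016, 1, 15)),
    2: Property("Example flat", "Springfield"),
}


def similar_properties(prop: Property) -> list:
    """Hypothetical helper: other unsold properties in the same area."""
    return [p for p in LISTINGS.values() if p.area == prop.area and p.sold_at is None]


@app.route("/listing/<int:listing_id>")
def listing(listing_id):
    prop = LISTINGS.get(listing_id)
    if prop is None:
        return "Listing not found", 404

    if prop.sold_at is None:
        # Still for sale: serve the normal detail page with a 200.
        return f"<h1>{prop.title}</h1>"

    if datetime.utcnow() - prop.sold_at < GRACE_PERIOD:
        # Recently sold: keep a 200 page that notes the sale and lists
        # similar properties in the same area (Dirk's suggestion).
        names = ", ".join(p.title for p in similar_properties(prop)) or "none right now"
        return f"<h1>{prop.title} has been sold</h1><p>Similar nearby: {names}</p>"

    # Grace period over: tell crawlers the page is intentionally gone.
    return f"{prop.title} is no longer available", 410
```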
The other option would be to 301 the old pages back to the area page for the properties (perhaps with something like a Bootstrap alert saying the property is sold but others in the area are available). This would pass link juice etc. back to that page, but, of course, you'd be telling Google that the page had permanently moved, which isn't quite the case.
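And the 301 variant, again only as a sketch - the area URL structure and the `sold=1` query flag that would trigger the "this property has sold" notice are made up for the example:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping of sold listing IDs to their area pages.
SOLD_LISTINGS = {42: "springfield"}


@app.route("/listing/<int:listing_id>")
def listing(listing_id):
    area_slug = SOLD_LISTINGS.get(listing_id)
    if area_slug is not None:
        # Permanent redirect to the area overview; the query flag lets that
        # page show a "this property has sold, but others are available" note.
        return redirect(f"/area/{area_slug}?sold=1", code=301)
    return "Normal listing page would be served here", 200
```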
-
The answer from Kristen is correct. However, changing 404 to 410 will just make these pages appear as 410s in Search Console. The fact that they appear is not a problem - it's just Google notifying you that pages return a 4xx status. If this is intended (as in your case) you can simply ignore these messages and mark them as fixed.
In your case you could also consider another option - remove the pages from the listings but keep them published (with status 200). Update the page to indicate that the original property is sold, but list some other (similar) properties as an alternative. This way, if there are external pages linking to the property page, the link value doesn't get lost, and if people accidentally land on the page they still find content that could be interesting to them. (As you remove the navigation links to these pages they become orphans, so there is little chance that they will rank very high in Google.)
Dirk
Related Questions
-
Where does Movie Theater schema markup code live?
What I am trying to accomplish: I want what AMC has. When searching Google for a movie at AMC near me, Google loads the movie times right at the top of the first page. When you click a movie time, it links to a pop-up window that gives you the option to purchase from MovieTickets.com, Fandango or AMC.com. Info about my theater: my theater hosts theater info and movie time info on its website. Once you click the time you want, it takes you to a third-party ticket fulfillment site via a subdomain that I have little control over. Currently Fandango tickets show up in Google like AMC's, but the option to buy on my theater's site does not. Questions: Generally, how do I accomplish this? Does the schema code get implemented on the third-party ticket purchasing site or on my site? How can I ensure that the Google pop-up occurs so that users have a choice to purchase via Fandango or on my theater's website?
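A hedged sketch of what showtime markup could look like, generated here in Python only for illustration: the theater, movie, date, URL and price are all invented, and schema.org's `ScreeningEvent` type is used on the assumption that the markup belongs on your own showtime pages rather than on the third-party checkout site. Note that whether Google actually shows the showtimes/ticketing panel may also depend on its ticketing partners and feeds, not on markup alone.

```python
import json

# Hypothetical data for a single screening; every value below is an example.
screening = {
    "@context": "https://schema.org",
    "@type": "ScreeningEvent",
    "name": "Example Movie - 7:30 PM",
    "startDate": "2016-06-01T19:30:00-05:00",
    "workPresented": {"@type": "Movie", "name": "Example Movie"},
    "location": {
        "@type": "MovieTheater",
        "name": "Example Cinema",
        "address": "123 Main St, Springfield",
    },
    "offers": {
        "@type": "Offer",
        "url": "https://tickets.example.com/checkout?showtime=123",
        "price": "11.50",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the result as JSON-LD in the showtime page's HTML.
print('<script type="application/ld+json">\n'
      + json.dumps(screening, indent=2)
      + "\n</script>")
```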
Intermediate & Advanced SEO | ColeBField
Site's pages have GA codes via Tag Manager, but Screaming Frog does not recognize them
Using Tag Assistant (a Google Chrome add-on), we have found that the site's pages have GA codes (see screenshot 1). However, when we used Screaming Frog's filter feature -- Configuration > Custom > Search > Contain/Does Not Contain (see screenshot 2) -- SF displays several URLs (maybe all) of the site under 'Does Not Contain', which means that in SF's crawl the site's pages have no GA code (see screenshot 3). What could be the reason SF states that there is no GA code on the site's pages when, according to Tag Assistant/Tag Manager, there is? Please give us steps on how to fix this issue. Thanks!
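One thing worth checking, as a rough sketch with a made-up URL: Screaming Frog's custom search typically runs against the raw HTML source, so if Analytics is fired by Google Tag Manager, the raw source only contains the GTM container snippet, not the GA tracking code that the container injects in the browser. Searching for the GTM reference instead, or enabling JavaScript rendering in SF, usually explains the mismatch.

```python
from urllib.request import urlopen

# Hypothetical page; replace with one of the URLs Screaming Frog flagged.
html = urlopen("https://www.example.com/some-page/").read().decode("utf-8", "replace")

# What a non-rendering crawler actually sees in the source:
print("GTM container snippet present:", "googletagmanager.com/gtm.js" in html)
print("GA code present in raw source:",
      any(marker in html for marker in ("analytics.js", "gtag(", "ga(")))
```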
Intermediate & Advanced SEO | jayoliverwright
Is it a problem to use a 301 redirect to a 404 error page, instead of serving a 404 page directly?
We are building URLs dynamically with Apache rewrite. When we detect that a URL matches certain valid patterns, we serve a script which may then detect that the combination of parameters in the URL does not exist. If this happens we produce a 301 redirect to another URL which serves a 404 error page. So my doubt is the following: do I have to worry about not serving a 404 directly, but redirecting (301) to a 404 page instead? Will this lead to the erroneous original URL staying in the Google index longer than if I served a 404 directly? Some context: it is a site with about 200,000 web pages, and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).
Intermediate & Advanced SEO | lcourse
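For the question above: having the script that detects the bad parameter combination answer with the 404 itself avoids the extra hop entirely. A minimal sketch, with Flask used only as an illustration and a hypothetical validity check standing in for the real pattern logic:

```python
from flask import Flask

app = Flask(__name__)

# Hypothetical set of parameter combinations that actually exist.
VALID_COMBINATIONS = {("shoes", "red"), ("shoes", "blue")}


@app.route("/<category>/<variant>")
def dynamic_page(category, variant):
    if (category, variant) not in VALID_COMBINATIONS:
        # Answer the requested URL itself with a 404 body: no 301 hop,
        # so crawlers see the error status on the URL they asked for.
        return "<h1>Page not found</h1>", 404
    return f"<h1>{category} - {variant}</h1>"
```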
XML Sitemap & Bad Code
I've been creating sitemaps with XML Sitemap Generator and have been downloading them to edit on my PC. The sitemaps work fine when viewed in a browser, but when I download and open them in Dreamweaver, the URLs don't work when I cut and paste them into the Firefox URL bar. I notice the codes are different: for example, an "&" is produced like this: "&amp;". Extra characters are inserted, producing the error. I was wondering if this is normal, because as I said, the map works fine when viewed online.
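That is expected behaviour: inside XML, a literal "&" must be written as the entity "&amp;", and whatever reads the sitemap (a search engine or an XML parser) converts it back; it only breaks when the escaped form is pasted into a browser as-is. A small round-trip sketch with a made-up URL:

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap entry whose URL contains a query string with "&".
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "https://www.example.com/page?id=1&lang=en"

xml_text = ET.tostring(urlset, encoding="unicode")
print(xml_text)            # <loc> comes out as ...id=1&amp;lang=en (escaped, valid XML)

parsed = ET.fromstring(xml_text)
print(parsed[0][0].text)   # back to ...id=1&lang=en, the original browser-usable URL
```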
Intermediate & Advanced SEO | alrockn
What do you do with the page of a product that has been deleted?
As anyone who runs an ecommerce website knows, products are constantly being added and removed. Once products are removed, the corresponding product pages are no longer reachable. Currently, if a product page whose product has been deleted is requested, I redirect to the search page. I am not sure whether that is the correct, recommended technique from an SEO perspective. Should I try to show related products on the redirected page? Does anyone here know what the best thing to do with such a product page is?
Intermediate & Advanced SEO | amitramani
Should I literally delete all the articles I published in 2010/2011?
We became a charity in December and redirected everything from resistattack.com to resistattack.org. Both sites weren't up at the same time; we just switched over. However, GWT still shows the .com as a major backlinker to the .org. Why? More importantly, our site just got hit for the first time by an "unnatural link" penalty according to GWT. Our traffic dropped 70% overnight. This appeared shortly after a friend posted a sitewide link from his site that suddenly sent 10,000 links to us. I figured that was the problem, so I asked him to remove the links (he has) and submitted a reconsideration request. Two weeks later, Google refused, saying: "We've reviewed your site and we still see links to your site that violate our quality guidelines. Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes." We haven't done any "SEO link building" for two years now, but we used to publish a lot of articles to EzineArticles and iSnare back in 2010/2011. They were picked up and linked from hundreds of spammy sites of course, none of which we had anything to do with. They are still being taken and new backlinks created. I just downloaded the latest backlinks from GWT and it's a nightmare of crappy article sites. Should I delete everything from EZA/iSnare and close my account? Or just wait longer for the 10,000 links to be crawled and removed from my friend's site? What do I need to do about the spammy article sites? Disavow tool or just ignore them? Any other tips/tricks?
Intermediate & Advanced SEO | TellThemEverything
De-listed? What happened? No rankings??
I was ranking on the first page for over 80 terms, with over 25 in the top 3, and my main keyword was ranked #4... now my Twitter account has replaced my website at #4 in the rankings for the main keyword. I now have ZERO rankings for any keywords for my website. I have no idea what happened. Is there any way for me to check with Google to see why I have been wiped off the map in the search rankings? Boo
Intermediate & Advanced SEO | Boodreaux
NOINDEX listing pages: Page 2, Page 3... etc?
Would it be beneficial to NOINDEX category listing pages except for the first page? For example, this site: http://flyawaysimulation.com/downloads/101/fsx-missions/ has lots of pages such as Page 2, Page 3, Page 4, etc.: http://www.google.com/search?q=site%3Aflyawaysimulation.com+fsx+missions Would there be any SEO benefit to NOINDEX on these pages? Of course, FOLLOW is the default, so links would still be followed and juice applied. Your thoughts and suggestions are much appreciated.
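If you do try it, the usual mechanism is a robots meta tag on page 2 and beyond. A small sketch, with the page-number handling assumed and no claim that noindexing paginated pages is the right choice for every site:

```python
def robots_meta(page_number: int) -> str:
    """Return the robots meta tag for a paginated category listing page."""
    if page_number <= 1:
        # First page of the category: leave it fully indexable.
        return '<meta name="robots" content="index, follow">'
    # Deeper pages: keep them out of the index but let links still be followed.
    return '<meta name="robots" content="noindex, follow">'

# Example: page 3 of a category listing.
print(robots_meta(3))
```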
Intermediate & Advanced SEO | Peter264