Managing 404 errors
-
What is the best way to manage 404 errors for pages that are no longer on the server?
For example, a client deletes their old site from the server and replaces it with a new site. Webmaster Tools is reporting 100+ 404 errors from the old site. I've blocked the 404 pages with robots.txt, requested removal in Google Webmaster Tools, and created a custom 404 page - http://www.tvsphoto.com/missingurlexample
Is there anything else I can do?
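For reference, a robots.txt rule blocking a removed section looks something like this (the path is hypothetical; note that robots.txt only stops crawling - on its own it doesn't remove already-indexed URLs):

```
User-agent: *
Disallow: /old-site-directory/
```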
-
Thanks!
I've got one in place
Example:
I'm fairly sure it's set up correctly
-
If possible you can list pages that have similar URLs on your 404 page. Some CMSs can help you do this. WordPress certainly comes to mind.
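A rough sketch of how that suggestion logic can work, assuming you have a list of the site's live URLs to compare against (the paths below are made up for illustration):

```python
import difflib

# Hypothetical list of live site paths; in practice, pull these from
# your sitemap or ask your CMS for them.
live_paths = [
    "/portfolio/weddings",
    "/portfolio/portraits",
    "/about",
    "/contact",
    "/blog/lighting-tips",
]

def suggest_similar(requested_path, candidates, n=3, cutoff=0.4):
    """Return up to n live paths that look similar to the 404'd path."""
    return difflib.get_close_matches(requested_path, candidates, n=n, cutoff=cutoff)

# A typo'd URL still surfaces the page the visitor probably wanted.
print(suggest_similar("/portfolio/wedings", live_paths))
```

Rendering a short "Did you mean?" list like this on the 404 template keeps visitors on the site instead of bouncing.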
-
Also, be sure to have a user-friendly 404 page. 404s are unavoidable due to typos, silliness, and random acts of God, so it's always wise to have a highly functional page as a catchall for anything that you can't 301 redirect.
Examples
http://www.apple.com/gljasdlj
http://pages.ebay.com/gljasdlj
http://www.cnn.com/gljasdlj
-
Thanks again Barry, very helpful!
-
301 redirect them to their new page location.
EDIT: To clarify, there are probably some links coming in to those pages, or there are new page equivalents that could better serve customers.
If there's definitely no match then I'd still consider redirecting them to the home page (or even a custom landing page, rather than the custom 404 page) to preserve as much link juice as possible.
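In server terms, that advice boils down to a lookup table with a catch-all. A minimal sketch of the decision logic (the old and new paths are invented for illustration):

```python
# Map each old URL to its closest equivalent on the new site; anything
# unmapped falls through to the home page (or a custom landing page).
redirect_map = {
    "/old-gallery.html": "/portfolio",
    "/about-us.html": "/about",
}

def resolve_redirect(path, mapping, fallback="/"):
    """Return a (status, location) pair: always a 301, so link equity
    from inbound links is passed to the destination page."""
    return (301, mapping.get(path, fallback))
```

The same mapping translates directly into `Redirect 301` rules in an Apache .htaccess file or `return 301` blocks in nginx.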
Related Questions
-
What should I do after a failed request for validation (error with noindex, nofollow) in new Google Search Console?
Hi guys, We have the following situation: After an error message in the new Google Search Console for a large number of pages with a noindex, nofollow tag, a validation was requested before the problem was fixed (an incredibly stupid decision taken before asking the SEO team for advice). Google starts the validation, crawls 9 URLs, and changes the status to "Failed". All other URLs are still in "pending" status. The problem has been fixed for more than 10 days, but apparently Google doesn't crawl the pages and none of the URLs is back in the index. We tried pinging several pages and HTML sitemaps, but there is no result. Do you think we should request re-validation or wait more time? Is there something more we could do to speed up the process?
Intermediate & Advanced SEO | ParisChildress
-
SSL Cert error
Just implemented SSL with a wildcard cert and I got an email from Google that my non-www cert is not valid. Any ideas? SSL/TLS certificate does not include domain name https://electrictime.com/ To: Webmaster of https://electrictime.com/, Google has detected that the current SSL/TLS certificate used on https://electrictime.com/ does not include the https://electrictime.com/ domain name. This means that your website is not perceived as secure by some browsers. As a result, many web browsers will block users accessing your site by displaying a security warning message. This is done to protect users' browsing behavior from being intercepted by a third party, which can happen on sites that are not secure.
Intermediate & Advanced SEO | ThomasErb
-
Site's pages have GA code via Tag Manager, but Screaming Frog doesn't recognize it
Using Tag Assistant (a Google Chrome add-on), we have found that the site's pages have GA code (see screenshot 1). However, when we used Screaming Frog's filter feature - Configuration > Custom > Search > Contain/Does Not Contain (see screenshot 2) - SF displays several URLs (maybe all) of the site under 'Does Not Contain', which means that in SF's crawl the site's pages have no GA code (see screenshot 3). What could be the problem? Why does SF state that there is no GA code in the site's pages when, in fact, there is code according to Tag Assistant/Manager? Please give us steps/ways to fix this issue. Thanks!
Intermediate & Advanced SEO | jayoliverwright
-
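One possibility worth checking here (hedged, since the screenshots aren't visible): Screaming Frog's custom search runs against the raw HTML source by default, while Tag Assistant inspects the live page after JavaScript executes. If GA is deployed through Tag Manager, the raw source may contain only the GTM container loader, not the analytics snippet itself, so a search for the analytics code comes up empty. A sketch of a raw-source check along those lines (the marker patterns are common ones, not exhaustive):

```python
import re

# Common markers for GA/GTM in raw HTML source; adjust for your own setup.
GA_PATTERNS = [
    r"googletagmanager\.com/gtm\.js",        # GTM container loader
    r"google-analytics\.com/analytics\.js",  # Universal Analytics snippet
    r"UA-\d{4,10}-\d{1,4}",                  # UA property ID
]

def has_ga_marker(html):
    """True if the raw HTML contains any of the GA/GTM markers above."""
    return any(re.search(pattern, html) for pattern in GA_PATTERNS)
```

If the pages only carry the GTM loader, searching for `googletagmanager.com/gtm.js` in Screaming Frog (instead of the GA snippet) should flip those URLs into the 'Contains' bucket.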
Can't diagnose this 404 error
Hi Moz community, I have started receiving a load of 404 errors that look like this: This page: http://paulminors.com/blog/page/5/ is linking to: http://paulminors.com/category/podcast/paulminors.com which is a broken link. This is happening with a load of other pages as well. It seems that "paulminors.com" is being added to the end of the linking page's URL. I'm using WordPress and the SEO by Yoast plugin. I have searched for this link in the source of the linking page but can't find it, so I'm struggling to diagnose the problem. Does anyone have any ideas on what could be causing this? Thanks in advance, Paul
Intermediate & Advanced SEO | kevinliao
-
404 in Google cache for one of my blog posts
Hey Moz people, I'm getting a 404 when I use the cache: operator on this blog post: http://www.inscopix.com/blog/decoding-brain-initiative and I'm not able to see what's causing it. Can someone take a look and let me know if they see anything standing out? Thanks,
Intermediate & Advanced SEO | jacobfy
-
How to fix Invalid Product Page registering as Soft 404
Somehow, with our site architecture, Google is crawling URLs for products we no longer carry (there are no links to those pages, so I am still trying to figure out how Google is finding them). Those URLs are being redirected to our invalid product page. That invalid product page is returning a 200 OK code, but according to Google it should be a 404, so we get a soft 404 error. Google sees all of the URLs that redirect to that page as soft 404s as well. The first solution I can think of is to create a custom 404 page that looks just like our site, says we don't have the page/product they are looking for, has a search bar, sends a 404 code, etc. Is this the right way to go? It will probably take some time to implement, so is there a quick fix we could do first?
Intermediate & Advanced SEO | ntsupply
-
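The fix described in that question hinges on one detail: the friendly "we don't have that" page must be served with a real 404 status code, not a 200. A minimal sketch of the decision (function name and wording are hypothetical):

```python
def product_response(product_exists,
                     missing_body="Sorry, we no longer carry that product."):
    """Return a (status_code, body) pair for a product URL.

    Serving the helpful page with a genuine 404 status tells Google the
    URL is really gone, which clears the soft 404 report; a 200 status
    on a "not found" page is exactly what triggers soft 404s.
    """
    if product_exists:
        return (200, "product page")
    return (404, missing_body)
```

The page body can still look like the rest of the site and include a search bar; only the status line in the HTTP response needs to change.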
Significant Google crawl errors
We've got a site that, like clockwork, encounters server errors when Google crawls it. Since the end of last year it will go a week fine, then have two straight weeks of a 70%-100% error rate when Google tries to crawl it. During this time you can still put the URL in and go to the site, but spider simulators return a 404 error. Just this morning we had another error message; I did a fetch and resubmit, and magically now it's back. We changed servers in January to GoDaddy because the previous server (Tronics) kept getting hacked. It's built in HTML, so I'm wondering if it's something in the code maybe? http://www.campteam.com/
Intermediate & Advanced SEO | GregWalt
-
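One way to narrow a problem like that down is to request the page the way a crawler would and compare the status code against a normal browser fetch. A standard-library sketch (the user-agent string is Googlebot's published one; real Googlebot also verifies itself via reverse DNS, so this only approximates crawler traffic):

```python
import urllib.error
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def build_request(url, user_agent=GOOGLEBOT_UA):
    """Build a request carrying a Googlebot-style User-Agent header."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

def fetch_status(url, user_agent=GOOGLEBOT_UA):
    """Return the HTTP status code the server sends to that user agent."""
    try:
        with urllib.request.urlopen(build_request(url, user_agent),
                                    timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx responses still carry a status code
```

If `fetch_status(url)` with the Googlebot string differs from the same call with a browser user-agent, the server (or a security rule on it) is treating crawler traffic differently, which would explain errors that never show up in a normal browser.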
How to prevent 404s from a job board?
I have a new client with a job listing board on their site. I am getting a bunch of 404 errors as they delete filled jobs. Question: Should we leave the job pages up for extra content and entry points to the site and put up a notice like "this job has been filled, please search our other job listings"? Or should I noindex/nofollow these pages? Or any other suggestions? It is an employment agency site. Overall, what would be the best practice going forward? We are looking at probably 20 jobs/pages per month.
Intermediate & Advanced SEO | jlane9