Managing 404 errors
-
What is the best way to manage 404 errors for pages that are no longer on the server?
For example, a client deletes the old site from the server and replaces it with a new site. Webmaster Tools is reporting 100+ 404 errors from the old site. I've blocked the 404 pages with robots.txt, requested removal in Google Webmaster Tools, and created a custom 404 page - http://www.tvsphoto.com/missingurlexample
Is there anything else I can do?
-
Thanks!
I've got one in place
Example:
I'm fairly sure it's set up correctly
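One quick sanity check (a sketch, not something mentioned in the thread) is to confirm the custom page actually returns a 404 status code rather than a 200, since a 200 would make Google treat it as a "soft 404". A minimal Python check:

```python
import urllib.request
import urllib.error

def fetch_status(url):
    """Return the HTTP status code for a URL, including error statuses."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def is_hard_404(status):
    # The custom error page should still send 404, not 200 ("soft 404").
    return status == 404

# Usage (makes a network call):
# is_hard_404(fetch_status("http://www.tvsphoto.com/missingurlexample"))
```

If this returns False for a clearly missing URL, the server is masking the error behind a 200.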
-
If possible you can list pages that have similar URLs on your 404 page. Some CMSs can help you do this. WordPress certainly comes to mind.
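One way to generate those suggestions yourself (a sketch; the path list below is hypothetical) is to fuzzy-match the requested path against your live URLs, for example with Python's difflib:

```python
import difflib

# Hypothetical list of live paths on the new site.
SITE_PATHS = ["/portfolio", "/contact", "/weddings", "/about"]

def suggest_paths(requested_path, n=3, cutoff=0.5):
    """Return up to n live paths that closely resemble the 404'd path."""
    return difflib.get_close_matches(requested_path, SITE_PATHS, n=n, cutoff=cutoff)

print(suggest_paths("/portfollio"))  # → ['/portfolio']
```

A CMS plugin does the same thing under the hood: score each live URL against the missing one and surface the closest hits on the 404 page.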
-
Also, be sure to have a user-friendly 404 page. 404s are unavoidable due to typos, silliness, and random acts of God, so it's always wise to have a highly functional page as a catchall for anything you can't 301 redirect.
Examples
http://www.apple.com/gljasdlj
http://pages.ebay.com/gljasdlj
http://www.cnn.com/gljasdlj
-
Thanks again Barry, very helpful!
-
301 redirect them to their new page location.
EDIT: To clarify, there are probably some links coming in to those pages, or there are new page equivalents that could better serve customers.
If there's definitely no match then I'd still consider redirecting them to the home page (or even a custom landing page, rather than the custom 404 page) to preserve as much link juice as possible.
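On Apache, path-to-path 301s like that can be declared in .htaccess with mod_alias; the paths below are hypothetical placeholders, not the asker's real URLs:

```apache
# 301 old pages to their closest new equivalents (hypothetical paths).
Redirect 301 /old-gallery /portfolio
Redirect 301 /old-contact /contact
# Anything left under the old section goes to the home page.
RedirectMatch 301 ^/old-site/ /
```

The explicit `Redirect` lines preserve page-level link equity; the `RedirectMatch` catch-all is the fallback for URLs with no real match.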
Related Questions
-
Hreflang Tags with Errors in Google Webmaster Tools
Hello, Google Webmaster Tools is giving me errors with hreflang tags that I can't seem to figure out... I've double-checked everything: all the alternate and canonical tags, everything seems to match, yet Google finds errors. Can anyone help? The report says: International Targeting | Language > 'fr' - no return tags: URLs for your site and alternate URLs in 'fr' that do not have return tags. Status: 7/10/15, 24 hreflang tags with errors. Please see attached pictures for more info... Thanks, Karim
Intermediate & Advanced SEO | GlobeCar
-
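For context on the "no return tags" error above: hreflang annotations must be reciprocal, i.e. if the English page lists the French one, the French page must list the English one back. A correct pair looks like this (hypothetical URLs):

```html
<!-- On https://example.com/en/page -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page" />

<!-- On https://example.com/fr/page: the required "return tags" -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page" />
```

If the 'fr' pages are missing that second block, Google reports exactly this error even when everything else matches.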
Weird 404 URL Problem - domain name being placed at end of URLs
Hey there. For some reason when doing crawl tests I'm finding pages with the domain name being tacked onto the end, causing 404 errors. For example: http://domainname.com/page-name/http://domainname.com This is happening to all pages, posts, and even category pages. 1. Site is in WordPress. 2. Using the Yoast SEO plugin. Any suggestions? Thanks!
Intermediate & Advanced SEO | Jay328
-
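One common cause of that URL pattern (an assumption here, not confirmed in the question) is a theme or plugin emitting an href without its scheme, e.g. `href="domainname.com"`; crawlers then resolve it relative to the current page. Python's urljoin shows the resolution:

```python
from urllib.parse import urljoin

base = "http://domainname.com/page-name/"

# A scheme-less href is treated as a relative path and appended to the page URL.
print(urljoin(base, "domainname.com"))
# A fully qualified href resolves to itself, as intended.
print(urljoin(base, "http://domainname.com"))
```

Viewing the rendered page source and searching for hrefs missing `http://` is a quick way to confirm or rule this out.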
New domain purchase: 301 and 404 issues. Please help!
We recently purchased www.carwow.com and 301 redirected the site to www.carwow.co.uk (our main domain). The problem is that carwow.com had URLs indexed like www.carwow.com/a-b-c; the 301 sends them to www.carwow.co.uk/a-b-c, which obviously doesn't exist, so it's a 404! What should be done in this situation? Should it be ignored and not redirected at all, or is there a way to delete/disavow these dead pages? An SEO has advised we redirect all pages to the homepage, but won't that mess up the link profile? Any advice would be great!
Intermediate & Advanced SEO | JamesPursey
-
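One way to triage before redirecting (a sketch under assumed data; the paths are made up for illustration): map each old indexed path onto the new domain and separate the ones with a real counterpart from the ones without, so only genuine matches get path-to-path 301s:

```python
def map_old_to_new(old_paths, valid_new_paths):
    """Split old URLs into 301 candidates (real counterpart exists) and orphans."""
    redirects, orphans = {}, []
    for path in old_paths:
        if path in valid_new_paths:
            redirects[path] = "https://www.carwow.co.uk" + path
        else:
            orphans.append(path)  # candidates for 410 Gone or a custom landing page
    return redirects, orphans

redirects, orphans = map_old_to_new(["/a-b-c", "/deals"], {"/deals"})
print(redirects, orphans)
```

The orphan list is where the judgment call in the question lives: 410 them, or redirect them to a landing page rather than blanket-redirecting everything to the homepage.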
Best way to fix 404 crawl errors caused by private blog posts in WordPress?
Going over the Moz crawl error report and WMT's crawl errors for a new client site, I found 44 high-priority crawl errors = 404 Not Found. I found that those 44 blog pages were set to Private mode (WordPress theme), causing the 404 issue. I was reviewing the blog content for those 44 pages to see why those 2010 blog posts were set to Private mode. Well, I noticed that all those 44 blog posts were pretty much copied from other external blog posts. So I'm thinking the previous agency placed those pages under Private mode to avoid getting hit for duplicate content issues. All other blog posts posted after 2011 looked like unique, non-scraped content. So my question to all is: what is the best way to fix the issue caused by these 44 pages? A. Remove those 44 blog posts that used verbatim scraped content from other external blogs. B. Update the content on each of those 44 blog posts, then set them to Public mode instead of Private. C. ? (open to recommendations) I didn't find any external links pointing to any of those 44 blog pages, so I was considering removing those blog posts. However, I'm not sure if that will affect the site in any way. Open to recommendations before making a decision... Thanks
Intermediate & Advanced SEO | SEOEND
-
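If option A (removal) is chosen, one follow-up worth considering (my suggestion, not advice from the thread) is answering those URLs with 410 Gone, which tells crawlers the removal is deliberate and tends to clear them from error reports faster than a plain 404. In Apache .htaccess that's one mod_alias line per path (hypothetical paths shown):

```apache
# 410 Gone for deliberately removed posts (hypothetical paths).
Redirect gone /2010/01/scraped-post-one/
Redirect gone /2010/02/scraped-post-two/
```

With no external links pointing at the 44 posts, there is no link equity to preserve, which makes 410 a low-risk choice here.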
404? 301? What is your opinion?
Hi, I have a classifieds website and I am wondering about the life of a page with an ad. An announcement has a limited life, so: serve a 404 page? A 301 redirect to the section? Leave the content without redirection? What is your opinion? Sorry for my English, I'm French 😉 Thanks. A.
Intermediate & Advanced SEO | android_lyon
-
Www vs. non-www differences in crawl errors in Webmaster Tools...
Hey all, I have been working on an eCommerce site for a while that, to no avail, continues to make me want to hang myself. To make things worse, the developers just do not understand SEO, and it seems every change they make just messes up work we've already done. Job security, I guess. Anywho, most recently we realized they had some major sitemap issues, as almost 3,000 pages were submitted but only 20 or so were indexed. Well, they updated the sitemap, and although all the pages are properly indexing, I now have 5,000+ "not found" crawl errors in the non-www version of WMT and almost none in the www version of the WMT account. Anyone have insight as to why this would be?
Intermediate & Advanced SEO | RossFruin
-
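A common first step for www/non-www splits like this (a generic sketch with a placeholder domain, not site-specific advice) is to 301 one hostname to the other so crawl data consolidates under a single preferred version:

```apache
# Force the www hostname (placeholder domain) via mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Until one hostname redirects to the other, Google treats them as two separate sites, which is why the two WMT properties can report such different error counts.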
Anyone managed to decrease the "not selected" graph in WMT?
Hi Mozzers. I am working with a very large e-commerce site that has a big issue with duplicate or near-duplicate content. The site actually received a message in WMT listing out pages that Google deemed it should not be crawling. Many of these were the usual pagination / category sorting option URL issues, etc. We have since fixed the issue with a combination of site changes, robots.txt, parameter handling, and URL removals; however, I was expecting the "not selected" graph in WMT to start dropping. The number of roboted pages has increased by around 1 million pages (which was expected), and indexed pages have actually increased despite removing hundreds of thousands of pages. I assume this is due to releasing some crawl bandwidth for more important pages like products. I guess my question is two-fold: 1. Is the "not selected" graph cumulative, as this would explain why it isn't dropping? 2. Has anyone managed to get this figure to significantly drop? Should I even care? I am relating this to Panda, by the way. Important to note that the changes were made around 3 weeks ago, and I am aware not everything will be re-crawled yet. Thanks,
Chris
Intermediate & Advanced SEO | Further
-
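For the parameter-driven duplicates mentioned above, the robots.txt side of such a fix often looks like this (hypothetical parameter names; Googlebot supports the `*` wildcard in Disallow rules):

```
# Hypothetical sort/pagination parameters blocked from crawling.
User-agent: *
Disallow: /*?sort=
Disallow: /*?page=
```

Note that roboted URLs are excluded from crawling, not necessarily from the index, which is one reason "not selected" and indexed-page counts can move in unexpected directions after a change like this.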
Generating 404 Errors but the Pages Exist
Hey, I have recently come across an issue with several of a site's URLs being seen as a 404 by bots such as Xenu, SEOmoz, Google Webmaster Tools, etc. The funny thing is, the pages exist and display fine. This happens on many of the pages which use the MODX CMS, but the index is fine. The WordPress blog in /blog/ all works fine. The only thing I can think of is that I have a conflict in the .htaccess, but troubleshooting this is difficult; any tool I have found online seems useless. Have tried to roll back to previous versions but it still does not work. Anyone had any experience of similar issues? Many thanks, K.
Intermediate & Advanced SEO | Found