Managing 404 errors
-
What is the best way to manage 404 errors for pages that are no longer on the server?
For example, a client deletes the old site from the server and replaces it with a new site. Webmaster Tools is reporting 100+ 404 errors from the old site. I've blocked the 404 pages with robots.txt, requested removal in Google Webmaster Tools, and created a custom 404 page - http://www.tvsphoto.com/missingurlexample
Is there anything else I can do?
-
Thanks!
I've got one in place.
Example:
I'm fairly sure it's set up correctly.
-
If possible you can list pages that have similar URLs on your 404 page. Some CMSs can help you do this. WordPress certainly comes to mind.
-
Also, be sure to have a user-friendly 404 page. 404s are unavoidable due to typos, silliness, and random acts of God, so it's always wise to have a highly functional page as a catchall for anything that you can't 301 redirect.
Examples:
http://www.apple.com/gljasdlj
http://pages.ebay.com/gljasdlj
http://www.cnn.com/gljasdlj
-
Thanks again Barry, very helpful!
-
301 redirect them to their new page location.
EDIT: To clarify, there are probably some links coming in to those pages, or there are new page equivalents that could better serve customers.
If there's definitely no match then I'd still consider redirecting them to the home page (or even a custom landing page, rather than the custom 404 page) to preserve as much link juice as possible.
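That redirect logic can be sketched as a simple lookup: exact matches 301 to their new equivalents, and anything unmatched falls through to the chosen landing page. A minimal sketch - the URL mappings below are made up for illustration, not taken from the site in question:

```python
# Hypothetical mapping from old-site paths to their new-site equivalents.
redirect_map = {
    "/old-gallery.html": "/portfolio",
    "/old-contact.html": "/contact",
}

def resolve(path, redirect_map, fallback="/"):
    """Return the (status, location) the server should answer with."""
    if path in redirect_map:
        # Exact match: permanent redirect to the new equivalent page.
        return (301, redirect_map[path])
    # No close match: 301 to the homepage/landing page to preserve link equity.
    return (301, fallback)
```

In practice this mapping would live in your server config (e.g. rewrite rules) rather than application code, but the decision tree is the same.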
Related Questions
-
Google robots.txt test - not picking up syntax errors?
Intermediate & Advanced SEO | McTaggart
I just ran a robots.txt file through "Google robots.txt Tester" as there was some unusual syntax in the file that didn't make any sense to me... e.g. /url/?*
/url/?
/url/* and so on. I would use ? and not ? for example and what is ? for! - etc. Yet "Google robots.txt Tester" did not highlight the issues... I then fed the sitemap through http://www.searchenginepromotionhelp.com/m/robots-text-tester/robots-checker.php and that tool actually picked up my concerns. Can anybody explain why Google didn't - or perhaps it isn't supposed to pick up such errors? Thanks, Luke
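For reference on those patterns: in robots.txt, `*` matches any sequence of characters and `$` anchors the end of a URL, while `?` has no special meaning and just matches a literal question mark - so `/url/?*` is valid syntax, which may be why Google's tester raised no error. A rough sketch of that matching via a regex translation (simplified, not Google's exact implementation):

```python
import re

def robots_pattern_matches(pattern, path):
    """Approximate robots.txt Disallow matching: '*' is a wildcard,
    '$' anchors the end, everything else (including '?') is literal."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"       # wildcard: any run of characters
        elif ch == "$":
            regex += "$"        # anchor: URL must end here
        else:
            regex += re.escape(ch)  # '?' and friends are literal
    # Rules are prefix matches unless anchored with '$'.
    return re.match(regex, path) is not None
```

So `/url/?*` blocks any path starting with `/url/?`, e.g. `/url/?page=2`.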
Soft 404 error for a big, longstanding 301-redirected page
Intermediate & Advanced SEO | Eric_R
Hi everyone, Years ago, we acquired a website that had essentially 2 prominent homepages - one was like example.com and the other like example.com/htm... They served the same purpose basically, and were both very powerful, like PR7, and often had double listings for important search phrases in Google. Both pages had amassed considerable powerful links to them. About 4 years ago, we decided to 301 redirect the example.com/htm page to our homepage to clean up the user experience on our site and also, we hoped, to make one even stronger page in SERPs, rather than two less strong pages. Suddenly, in the past couple weeks, this example.com/htm 301-ed page started appearing in our Google Search Console as a soft 404 error. We've never had a soft 404 error before now. I tried marking this as resolved, to see if the error would return or if it was just some kind of temporary blip. The error did return. So my questions are:
1. Why would this be happening after all this time?
2. Is this soft 404 error a signal from Google that we are no longer getting any benefit from link juice funneled to our existing homepage through the example.com/htm 301 redirect? The example.com/htm page still has considerable (albeit old) links pointing to it across the web. We're trying to make sense of this soft 404 observation and any insight would be greatly appreciated. Thanks!
Eric
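A soft 404, in Google's terminology, is a URL that doesn't return a real 404/410 status but whose content (or redirect destination) looks like an error or an unrelated page - Google can flag even a 301 as a soft 404 when the redirect target isn't a close match for the old URL, which is common with redirects to a homepage. A minimal sketch of the distinction (a hypothetical classifier, not Google's actual algorithm):

```python
def classify_response(status_code, looks_like_error_page):
    """Rough illustration of hard vs soft 404s as crawlers see them."""
    if status_code in (404, 410):
        return "hard 404"   # real error status: dropped from the index
    if status_code == 200 and looks_like_error_page:
        return "soft 404"   # 200 OK, but the content reads as an error
    if 300 <= status_code < 400:
        return "redirect"   # e.g. a 301; flagged soft 404 if target is unrelated
    return "ok"
```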
Google Seeing 301 as 404
Intermediate & Advanced SEO | HB17
Hi all, We recently migrated a few small sites into one larger site and generally we had no problems. We read a lot of blogs beforehand, 301'd the old links etc, and we've been keeping an eye on any 404s. What we have found is that Webmaster is picking up quite a few 404s, yet when we investigate these 404s they are 301'd and work fine. This isn't for every URL, but Google is finding more and I just want to catch any problems before they get out of hand. Is there any reason why Google would count a 301 as a 404? Thanks!
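One common culprit in cases like this is a redirect chain that ends badly: the old URL 301s correctly, but a later hop serves a 404, and the crawl report attributes the 404 back to the starting URL. A hypothetical checker over a redirect map (the paths below are invented for illustration):

```python
# Map of path -> redirect target; None means that path serves a 404.
redirects = {
    "/old-page": "/interim-page",   # hypothetical migrated URL
    "/interim-page": "/new-page",   # a second hop in the chain
    "/retired": None,               # this path no longer exists
}

def final_status(path, redirects, max_hops=10):
    """Follow redirect hops and report what a crawler ultimately sees."""
    hops = 0
    while path in redirects and hops < max_hops:
        target = redirects[path]
        if target is None:
            return "404"            # chain dead-ends in a missing page
        path = target
        hops += 1
    return "200 at " + path         # landed on a real page
```

Auditing each reported URL hop by hop (curl -I works too) usually reveals whether the 404 is really on the old URL or somewhere further down the chain.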
Sitemap error
Intermediate & Advanced SEO | edward-may
Hey guys, every time I run the tester through Google Webmaster Tools I keep getting an error that tells me "Your Sitemap appears to be an HTML page. Please use a supported sitemap format instead." Any idea how to go about fixing this without changing the site around? https://www.zenory.co.nz/sitemap I have seen competitors' sitemaps that look similar to mine. Cheers
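That error usually means the submitted URL is serving a human-readable HTML sitemap page rather than an XML sitemap file; submitting a separate XML version typically resolves it. For reference, a minimal sketch of generating a valid XML sitemap with Python's standard library (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# Required namespace from the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return an XML sitemap string for the given list of page URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://www.example.com/", "https://www.example.com/about"])
```

The HTML sitemap page can stay for visitors; the XML file just needs its own URL (commonly /sitemap.xml) to submit to Google.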
Webmaster Tools "Not found" errors after sitemap update
Intermediate & Advanced SEO | McTaggart
Hello Mozzers - I found a sitemap with loads of URL errors on it (none of the URLs on the sitemap actually existed), so I went ahead and updated the sitemap - now I'm seeing a spike in "not found" errors in WMT. Is this normal / anything to worry about when you significantly change a sitemap? I've never replaced every URL on a sitemap before! L
404 for duplicate content?
Intermediate & Advanced SEO | waltergah
Sorry, I think this is my third question today... But I have a lot of duplicated content on my site. I use Joomla, so there's a lot of unintentional duplication. For example, www.mysite.com/index.php exists, etc. Up till now, I thought I had to 301 redirect or rel=canonical these "duplicated pages." However, can I just 404 it? Is there anything wrong with this practice in regards to SEO?
Question regarding error URL while checking in the Open Site Explorer tool
Intermediate & Advanced SEO | zco_seo
Hello friends, My website home URL & inner page URL show an error while checking in the Open Site Explorer tool from SEOMoz, for a website e.g. www.abc.com:
"Oh Hey! It looks like that URL redirects to www.abc.com/error.aspx?aspxerrorpath=/default.aspx. Would you like to see data for that URL instead?"
May I know the reason why this URL is showing this result while checking the backlink report from the tool? May I know on what basis this tool is evaluating the website URL? May I know, will this affect the Google SERPs for this website? Thanks
What's the best way to manage content that is shared on two sites and keep both sites in search results?
Intermediate & Advanced SEO | BostonWright
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?