What is the best approach to handling 404 errors?
-
Hello all. I'm new here and working on the SEO for my site, www.shoottokyo.com. When I find 4xx (client) errors, what is the best way to deal with them? For example, I'm seeing an error like this: http://shoottokyo.com/2010/11/28/technology-and-karma/. This may have been caused when I updated my permalinks from shoottokyo.com/2011/09/postname to shoottokyo.com/postname. I was using the plugin "Permalinks Moved Permanently" to fix them.
Sometimes I find a URL like http://shoottokyo.com/a-very-long-week/www.newscafe.jp and can tell I simply have a bad link to News Cafe, so I can go to the post and correct it. But in the first case, I can't tell where the crawler even found the problem. I'm using WordPress. Is it best to just use a plugin like Redirection to redirect the remaining errors where I can't find the source of the issue?
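The permalink change described here is mechanical enough to express as a rewrite rule. A minimal sketch (a hypothetical helper, not the Permalinks Moved Permanently plugin's actual logic) that strips the old date-based prefix:

```python
import re

# Old permalinks looked like /YYYY/MM/post-name or /YYYY/MM/DD/post-name;
# the new structure is just /post-name. Hypothetical sketch of the mapping.
DATE_PREFIX = re.compile(r"^/\d{4}/\d{2}(?:/\d{2})?/")

def old_to_new(path: str) -> str:
    """Strip the leading date segments, e.g. /2010/11/28/technology-and-karma/ -> /technology-and-karma/."""
    return DATE_PREFIX.sub("/", path)
```

Paths without a date prefix pass through unchanged, so the rule is safe to apply to every incoming request.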
Thanks
Dave
-
Hi Dave
404 errors will happen on any website, and you usually don't have to worry about them unless they occur in alarmingly high numbers. You only need to worry about 301ing 404 pages when you are losing link juice through them.
I would use these three methods to find 404s on the site:
-
Like Chris mentioned, use Screaming Frog.
-
Use your analytics package and look for traffic landing on the 404 page.
-
Use Google and Bing Webmaster Tools and review the 404 warnings (in the Crawl Errors area).
From here, you would want to 301 all valid 404 pages to the closest-matching pages that visitors will find useful.
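Alongside the three methods above, the raw access log is often the fastest source: every 404 hit records both the URL and the referrer that linked to it, which answers the "where did the crawler even find this?" question. A minimal sketch, assuming an Apache/Nginx combined-format log (the field layout is an assumption about your server's configuration):

```python
import re
from collections import Counter

# Pull 404 hits (and their referrers) out of a combined-format access log,
# so you can see which URLs 404 most often and which pages link to them.
LOG_LINE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)"'
)

def find_404s(log_lines):
    """Return a Counter of (path, referrer) pairs that returned 404."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "404":
            hits[(m.group("path"), m.group("referrer"))] += 1
    return hits
```

Sorting the counter by frequency gives a priority list: a 404 hit thousands of times with a real referrer is worth a 301; a one-off hit from "-" usually isn't.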
-
-
I haven't used that one but I just read up on it. It looks good.
-
Thanks for the fast response, Chris. Is the best approach to 301 them using a plugin like Redirection? Is there a better approach, or are there downsides to using a plugin to handle this?
-
Dave, you can use a tool like Screaming Frog or Xenu's Link Sleuth to find links pointing to the 404 pages. You can leave the pages to 404 unless you can see in your stats that search was sending traffic to those pages, or you have external links pointing to them; in that case you'll want to 301 them.
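Conceptually, a redirect plugin like Redirection just maintains a lookup table of old-to-new URLs and answers matching requests with a 301. A minimal sketch of that idea (the mapping shown is hypothetical):

```python
# Old URL -> new URL table, the core of what a redirect plugin manages.
# The entries here are illustrative, not a real site's mapping.
REDIRECTS = {
    "/2010/11/28/technology-and-karma/": "/technology-and-karma/",
}

def resolve(path):
    """Return (status, location) for a request path: 301 if we have a mapping, else 404."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None
```

The plugin's value is mostly the UI for maintaining that table and logging hits; the downside is one extra database lookup per request, which is rarely a concern at a handful of redirects.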
Related Questions
-
Best Metrics but Consistently Outranked
I am hoping someone could help us determine why we generally rank quite poorly compared to our competition, despite leading in every single competitive metric. We get outranked on a term where the Page Grade gives us an "A", and we best the competitor on each of the metrics. Where would those with more experience suggest we start looking? rank.jpg
Moz Pro | Yardboy
Is my 404 page set up correctly?
Hi there! I'm using a few different tools, and I have received varied results concerning my 404 page. WebSite Auditor from Link-Assistant says my page is set up incorrectly, but my webmaster has told me that it is set up correctly, and at this point I'm not entirely sure what's what. Is there another way to test it? I guess I'm not quite sure about the jargon and what "return a 404 response code" actually means. Thanks for your help!
Moz Pro | NutcrackerBalletGifts
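On the jargon: "return a 404 response code" means the HTTP status line itself must say 404, not just the visible page. A page that displays "not found" but ships with status 200 is a "soft 404", which is what auditing tools flag as misconfigured. A minimal sketch of the distinction (the classification rule here is an illustration, not any particular tool's actual check):

```python
def classify_response(status, body):
    """Distinguish a real 404 from a soft 404 (a not-found page served with 200 OK)."""
    if status == 404:
        return "hard 404"   # correct: the status line reports the error
    if status == 200 and "not found" in body.lower():
        return "soft 404"   # misconfigured: error text, but a success status
    return "ok"
```

You can verify your own page with any tool that shows raw response headers (browser dev tools, or `curl -I` on a deliberately missing URL) and check the first line of the response.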
Can someone kindly explain what 'Crawl Issue Found: No rel="canonical" Tags' means? Is this a critical error, and how can it be rectified?
Moz Pro | JoshMcLean
Issues with Moz producing 404 Errors from sitemap.xml files recently.
My last campaign crawl produced over 4k 404 errors resulting from Moz not being able to read some of the URLs in our sitemap.xml file. This is the first time we've seen this error, and we've been running campaigns for almost two months now; no changes were made to the sitemap.xml file. The file isn't UTF-8 encoded, but rather Content-Type: text/xml; charset=iso-8859-1 (which is what Movable Type uses). Just wondering if anyone has had a similar issue?
Moz Pro | BriceSMG
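One quick sanity check for the encoding theory: a standards-compliant XML parser honours the charset declared in the XML prolog, so if the file parses cleanly from raw bytes, the URLs themselves are readable and the problem lies elsewhere. A minimal sketch that parses the raw sitemap bytes and extracts the <loc> URLs:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(raw_bytes):
    """Return the <loc> URLs from sitemap bytes, honouring the declared encoding."""
    # Parsing bytes (not a decoded string) lets ElementTree respect the
    # encoding given in the XML declaration, e.g. iso-8859-1.
    root = ET.fromstring(raw_bytes)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]
```

Note that the sitemaps.org protocol itself asks for UTF-8, so even if the file parses fine, re-encoding it to UTF-8 may be the safer long-term fix.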
Campaign report errors
One of the heavily noted errors for the first crawl of our domain was duplicate titles. I did not see a list of the pages and their current titles, but I am pretty sure it is somewhere to be found. That would help focus the work to do. Am I wrong about that?
Moz Pro | Jacog
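If the report doesn't list the offending pages, the grouping is easy to rebuild from any crawl export of (URL, title) pairs; the data shape here is an assumption about what your crawler exports:

```python
def duplicate_titles(pages):
    """Given (url, title) pairs, return {title: [urls]} for titles used by more than one page."""
    by_title = {}
    for url, title in pages:
        by_title.setdefault(title, []).append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}
```

Feeding this a Screaming Frog or similar export gives exactly the worklist the question asks for: each duplicated title with every page that uses it.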
What is the best method to solve duplicate page content?
The issue I am having is that an overwhelmingly large number of pages on cafecartel.com show duplicate page content. But when I check the errors in SEOmoz, it shows that the duplicate content is from www.cafecartel.com, not cafecartel.com. So first of all, does this mean that there are two sites? And is this a problem I can fix easily (i.e., by redirecting the URL and deleting the extra pages)? Is this going to make all other SEO useless, given that it shows nearly every page has duplicate page content? Or am I just completely reading the data wrong?
Moz Pro | MarkP_
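On the www versus non-www question: they are technically two hostnames serving the same pages, so crawlers can index everything twice. The usual fix is a site-wide 301 to one canonical host, not deleting pages. A minimal sketch of the normalisation (choosing www as canonical here is just an illustration; either choice works if applied consistently):

```python
from urllib.parse import urlsplit, urlunsplit

# Pick ONE host as canonical and 301 everything else to it.
CANONICAL_HOST = "www.cafecartel.com"

def canonical_url(url):
    """Return the URL to 301 to, or None if the URL is already on the canonical host."""
    parts = urlsplit(url)
    if parts.netloc == CANONICAL_HOST:
        return None
    return urlunsplit((parts.scheme, CANONICAL_HOST, parts.path, parts.query, parts.fragment))
```

In practice this is a single rewrite rule in the web server (or the "preferred domain" setting in Webmaster Tools) rather than application code, but the logic is the same.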
Handling long URLs and overly-dynamic URLs on eCommerce site
Hello Forum, I've been optimizing an eCommerce site, and our SEOmoz crawls are favorable for the most part, except for long URLs and overly-dynamic URLs. These issues stem from two URL types: layered navigation (faceted search) and non-Google internal search results. I outline the issues for each below. We use an SEO-friendly URL structure for our product category pages, but once bots start "clicking" our layered navigation options, all the parameters are appended to our SEO-friendly URLs, causing the SEOmoz crawl warnings.
Layered navigation:
SEO-friendly category page: oursite.com/shop/meditation-cushions.html. Effects of layered navigation: oursite.com/shop/meditation-cushions.html?bolster_material_quality=414&bolsters_appearance=206&color=12&dir=asc&height=291&order=name. As you can see, the parameters include product attributes and page sorts. I should note that all pages generated by these parameters use the canonical element to point back to the SEO-friendly URL. We have also set up Google's Webmaster Tools to handle these parameters.
Internal search function:
Our URLs start off simple: oursite.com/catalogsearch/result/?q=brown. Then the bot clicks all the layered navigation options, yielding oursite.com/catalogsearch/result/index/?appearance=54&cat=67&clothing_material=83&color=12&product_color=559&q=brown. Also, all search results are set to noindex,follow. My question is: should we worry about these overly-dynamic and long URL warnings? We have set up canonical elements, "noindex,follow" solutions, and configured Webmaster Tools to handle our parameters. If these are a concern, how would you resolve them?
Moz Pro | pano
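The canonical tag already points the facet URLs back at the clean category page; the same normalisation can be expressed directly, which is handy for auditing which crawled URLs collapse to which canonical page. A minimal sketch (the parameter names come from the examples in the question; treat the list as an assumption about the site):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Layered-navigation and sort parameters that should not create new URLs.
FACET_PARAMS = {
    "bolster_material_quality", "bolsters_appearance", "color", "dir",
    "height", "order", "appearance", "cat", "clothing_material", "product_color",
}

def canonicalize(url):
    """Drop facet/sort parameters so every filtered URL maps to its clean base URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), parts.fragment))
```

Running a crawl export through this and counting distinct canonical URLs shows how much of the "overly-dynamic" warning volume is really just the same few category pages.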
Dismiss crawl diagnostics error
Hello everyone, Is there a way to dismiss some errors in the Crawl Diagnostics tool so they don't appear again? Some errors are never going to be fixed because of their nature. For example, 'Title too long' errors that point to some of the threads on my forum: it doesn't make sense to change the title of a thread posted by a user just for the sake of making the error disappear from the Crawl Diagnostics tool. 🙂 Otherwise the CD interface gets a little cluttered with errors I will never fix anyway. I wonder how others deal with this problem. Thanks.
Moz Pro | MaratM