Huge spike in 404s and 500 errors
-
I'm curious what might cause an inordinate number of 404s in the reporting from SEOmoz's dashboard.
I'm exploring links that are marked as 404s and they are (for the most part) working. I talked with the sysadmin and there were no outages this weekend. We also had a number of 500 errors reported in Webmaster Tools, but everything seems to be up.
Any ideas?
-
Maybe submit a support ticket to SEOmoz to see if the 404s might have been false positives.
-
The interesting thing is that SEOmoz reported a bunch of 404s that, for the most part, are not actually 404s, whereas Webmaster Tools is showing discontinued products, which makes sense. We didn't see any outages this weekend, so I'm a bit confused.
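One quick way to double-check the reports is to re-request the flagged URLs in bulk and record the status codes you actually get back. A minimal sketch, assuming the reported URLs have been exported into a list (the URLs below are placeholders):

```python
# Re-check a list of URLs that a crawler reported as 404 (illustrative sketch;
# the URL list below is a placeholder for an export of the reported errors).
import requests

reported_404s = [
    "https://www.example.com/some-page/",
    "https://www.example.com/another-page/",
]

for url in reported_404s:
    try:
        # HEAD is usually enough to read the status code without downloading the body.
        response = requests.head(url, allow_redirects=True, timeout=10)
        print(f"{response.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")
```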
-
If SEOmoz and Google Webmaster Tools are both reporting errors, I would guess there actually was a problem with your site.
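If there was a genuine problem, the server access logs should show it. A rough sketch that counts 404/500 responses per day from a standard combined-format access log (the log path and format are assumptions about the setup):

```python
# Count 404 and 500 responses per day in an Apache/nginx "combined"-format
# access log (sketch; the log path and format are assumptions about your setup).
import re
from collections import Counter

LOG_PATH = "access.log"
# e.g. 127.0.0.1 - - [10/Oct/2011:13:55:36 -0700] "GET /page HTTP/1.1" 404 1234 ...
LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):.*?\] ".*?" (\d{3}) ')

errors_per_day = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and match.group(2) in ("404", "500"):
            errors_per_day[(match.group(1), match.group(2))] += 1

for (day, status), count in sorted(errors_per_day.items()):
    print(day, status, count)
```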
Related Questions
-
404s clinging on in Search Console
What is a reasonable length of time to expect 404s to be resolved in Search Console? A mass of 404s built up from directory changes and filtering URLs. These have all been fixed, though of course some slipped the net. How long is it reasonable to expect old 404s that don't have any links pointing at them to drop out of Search Console? New 404s are still being reported over 4 months later, and 'First detected' always shows a date later than the date the 404s were fixed. Is this normal? I've never seen 404s be this resilient and fail to clear up like this. We manually fix these 404s and, like popcorn, more turn up. Just to add: the bulk of the 404s came into existence around a year ago and were left in place for around 8 months.
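One thing worth verifying is that the "fixed" URLs really do resolve now. A small sketch that follows redirects and reports the final status and any redirect chain (the URL list is a placeholder for an export of the URLs Search Console still flags):

```python
# Spot-check URLs that were "fixed": follow redirects and report the final
# status code and the redirect chain (sketch; the URL list is a placeholder
# for an export of the 404s still listed in Search Console).
import requests

fixed_urls = [
    "https://www.example.com/old-directory/page/",
]

for url in fixed_urls:
    response = requests.get(url, allow_redirects=True, timeout=10)
    chain = [step.url for step in response.history] + [response.url]
    print(response.status_code, " -> ".join(chain))
```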
Intermediate & Advanced SEO | MickEdwards
-
Huge organic drop following new site go live
Hi Guys, I am currently working on a site whose organic traffic suffered (and is still suffering) a huge drop, from a consistent 300-400 organic visits a day to almost zero. This happened as soon as the new site went live, and I am now digging to find out why. 301s were put in place (over 2,500 of them) and there are still over 1,100 outstanding after reviewing Search Console this morning. Having looked at the redirect file that was put in place when the new site went live, it all looks OK, apart from the fact that the redirects look like this: http://www.physiotherapystore.com/ to http://physiotherapystore.com/ The new URL is missing www. and I am concerned this is causing a large duplicate-content issue, as both the www. and non-www. versions work fine. Am I right to be concerned, or is this something not to worry about?
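A quick way to test that concern is to request both hostnames without following redirects: if both return 200, the site really is reachable at two hosts and the duplicate-content worry is justified; if one 301s to the other, a canonical host is already enforced. A minimal sketch using the URLs from the question:

```python
# Check how the www and non-www hostnames respond without following redirects.
# If both return 200, the same content is reachable at two hostnames; if one
# returns a 301 to the other, a canonical host is already enforced. (Sketch only.)
import requests

for url in ("http://www.physiotherapystore.com/", "http://physiotherapystore.com/"):
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(url, response.status_code, response.headers.get("Location", ""))
```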
Intermediate & Advanced SEO | HappyJackJr
-
Why the sudden increase in soft 404s?
I haven't made any changes to my site, but in the space of a week Webmaster Tools is showing 30-40 soft 404s. This just started happening in the last 2 weeks. When I click through to the pages they are fine, and even Fetch and Render works fine on them.
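A soft 404 is a page that returns 200 but looks like an error or empty-results page to Google. A rough heuristic sketch for spot-checking the reported URLs, flagging thin pages or error-like wording (the URL list, phrases, and size threshold are placeholders):

```python
# Rough soft-404 heuristic: flag pages that return 200 but have very little HTML
# or contain error-like phrases. (Sketch; URLs, phrases, and threshold are placeholders.)
import requests

suspect_urls = ["https://www.example.com/some-reported-page/"]
error_phrases = ("not found", "no longer available", "0 results")

for url in suspect_urls:
    response = requests.get(url, timeout=10)
    html = response.text.lower()
    thin = len(html) < 5000
    error_like = any(phrase in html for phrase in error_phrases)
    print(url, response.status_code, "thin" if thin else "", "error-like" if error_like else "")
```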
Intermediate & Advanced SEO | EcommerceSite
-
How bad are 403 errors compared to 404s with regard to technical SEO?
Google Webmaster Tools reports "Access Denied" 403 errors. They also provide an explanation of what they mean at https://support.google.com/webmasters/answer/2409441?ctx=MCE&ctx=AD&hl=en. What are the implications of these Access Denied errors? Should they be 301 redirected internally?
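403s reported for Googlebot often come from bot blocking (a firewall, WAF, or rate limiting) rather than from the pages themselves. A small sketch that compares responses for a browser-like and a Googlebot-like User-Agent (the URL is a placeholder):

```python
# Compare responses for a browser-like and a Googlebot-like User-Agent to see
# whether the 403 is caused by bot blocking. (Sketch; the URL is a placeholder.)
import requests

url = "https://www.example.com/blocked-page/"
user_agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in user_agents.items():
    response = requests.get(url, headers={"User-Agent": ua}, timeout=10)
    print(name, response.status_code)
```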
Intermediate & Advanced SEO | RosemaryB
-
How does Google index pagination variables in Ajax snapshots? We're seeing random huge variables.
We're using the Google snapshot method to index dynamic Ajax content. Some of this content comes from tables that use pagination. The pagination is tracked with a var in the hash, something like: #!home/?view_3_page=1 We're now seeing all sorts of calls from Google with huge numbers for these URL variables that we are not generating with our snapshots, like this: #!home/?view_3_page=10099089 These aren't trivial, since each snapshot represents server load, so we'd like these vars to represent only what's returned by our snapshots. Is Google generating random numbers and going fishing for content? If so, is this something we can control or minimize?
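One way to minimise this on your side is to validate the pagination variable before generating a snapshot and refuse anything outside the real range. A sketch, assuming the parameter name from the example URL and a hypothetical upper limit:

```python
# Validate the pagination variable from the escaped fragment before generating
# a snapshot; out-of-range values get no snapshot (serve a 404 instead).
# The parameter name comes from the example URL; MAX_PAGE is a hypothetical limit.
from urllib.parse import parse_qs

MAX_PAGE = 50  # highest page the table actually has

def requested_page(escaped_fragment):
    """Return a valid page number from e.g. 'home/?view_3_page=1', else None."""
    query = escaped_fragment.split("?", 1)[-1]
    values = parse_qs(query).get("view_3_page", ["1"])
    try:
        page = int(values[0])
    except ValueError:
        return None
    return page if 1 <= page <= MAX_PAGE else None

print(requested_page("home/?view_3_page=1"))         # -> 1, serve the snapshot
print(requested_page("home/?view_3_page=10099089"))  # -> None, return 404 instead
```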
Intermediate & Advanced SEO | sitestrux
-
Transfer webshop to another domain. Will there be a huge visit/sales drop?
A client of mine has a specific domain for their webshop, separate from the brand domain. The brand domain has much more authority (according to SEOmoz), so the conclusion in another topic was that it would be better to move the entire webshop to the same domain as the brand domain, e.g. moving www.webshopdomain.com to www.branddomain.com/webshop. Of course all categories and important pages will have a 301 to pass on the built-up authority. Does anybody have experience with this? I believe in the end this will be much better because all the authority will be built up on the same domain. But I am afraid of a drop in the beginning. If there is a sales drop, I really must give my client notice of this... I hope somebody has done this before.
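For a move like www.webshopdomain.com to www.branddomain.com/webshop, the main safeguard is confirming that every old URL 301s to its intended new path. A sketch that derives an expected target and checks the actual redirect (the path-mapping rule and URL list are assumptions and should be adjusted to the real URL structure):

```python
# Verify that each old webshop URL 301s to the expected path on the brand domain.
# (Sketch; the mapping rule and the URL list are assumptions.)
import requests
from urllib.parse import urlparse

old_urls = ["http://www.webshopdomain.com/category/some-product/"]

for old_url in old_urls:
    expected = "https://www.branddomain.com/webshop" + urlparse(old_url).path
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    status = "OK" if response.status_code == 301 and location == expected else "CHECK"
    print(status, old_url, "->", location or "(no redirect)")
```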
Intermediate & Advanced SEO | Seeders
-
Why is Google Webmaster Tools reporting a massive increase in 404s?
Several weeks back, we launched a new website, replacing a legacy system and moving to a new server. With the site transition we broke some of the old URLs, but it didn't seem to be too much of a concern. We blocked the ones I knew should be blocked in robots.txt, 301 redirected as much duplicate data and used canonical tags as far as I could (which is still an ongoing process), and simply returned 404 for any others that should never really have been there. For the last few months I've been monitoring the 404s Google reports in Webmaster Tools (WMT), and while we had a few hundred due to the gradual removal of duplicate data, I wasn't too concerned. I've been generating updated sitemaps for Google multiple times a week with only the updated URLs. Then WMT started to report a massive increase in 404s, somewhere around 25,000 per day (making it impossible for me to keep up). The sitemap.xml contains new URLs only, but it seems that Google still uses the old sitemap from before the launch. The reported sources of the 404s (in WMT) don't exist any longer; they all come from the old site. I attached a screenshot showing the drastic increase in 404s. What could possibly cause this problem? wmt-massive-404s.png
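Since Google appears to be working from stale URL lists, it's worth confirming that the sitemap you submit contains only live URLs. A small sketch that parses sitemap.xml and reports anything that doesn't return 200 (the sitemap URL is a placeholder):

```python
# Fetch sitemap.xml and report any listed URL that does not return 200.
# (Sketch; the sitemap URL is a placeholder.)
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

sitemap = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in sitemap.iter(NS + "loc"):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status != 200:
        print(status, url)
```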
Intermediate & Advanced SEO | sonetseo
-
Old pages still crawled by search engines returning 404s. Better to put a 301 in place or block with robots.txt?
Hello guys, A client of ours has thousands of pages returning 404, visible in Google Webmaster Tools. These are all old pages which don't exist anymore, but Google keeps on detecting them. They belong to sections of the site which don't exist anymore, they are not linked externally, and they didn't provide much value even when they existed. What do you suggest we do: (a) do nothing, (b) redirect all these URLs/folders to the homepage through a 301, or (c) block these pages through robots.txt? Are we inappropriately using part of the crawl budget set by search engines by not doing anything? Thanks
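If you go with option (c), the dead sections can usually be covered by a handful of folder-level Disallow rules rather than thousands of individual entries. A sketch that derives those folders from an exported URL list (the URLs are placeholders); note that blocking crawling also stops Google from seeing the pages' 404 status:

```python
# Derive folder-level robots.txt Disallow rules from a list of dead URLs,
# assuming the dead pages all live under sections that no longer exist.
# (Sketch; the URL list is a placeholder.)
from urllib.parse import urlparse

dead_urls = [
    "http://www.example.com/old-section/page-1.html",
    "http://www.example.com/old-section/page-2.html",
    "http://www.example.com/discontinued/widget.html",
]

folders = sorted({"/" + urlparse(url).path.lstrip("/").split("/")[0] + "/" for url in dead_urls})

print("User-agent: *")
for folder in folders:
    print("Disallow: " + folder)
```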
Intermediate & Advanced SEO | H-FARM