Huge spike in 404s and 500 errors
-
I'm curious what might cause an inordinate number of 404s in the reporting from SEOmoz's dashboard.
I'm spot-checking links that are marked as 404s, and they are (for the most part) working. I talked with the sysadmin and there were no outages this weekend. We also had a number of 500 errors reported in Webmaster Tools, but everything seems to be up.
Any ideas?
-
Maybe submit a support ticket to SEOmoz to see whether the 404s might have been false positives.
-
The interesting thing is that SEOmoz threw a bunch of 404s that, on re-checking, are not 404s, whereas Webmaster Tools is showing discontinued products, which makes sense. We didn't see any outages this weekend, so I'm a bit confused.
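One way to sanity-check a crawler's report is to re-request the flagged URLs yourself and compare status codes. A minimal sketch of that comparison (the URLs, codes, and function name here are invented for illustration, not from any Moz tool):

```python
def find_false_positives(reported, live):
    """Return URLs the tool reported as 404 but that actually serve 200."""
    return [url for url, code in reported.items()
            if code == 404 and live.get(url) == 200]

# In practice the `live` codes would come from real requests, e.g.:
#
#   import urllib.request
#   req = urllib.request.Request(url, method="HEAD")
#   code = urllib.request.urlopen(req).status
#
# (urlopen raises HTTPError for 4xx/5xx responses, so wrap it in try/except.)

reported = {"https://example.com/a": 404, "https://example.com/b": 404}
live = {"https://example.com/a": 200, "https://example.com/b": 404}
print(find_false_positives(reported, live))  # only /a is a false positive
```

If the false-positive list is large and the live checks were made at a gentler crawl rate, that points toward the tool's crawl overloading the server rather than a real outage.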
-
If SEOmoz and Google Webmaster Tools are both reporting errors, I would guess there actually was a problem with your site.
Related Questions
-
Hacked Wordpress Site! So many 404s
So I had a site that I worked on get hacked. We eliminated the URLs, found the vulnerability (Bluehost!) and rolled back the site. BUT they got into Google Search Console and indexed a LOT of pages. These pages are now 404 errors, and I tried to noindex them via the robots.txt file. The problem is that Google is placing a "this site may be hacked" warning on the search listing. I asked Google to re-evaluate it and it was approved, but there are still 80,000 404 errors being shown, and Google still believes the uploaded files we deleted should be there. Doing a site: search STILL shows the infected pages, and it has been a month. Any insight would definitely be helpful. Thanks!
Intermediate & Advanced SEO | mattdinbrooklyn
-
500 and 508 pages?
Hi, we just did a massive deep crawl (using the tool deepcrawl.co.uk) on the site: http://tinyurl.com/nu6ww4z http://i.imgur.com/vGmCdHK.jpg which reported a lot of URLs as either 508 or 500 errors. After the crawl finished, we put the URLs reported as 508 or 500 directly into Screaming Frog and they all came back with status code 200. Could it be because Deep Crawl hammered the site and the server couldn't handle the load or something? Cheers, Chris
Intermediate & Advanced SEO | jayoliverwright
-
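The overload theory is easy to test: re-check the flagged URLs at a polite rate over a few sessions and look at how the status codes behave over time. A rough sketch of that classification (the function name and sample codes are invented for illustration):

```python
def classify_errors(observations):
    """Classify a URL's health from status codes observed at different times.

    A mix of 5xx and 200 suggests a transient, load-related problem
    (e.g. a crawler hammering the server); all 5xx suggests a real outage.
    """
    server_errors = [c for c in observations if 500 <= c <= 599]
    if not server_errors:
        return "healthy"
    if len(server_errors) == len(observations):
        return "persistent"
    return "transient"

# Deep Crawl saw 500/508 during a heavy crawl; Screaming Frog later saw 200:
print(classify_errors([500, 508, 200, 200]))  # transient
```

A "transient" result under heavy crawling but "healthy" at a slower rate is consistent with the server buckling under the crawl load rather than being down.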
Joomla to Wordpress site migration - thousands of 404s
I recently migrated a site from Joomla to WordPress. In advance, I exported the HTML pages from Joomla using Screaming Frog and set up 301 redirects for all of them. However, Webmaster Tools is now telling me (a week after putting the redirects in place) that there are >7k 404s. Many of them aren't HTML pages, just index.php URLs, which I didn't think I would have to export in my Screaming Frog crawl. We have since done a blanket 301 redirect for anything with index.php in it, but Webmaster Tools is still picking them up as 404s. So my question is: what should I have done with the Screaming Frog export to ensure I captured all pages to redirect, and what should I now do to fix the 404s that Webmaster Tools is picking up?
Intermediate & Advanced SEO | Bua
-
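One way to mop up the stragglers is to export the 404 list from Webmaster Tools, map each old URL to its new home, and generate redirect rules in bulk. A hypothetical sketch (the paths are invented; note that Apache's `Redirect` directive matches paths only, so Joomla URLs carrying query strings like `index.php?option=...` need mod_rewrite rules with a `RewriteCond %{QUERY_STRING}` instead):

```python
def redirect_rules(mapping):
    """Turn an old-path -> new-path mapping into Apache Redirect directives."""
    return ["Redirect 301 {} {}".format(old, new) for old, new in mapping.items()]

# Example mapping, e.g. built from a crawler export joined against the new sitemap:
mapping = {
    "/old-page.html": "/new-page/",
    "/services.html": "/services/",
}
for rule in redirect_rules(mapping):
    print(rule)
```

Generating the rules from a spreadsheet keeps the mapping reviewable, which matters more than the mechanism: a blanket redirect of every index.php URL to one page risks being treated as a soft 404.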
How does Google index pagination variables in Ajax snapshots? We're seeing random huge variables.
We're using the Google snapshot method to index dynamic Ajax content. Some of this content is from tables using pagination. The pagination is tracked with a var in the hash, something like: #!home/?view_3_page=1 We're now seeing all sorts of calls from Google with huge numbers for these URL variables that we are not generating with our snapshots, like this: #!home/?view_3_page=10099089 These aren't trivial, since each snapshot represents a server load, so we'd like these vars to only represent what's returned by the snapshots. Is Google generating random numbers to go fishing for content? If so, is this something we can control or minimize?
Intermediate & Advanced SEO | sitestrux
-
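Whatever is generating those values, you can stop them from triggering expensive snapshots by validating the pagination var server-side and returning a cheap 404 (rather than rendering a snapshot) for anything out of range. A minimal sketch, with a hypothetical helper name:

```python
def validated_page(raw_value, total_pages):
    """Parse a pagination var from the fragment and reject junk values.

    Returns an in-range page number, or None (meaning: serve a 404 or an
    empty snapshot) for non-numeric or out-of-range values like 10099089.
    """
    try:
        page = int(raw_value)
    except (TypeError, ValueError):
        return None
    if 1 <= page <= total_pages:
        return page
    return None

print(validated_page("3", total_pages=10))         # 3
print(validated_page("10099089", total_pages=10))  # None -> cheap 404, no render
```

Consistently 404ing the junk values also teaches the crawler that those URLs are dead ends, which should reduce how often they get requested.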
Need help with huge spike in duplicate content and page title errors.
Hi Mozzers, I come asking for help. I have a client who's reported a staggering increase in errors of over 18,000! The errors include duplicate content and duplicate page titles. I think I've found the culprit, and it's the News & Events calendar on the following page: http://www.newmanshs.wa.edu.au/news-events/events/07-2013 Essentially, each day of the week is an individual link, and events stretching over a few days get reported as duplicate content. Do you have any ideas how to fix this issue? Any help is much appreciated. Cheers
Intermediate & Advanced SEO | bamcreative
-
301s Creating Soft 404s in GWT
Hi, we re-did a section of a site and got rid of hundreds of pages of no-longer-relevant content. We 301'd the URLs to the category homepage. Now GWT calls these soft 404s. a) Should we have done something differently instead of 301ing? b) Are these hundreds of soft 404 errors a big problem or threat to how Google sees us for SEO? c) Should we correct this in some way? Thanks... Darcy
Intermediate & Advanced SEO | 94501
-
New Site Structure and 404s. Should I redirect everything?
Hi fellow Mozzers, I've recently re-released a site and took the opportunity to change/clean up the URL structure. As a result, Google is starting to report many 404s, such as: blog/tag/get-fit/ blog/tag/top-presents/ Most of these 404 errors are from tag or category pages which simply don't exist any more (because they were unnecessary, crap or irrelevant), although there are also a few posts I've removed. My question is whether it's worth redirecting all these tags and pages to the root directory of the site's new blog (as there isn't really a new page which is similar or appropriate) or just leaving them as 404 errors. Bearing in mind: they don't really rank for anything, and there are few if any links pointing to these pages. Thanks.
Intermediate & Advanced SEO | PeterAlexLeigh
-
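The usual triage here is per-URL rather than blanket: redirect anything that has links or traffic, and let the worthless tag pages 404 (or serve a 410 to signal they're gone for good). A sketch of that rule of thumb (the function name and thresholds are invented, not an official guideline):

```python
def disposition(backlinks, monthly_visits):
    """Rough rule of thumb: 301 pages with any equity or traffic; 404 the rest."""
    if backlinks > 0 or monthly_visits > 0:
        return "301 to the closest relevant page"
    return "leave as 404 (or serve 410)"

print(disposition(backlinks=5, monthly_visits=0))   # worth redirecting
print(disposition(backlinks=0, monthly_visits=0))   # fine to let it 404
```

Redirecting hundreds of unrelated pages to the blog root risks Google treating them as soft 404s anyway, so letting genuinely worthless URLs 404 is often the cleaner outcome.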
$1,500 question
I have $1,500 to spend to promote an 8-year-old website. Almost no SEO work has been done on the site in the past 3-4 years. The site has a couple of hundred (around 300) external backlinks pointing to the homepage, and around 30 backlinks pointing to internal pages. It gets around 60% of its traffic from referring sites, 30% direct, and 10% from search engines. The homepage has PR 4. It ranks around 70th in Google for one of the main keywords. No keyword research has been done for the site. Looking for long-term benefits, what would be the best way, in your opinion, to spend this money?
Intermediate & Advanced SEO | _Z_