Massive Increase in 404 Errors in GWT
-
Last June, we transitioned our site to the Magento platform. When we did so, we naturally got an increase in 404 errors for URLs that were not redirected (for a variety of reasons: we hadn't carried the product for years, Google no longer got the same string when it did a "search" on the site, etc.). We knew these would be there and were completely fine with them.
We also got many 404s due to the way Magento had implemented their site map (putting in products that were not visible to customers, including all the different file paths to get to a product even though we use a flat structure, etc.). These were frustrating, but we did custom work on the site map and let Google resolve those many, many 404s on its own.
Sure enough, a few months went by and GWT started to clear out the 404s. All the poor, nonexistent links from the site map and missing links from the old site - they started disappearing from the crawl notices and we slowly went from some 20k 404s to 4k 404s. Still a lot, but we were getting there.
Then, in the last 2 weeks, all of those links started showing up again in GWT and reporting as 404s. Now we have 38k 404s (way more than ever reported). I confirmed that these bad links are not showing up in our site map or anything and I'm really not sure how Google found these again.
I know, in general, these 404s don't hurt our site. But it just seems so odd. Is there any chance Google bots just randomly crawled a big ol' list of outdated links they hadn't tried for a while? And does anyone have any advice for clearing them out?
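One way to answer the "did Googlebot really recrawl that old list?" question is to go past GWT and check the raw server access logs for Googlebot requests that returned a 404. A minimal sketch, assuming combined log format; the log lines and paths below are fabricated for illustration:

```python
import re
from collections import Counter

# Matches a combined-log-format request line and captures path and status,
# but only on lines whose user agent mentions Googlebot.
LINE_RE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*Googlebot')

def googlebot_404s(lines):
    """Count URLs that Googlebot requested and that returned a 404."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.search(line)
        if m and m.group("status") == "404":
            counts[m.group("path")] += 1
    return counts

# Fabricated sample lines standing in for a real access log:
sample = [
    '66.249.66.1 - - [...] "GET /discontinued-item HTTP/1.1" 404 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [...] "GET /live-page HTTP/1.1" 200 8192 "-" "Googlebot/2.1"',
]
print(googlebot_404s(sample))  # Counter({'/discontinued-item': 1})
```

Comparing the dates of those hits against the GWT spike would show whether Googlebot genuinely re-fetched the old URLs or whether GWT is just resurfacing stale data.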
-
I'm just cynical enough to suspect this may be a byproduct of Google Webmaster Tools' recent inbound-link meltdown. Huge numbers of GWT users are reporting that their inbound link reports have basically lost most of their links.
What if, in dealing with the problem, Google has gone back to an older version of the links database, which might recover more of the recent links, but also pull back a whack of those links it already discounted?
This is pure speculation on my part, but there's been so much volatility in Google's link reporting recently that I can't say I trust the data as far as I can throw it at the moment.
Have you tried a similar comparison to the data shown in Bing Webmaster Tools?
I'm sure I read of others encountering what you're talking about recently. Will see if I can find the references in case they found anything.
Paul
Related Questions
-
How to get rid of bot verification errors
I have a client who sells highly technical products and has lots and lots (a couple of hundred) pdf datasheets that can be downloaded from their website. But in order to download a datasheet, a user has to register on the site. Once they are registered, they can download whatever they want (I know this isn't a good idea but this wasn't set up by us and is historical). On doing a Moz crawl of the site, it came up with a couple of hundred 401 errors. When I investigated, they are all pages where there is a button to click through to get one of these downloads. The Moz error report calls the error "Bot verification". My questions are:
Technical SEO | mfrgolfgti
Are these really errors?
If so, what can I do to fix them?
If not, can I just tell Moz to ignore them, or will this cause bigger problems?
-
Rebranding: 404 to homepage?
Hello all!
Technical SEO | JohnPalmer
I did a rebranding, [Domain A] -> [Domain B]. What should I do with all the 404 pages?
1. [Domain A (404)] -> [Domain B (homepage)]?
2. [Domain A (404)] -> [Domain B (404 page + same URL)] - for example: xixix.com/page/bla
What do you think?
-
404s in GWT - Not sure how they are being found
We have been getting multiple 404 errors in GWT that look like this: http://www.example.com/UpdateCart. The problem is that this is not a URL that is part of our structure, it is only a piece. The actual URL has a query string on the end, so if you take the query string off, the page does not work. I can't figure out how Google is finding these pages. Could it be removing the query string? Thanks.
Technical SEO | Colbys
-
404s and redirects from an old design to a new one
I have a directory that I am redesigning. Currently, the old directory is all 404 pages. I need to use the URLs from the old directory in the new one. Should I redirect all the old pages to the new pages? Or is it better to delete the old pages and let them de-index, so that I can use those same URLs? I really need help with this.
Technical SEO | SwanJob
-
301 redirect all 404 pages
Hi, I would like a second opinion on this. I am working on an ecommerce website that 301 redirects all 404 pages (including incorrectly entered URLs) to the “All categories” page. Will this have any negative SEO impact?
Technical SEO | iThinkMedia
-
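On the question above: Google typically treats "every bad URL 301s to one generic page" as a sitewide soft-404 pattern, so the usual recommendation is to return a real 404 status while still giving visitors a helpful body. A minimal sketch as a plain-Python WSGI app; the paths and page content are illustrative, not taken from the site in question, and the same idea applies in any server stack:

```python
# Hypothetical set of real pages on the site.
PAGES = {"/": "<h1>Home</h1>", "/categories": "<h1>All categories</h1>"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in PAGES:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [PAGES[path].encode()]
    # A genuine 404 status tells Google the URL is gone; the body can
    # still point human visitors at the category listing.
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [b'<h1>Page not found</h1><p>Try <a href="/categories">all categories</a>.</p>']

def status_of(path):
    """Tiny test helper: call the WSGI app and capture status line and body."""
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    body = b"".join(app({"PATH_INFO": path}, start_response))
    return captured["status"], body

print(status_of("/categories")[0])    # 200 OK
print(status_of("/no-such-page")[0])  # 404 Not Found
```

The visitor experience of the redirect is preserved (they still land on useful links), but the status code stops telling Google that thousands of distinct URLs all "moved" to one page.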
Why do I get duplicate page title errors?
I keep getting duplicate page title errors on www.etraxc.com/ and www.etraxc.com/default.asp, which both point to the same page. How do I resolve this, and how badly is it hurting my SEO?
Technical SEO | bobbabuoy
-
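The standard fix for the duplicate above is to pick one canonical URL and 301 the other to it (or add a rel=canonical tag on both). On a classic ASP site this would be an IIS URL Rewrite rule; the plain-Python WSGI sketch below, with hypothetical content, just shows the shape of the fix:

```python
# Hypothetical map of duplicate entry points to their canonical URL.
CANONICAL = {"/default.asp": "/"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in CANONICAL:
        # A permanent redirect collapses the duplicate onto one URL,
        # so only one page (and one title) gets indexed.
        start_response("301 Moved Permanently", [("Location", CANONICAL[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<title>Home</title>"]

def head(path):
    """Test helper: capture the status line and headers for a path."""
    captured = {}
    def start_response(status, headers):
        captured.update(status=status, headers=dict(headers))
    b"".join(app({"PATH_INFO": path}, start_response))
    return captured

print(head("/default.asp")["status"])               # 301 Moved Permanently
print(head("/default.asp")["headers"]["Location"])  # /
```

If a redirect isn't practical, a `<link rel="canonical" href="/">` tag served on /default.asp consolidates the duplicate titles the same way without changing any URLs.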
404 errors on non-existent URLs
Hey guys and gals, First Moz Q&A for me and really looking forward to being part of the community. I hope as my first question this isn't a stupid one, but I was struggling to find any resource that dealt with the issue and am just looking for some general advice. Basically a client has raised a problem with 404 error pages - or the lack thereof - on non-existent URLs on their site; let's say for example: 'greatbeachtowels.com/beach-towels/asdfas'. Obviously content never existed on this page, so it's not like you're saying 'hey, sorry this isn't here anymore'; it's more like 'there was never anything here in the first place'. Currently, typing in 'greatbeachtowels.com/beach-towels/asdfas' returns the same content as the 'greatbeachtowels.com/beach-towels' page, which I appreciate isn't ideal. What I was wondering is how far you take this issue - I've seen examples here on the seomoz site where you can edit the URI in a similar manner and it returns the same content as the parent page but with the alternate address. Should 404s be added across all folders on a site in a similar way? How often would this scenario be an issue, particularly for internal pages two or three clicks down? I suppose unless someone linked to a page with a misspelled URL... Also, would it be worth placing 301 redirects on a small number of common misspellings or typos, e.g. 'greatbeachtowels.com/beach-towles', to the correct URLs as opposed to just 404s? Many thanks in advance.
Technical SEO | AJ234
-
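The underlying fix for the scenario described above is to stop letting unknown path segments echo the parent page: resolve the slug against real content and return a true 404 when nothing matches. A minimal plain-Python WSGI sketch with hypothetical slugs:

```python
# Hypothetical set of slugs that map to real category pages.
VALID_SLUGS = {"beach-towels", "pool-floats"}

def app(environ, start_response):
    slug = environ.get("PATH_INFO", "/").strip("/")
    # Only slugs that map to real content get a 200; anything else
    # (typos, junk appended to a valid path) gets a genuine 404
    # instead of a copy of the parent category page at a new URL.
    if slug in VALID_SLUGS:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [f"<h1>{slug}</h1>".encode()]
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [b"<h1>Not found</h1>"]

def status_of(path):
    """Test helper: capture the status line for a path."""
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    b"".join(app({"PATH_INFO": path}, start_response))
    return captured["status"]

print(status_of("/beach-towels"))         # 200 OK
print(status_of("/beach-towels/asdfas"))  # 404 Not Found
```

This also answers the "how far do you take it?" part: one lookup at the routing layer covers every folder at once, so there's no need to hand-wire 404s per directory; hand-picked 301s then stay reserved for the few common misspellings that actually attract links.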
Duplicate XML sitemaps - 404 or leave alone?
We switched over from our standard XML sitemap to a sitemap index. Our old sitemap was called sitemap.xml and the new one is sitemapindex.xml. In Webmaster Tools it still shows the old sitemap.xml as valid. Also, when you land on our sitemap.xml it displays the sitemap index, when really the index lives at sitemapindex.xml. The reason you can see the sitemap at both URLs is that this is set by the sitemap plugin. So the question is: should we change the plugin setting to let the old sitemap.xml 404, or should we allow the new sitemap index to be accessed at both URLs?
Technical SEO | Hakkasan