Massive Increase in 404 Errors in GWT
-
Last June, we transitioned our site to the Magento platform. When we did so, we naturally got an increase in 404 errors for URLs that were not redirected (for a variety of reasons: we hadn't carried the product for years, Google no longer got the same string when it did a "search" on the site, etc.). We knew these would be there and were completely fine with them.
We also got many 404s due to the way Magento had implemented its site map (putting in products that were not visible to customers, including all the different file paths to get to a product even though we use a flat structure, etc.). These were frustrating, but we did custom work on the site map and let Google resolve those many, many 404s on its own.
Sure enough, a few months went by and GWT started to clear out the 404s. All the poor, nonexistent links from the site map and missing links from the old site - they started disappearing from the crawl notices and we slowly went from some 20k 404s to 4k 404s. Still a lot, but we were getting there.
Then, in the last 2 weeks, all of those links started showing up again in GWT and reporting as 404s. Now we have 38k 404s (way more than ever reported). I confirmed that these bad links are not showing up in our site map or anywhere else, and I'm really not sure how Google found them again.
I know, in general, these 404s don't hurt our site. But it just seems so odd. Is there any chance Google's bots just randomly crawled a big ol' list of outdated links they hadn't tried for a while? And does anyone have any advice for clearing them out?
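For reference, here's roughly how a list like this can be spot-checked — a minimal sketch, assuming the error URLs have been exported to a plain text file, one URL per line (the filename is just a placeholder):

    import requests

    # Hypothetical export of the crawl-error URLs from GWT, one URL per line.
    with open("gwt_404_export.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        try:
            # HEAD keeps the check lightweight; redirects are followed so a
            # 301 to a live page shows up as 200 rather than as an error.
            resp = requests.head(url, allow_redirects=True, timeout=10)
            print(resp.status_code, url)
        except requests.RequestException as exc:
            print("ERROR", url, exc)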
-
I'm just cynical enough to suspect this may be a byproduct of Google Webmaster Tools' recent inbound link meltdown. Huge numbers of GWT users are reporting that their inbound link reports have basically lost most of their links.
What if, in dealing with the problem, Google has gone back to an older version of the links database, which might recover more of the recent links, but also pull back a whack of those links it already discounted?
This is pure speculation on my part, but there's been so much volatility on Google's link reporting recently that I can't say I trust the data as far as I can toss it at the moment.
Have you tried a similar comparison to the data shown in Bing Webmaster Tools?
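If you do pull both exports, a few lines of Python will show how far the two reports overlap — just a rough sketch, assuming each tool's crawl-error URLs have been saved to a plain text file, one URL per line (the filenames are placeholders):

    # Compare crawl-error exports from Google and Bing Webmaster Tools.
    # Both filenames below stand in for whatever you export from each tool.

    def load_urls(path):
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    gwt = load_urls("gwt_crawl_errors.txt")
    bing = load_urls("bing_crawl_errors.txt")

    print("Reported by both:", len(gwt & bing))
    print("Only in GWT:", len(gwt - bing))
    print("Only in Bing:", len(bing - gwt))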
I'm sure I read of others encountering what you're talking about recently. Will see if I can find the references in case they found anything.
Paul
Related Questions
-
Spam Score Of Site Increased from 1 to 31% overnight
-
Webmaster Crawl errors caused by Joomla menu structure
-
Sitemap issue - Tons of 404 errors
-
Noticed a lot of duplicate content errors...
-
404 errors in webmaster - should I 301 all pages?
-
Wordpress Website + 404 Errors
-
How to avoid 404 errors when taking a page off?
-
Crawl Errors