Seeking help correcting a large number of 404 errors and a 95% traffic halt
-
Hi, the following GWT screenshot tells a bit of the story:
site: http://bit.ly/mrgdD0
http://www.diigo.com/item/image/1dbpl/wrbp
On about Feb 8 I decided to fix a large number of 'duplicate title' warnings being reported in GWT's "HTML Suggestions" -- these were for URLs which differed only in parameter case, and which had canonical tags, but were still reported as duplicates in GWT.
My traffic had been steady at about 1000 clicks/day.
At midnight on 2/10, Google traffic completely halted, down to 11 clicks/day.
I submitted a reconsideration request and was told there was 'no manual penalty'.
Also, starting then, the sitemap indexes in GWT showed 'pending' around the clock.
By about the 18th, the 'duplicate titles' count had dropped to about 600... the next day, traffic hopped right back to about 800 clicks/day for a week, then stopped again a week later, on the 26th, down to 10 clicks/day.
I then noticed that GWT was reporting 20K page-not-found errors -- this has now grown to 35K such errors!
I realized that bogus internal links were being generated because I had failed to disable PHP warning messages, which were being printed into the page output... so I disabled PHP warnings and fixed what I thought was the source of the errors.
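For reference, the fix amounted to something like this -- a minimal sketch, assuming an Apache host running mod_php where .htaccess overrides are allowed:

    # .htaccess -- stop PHP from printing warnings into the HTML output
    # (where crawlers pick them up as bogus relative links), but keep
    # logging them server-side so they can still be reviewed.
    php_flag display_errors Off
    php_flag log_errors On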
However, the not-found count continues to climb -- and I don't know where these bad internal links are coming from, because the GWT report lists these link sources as 'unavailable'.
I've been through a similar problem before: last year it took Google four months to digest all the bogus pages and recover. If I have to wait that long again I will lose much $$.
Assuming that the large number of internal 404 errors is the reason for the sudden shutoff:
a) How can I verify the source of these internal links, given that Google says the source pages are 'unavailable'?
b) Most critically, how can I do a 'reset' and have Google re-spider my site -- or block the signature of these URLs -- in order to get rid of these errors ASAP??
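For reference, the kind of 'block' I have in mind is a robots.txt pattern over whatever signature the bad URLs share -- a sketch only, and the "warning-on-line" fragment below is purely hypothetical:

    # robots.txt -- keep all crawlers off URLs matching the bogus signature.
    # "warning-on-line" is a placeholder; substitute the real pattern.
    User-agent: *
    Disallow: /*warning-on-line

I understand Google supports the * wildcard here, though blocked URLs also stop being re-checked, so this would be a stopgap rather than a fix.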
thanks
-
Hello Rand, I've been facing a similar problem with my site. I'd really appreciate your response here - http://www.seomoz.org/q/help-fixing-the-traffic-drop-that-started-on-4-september-2012.
-
I wouldn't feel too confident that the numbers and dates Google's showing you are precise or accurate. In fact, we've seen times when GWMT is considerably off. I'd watch how Google crawls your site and look at search traffic to your pages - those are likely leading indicators that things are/will be fixed.
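For the crawl-watching part, the rawest view is the server access log itself -- a quick sketch, assuming an Apache setup (your log path and format will differ):

    # Pull the 50 most recent Googlebot requests from the access log.
    grep Googlebot /var/log/apache2/access.log | tail -n 50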
-
Thanks for the replies, guys. I had run Xenu on the site and it found no broken links... but the GWT error count continues to climb, and as of today Google has released a MUCH improved timeline view for the error count. Problem is, it's still showing 58K errors as of yesterday and climbing, long after I fixed them -- and it won't show me where it thinks the source is...
These errors are all on internal pages, BTW.
Here's the new Google view:
http://awesomescreenshot.com/0ef1gy6c7
The new GUI also includes a way to mark errors 'fixed' -- one by one!! I need to mark 60 thousand at once!
Also, I can see the date these errors started appearing, and it just doesn't make sense, given that's the day my traffic started reappearing as well...
-
I agree with Rand's suggestions. I just ran a Screaming Frog crawl of the whole site -- 10,233 links across 8,997 URLs -- and got no 404s. So I think it's pretty safe to assume you've fixed the 404 issue. Here's the output of the crawl in case you'd like it for reference: http://www.sendspace.com/file/7zui0v
I'd say:
- Definitely clean up and resubmit your XML sitemap
- Double check your backlink profile with Open Site Explorer and MajesticSEO to be sure that there aren't sites linking to URLs that no longer exist. If you find any of these, make sure to 301 redirect them (see the sketch after this list). Just take all the target URLs and dump them into Screaming Frog in list mode. All the links from OSE point to your homepage, so they're not an issue; I don't have access to Majestic right now, so I couldn't run those for you.
- You can now Submit pages in Google Webmaster Tools as well in the Fetch as Googlebot section. So you may consider submitting some of the new pages the site generates in addition to your reconsideration request to help get Google to re-crawl and find the 404s are gone.
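For the 301s, here's a minimal .htaccess sketch, assuming Apache with mod_alias -- both paths are made-up placeholders, not URLs from your site:

    # .htaccess -- permanently redirect retired URLs that still attract links.
    Redirect 301 /old-page.php http://www.example.com/new-page
    Redirect 301 /old-category/old-page http://www.example.com/new-category/new-page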
Good luck man and please let us know if nothing changes after you implement these fixes.
-Mike
-
Hi Mark - wow, sounds really rough. I've got a few suggestions:
- First off, you need to make 100% sure that you've actually fixed the issue: that the internal links are pointing to the right places, AND that any old URLs which may have had internal/external links either rel=canonical or 301 redirect to the correct, updated locations (there's a small example after this list).
- You might try using a few tools to verify this, including the SEOmoz Crawl Test (http://pro.seomoz.org/tools/crawl-test) and Screaming Frog (http://www.screamingfrog.co.uk/seo-spider/).
- When you are ready, submit new XML Sitemaps to Google with the proper URLs. Make sure you've deleted/removed your old ones.
- You can also send the reconsideration request again, indicating that while you're aware this isn't a penalty, you have realized some technical/navigation issues on the site and believe you've now fixed these.
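On the rel=canonical point in the first bullet, this is the kind of tag the outdated/duplicate URLs would carry in their head -- a sketch with a hypothetical URL, not one from the actual site:

    <!-- On the duplicate version of a page, point engines at the preferred URL. -->
    <link rel="canonical" href="http://www.example.com/preferred-page" />

A 301 is usually the stronger choice when the old URL has no reason to keep existing; rel=canonical fits when both versions must stay live. And for the XML sitemap bullet, the file you resubmit just needs clean entries for the proper URLs, along these lines (again a sketch, with a placeholder URL and date):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/preferred-page</loc>
        <lastmod>2012-03-01</lastmod>
      </url>
    </urlset>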
Hope this helps and wish you the best of luck!
Related Questions
-
404 Best Practices
Hello All, So about 2 months ago there was a massive spike in the number of crawl errors on my site, according to Google Webmaster Tools. I handled this by sending my webmaster a list of the broken pages along with working pages that they should 301 redirect to. Admittedly, when I looked back a couple weeks later, the number had gone down only slightly, so I sent another list to him (I didn't realize that you could 'Mark as fixed' in Webmaster Tools). So when I sent him more, he 301 redirected them again (with many duplicates), as he was told, without really digging any deeper. Today, when I talked about more redirects, he suggested that 404s do have a place: if they are pages that genuinely don't exist anymore, then a ton of 301 redirects may not be the answer. So my two questions are: 1. Should I continue to relentlessly try to get rid of all 404s on my site, and if so, do I have to be careful not to be lazy and just send most of them to the homepage? 2. Are there any tools or really effective ways to remove duplicate 301 redirect records in my .htaccess (because the size of it at this point could very well be slowing down my site)? Any help would be appreciated, thanks
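One idea I've been weighing for question 2: instead of deduplicating line-by-line Redirect records, collapse whole families of retired URLs into a single pattern rule -- a sketch, assuming Apache's mod_alias, with a made-up path:

    # .htaccess -- one RedirectMatch can replace dozens of individual
    # redirects for retired URLs that share a structure.
    RedirectMatch 301 ^/old-catalog/(.*)$ http://www.example.com/catalog/$1

No idea yet whether that fits our URL patterns, though.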
Technical SEO | CleanEdisonInc
-
Sitemaps: do they get cleared when it's a 404?
Hi, do sitemaps get cleared when a URL becomes a 404? We have a Drupal site and a sitemap with 60K links. In these 4 years we have deleted hundreds of links -- are they automatically cleared from the sitemap, or do we need to build the sitemap again? Thanks
Technical SEO | mtthompsons
-
404 Errors & Redirection
Hi, I'm working with someone who recently had two websites redesigned. The old permalink structure consisted of domain/year/month/date/post-name. Their developer changed the new permalink structure to domain/post-name, but apparently he didn't redirect the old URLs to the new ones, so we're finding that links from external sites result in 404 errors (once I remove the date in the URL, the links work fine). Each site has 3-4 years' worth of blog posts, so there are quite a few URLs that would need to be changed. I was thinking of using the Redirection plugin -- would that be the best way to fix this sitewide on both sites? Any suggestions would be appreciated. Thanks, Carolina
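Alternatively, since every old URL shares the same date prefix, I wonder whether a single pattern redirect would cover all posts at once -- a sketch of what I mean, assuming Apache (untested):

    # .htaccess -- collapse /2012/09/04/post-name into /post-name with one rule.
    RedirectMatch 301 ^/([0-9]{4})/([0-9]{2})/([0-9]{2})/(.+)$ /$4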
Technical SEO | csmm
-
Robots.txt - What is the correct syntax?
Hello everyone, I have the following link: http://mywebshop.dk/index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167 I want to prevent Google from indexing everything that is related to "view=send_friend". The problem is that it's giving me duplicate content, and the content of the links has no SEO value of any sort. My problem is how I disallow it correctly via robots.txt. I tried this syntax: Disallow: /view=send_friend/ However, after doing a crawl on request, the 200+ duplicate links that contain view=send_friend are still present in the CSV crawl report. What is the correct syntax if I want to prevent Google from indexing everything that is related to this kind of link?
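Would a wildcard pattern be the answer? My original rule only matches URLs whose path literally starts with /view=send_friend/, whereas the parameter sits mid-query-string. A sketch of what I mean, based on Google's documented * wildcard support (I haven't verified it yet):

    # robots.txt -- block any URL containing the send_friend view,
    # wherever it appears in the query string.
    User-agent: *
    Disallow: /*view=send_friend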
Technical SEO | teleman
-
Are my Canonical Links set up correctly?
I have Enable Canonical Links (recommended) turned on for my web site. However, I also have THIS checked: Enable full URL for Home Page Canonical Link (include /default.asp). Is it hurting me??? I keep getting dinged on our report card. We are using the Volusion shopping cart software/platform.
Technical SEO | GreenFarmParts
-
302 redirects help
Hello, I have a Magento ecommerce store and my latest SEOmoz crawl report shows over 10,000 302 redirects. An example is below: https://jsshirts.com.au/wishlist/index/add/product/160/ which is being redirected to http://jsshirts.com.au/enable-cookies Can anyone please help me with a solution to this problem? Many thanks
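Would blocking the wishlist module in robots.txt be a sane stopgap? A sketch of what I mean, based on the URL structure above (these add-to-wishlist links need a session, hence the redirect to the enable-cookies page):

    # robots.txt -- keep crawlers off Magento wishlist actions,
    # which 302 to the enable-cookies page for cookieless visitors.
    User-agent: *
    Disallow: /wishlist/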
Technical SEO | mullsey
-
403 Forbidden error on website
Hi Mozzers, I got a question about a new website from a new customer: http://www.eindexamensite.nl/. There is a 403 Forbidden error on it, and I can't find what the problem is. I have checked on http://gsitecrawler.com/tools/Server-Status.aspx
Result: URL=http://www.eindexamensite.nl/ -- Result code: 403 (Forbidden / Forbidden)
When I delete the .htaccess from the server there is a 200 OK :-). So it is in the .htaccess. The .htaccess code:

    ErrorDocument 404 /error.html

    RewriteEngine On
    RewriteRule ^home$ / [L]
    RewriteRule ^typo3$ - [L]
    RewriteRule ^typo3/.*$ - [L]
    RewriteRule ^uploads/.*$ - [L]
    RewriteRule ^fileadmin/.*$ - [L]
    RewriteRule ^typo3conf/.*$ - [L]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-l
    RewriteRule .* index.php

    # Start rewrites for static file caching
    RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
    RewriteRule ^home$ / [L]

    # Don't pull *.xml, *.css etc. from the cache
    RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
    RewriteCond %{REQUEST_FILENAME} !^.*\.css$
    RewriteCond %{REQUEST_FILENAME} !^.*\.php$

    # Check for Ctrl+Shift reload
    RewriteCond %{HTTP:Pragma} !no-cache
    RewriteCond %{HTTP:Cache-Control} !no-cache

    # NO backend user is logged in
    RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]

    # NO frontend user is logged in
    RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]

    # We only redirect GET requests
    RewriteCond %{REQUEST_METHOD} GET

    # We only redirect URIs without query strings
    RewriteCond %{QUERY_STRING} ^$

    # We only redirect if a cache file actually exists
    RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
    RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]

    # End static file caching
    DirectoryIndex index.html

The CMS is TYPO3. Any ideas? Thanks!
Maarten
Technical SEO | MaartenvandenBos
-
WP Blog Errors
My WP blog is appending my email address to URLs during the crawl, and I am getting 200+ errors similar to the following: http://www.cisaz.com/blog/2010/10-reasons-why-microsofts-internet-explorer-dominance-is-ending/tony@cisaz.net -- "tony@cisaz.net" is added to every post. Any ideas how I fix it? I am using the Yoast plugin. Thanks guys!
Technical SEO | smstv