2.3 million 404s in GWT - learn to live with 'em?
-
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this.
Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That number has kept climbing since; we’re now at 2.3 million 404s in GWT.
Based on what I’ve been able to determine, links tied to the data feed on this site generally break for one of two reasons: the page just doesn’t exist anymore (i.e. it wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (the page still exists, just under a different link). With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the bloat it would cause in the htaccess file.
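For what it’s worth, one middle ground between millions of per-URL redirect lines and none at all is an external lookup table, so the htaccess file itself never bloats. This is only a sketch: RewriteMap has to be declared in the server or vhost config (it isn’t allowed in .htaccess), and the map file path and /listing/ prefix here are assumptions about the setup.

```apache
# In the vhost/server config -- RewriteMap cannot live in .htaccess.
# listing-redirects.txt holds one "old-path new-path" pair per line.
RewriteMap listingmap "txt:/etc/apache2/listing-redirects.txt"

RewriteEngine On
# If the requested URI has an entry in the map, 301 to the mapped target.
RewriteCond ${listingmap:%{REQUEST_URI}} ^(.+)$
RewriteRule ^ %1 [L,R=301]
```

A plain text map with millions of entries would be scanned on every request; converting it with Apache’s httxt2dbm tool to a dbm: map keeps lookups fast at that scale.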
Based on what I’ve read here and here, 404s in and of themselves don’t really hurt a site’s indexation or ranking. And the more I consider it, the more it seems the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time as product pages come and go. Bottom line: if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we’ll just have to put up with a certain level of broken links on an ongoing basis.
So here’s where my thought process is leading:
- Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index.
- Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis.
- Watch the overall trend of 404s in GWT. At least make sure they don’t increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
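For the 301s on really important pages mentioned above, hand-picked redirects can stay as plain mod_alias directives; a minimal sketch with hypothetical listing paths:

```apache
# One line per mission-critical page that moved. A few dozen of these is
# fine -- it's millions of them that would bloat the htaccess file.
Redirect 301 /listing/acme-plumbing-chicago /listing/acme-plumbing-chicago-il
Redirect 301 /listing/old-category/some-biz /listing/new-category/some-biz
```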
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates.
Thoughts? If you think I’m off base, please set me straight.
-
I was actually thinking about some type of wildcard rule in htaccess. This might actually do the trick! Thanks for the response!
-
Hi,
Sounds like you’ve taken on a massive job with 12.5 million pages, but I think you can implement a simple fix to get things started.
You’re right to think about that sitemap: make sure it’s being dynamically updated as the data refreshes; otherwise it will be responsible for a lot of your 404s.
I understand you don’t want to add 2.3 million separate redirects to your htaccess, so what about a simple rule: if the request starts with /listing/ (one of your directory pages) and is not an existing file or directory, redirect back to the homepage. Something like this:
# does the request start with /listing/ (or whatever structure you are using)?
RewriteCond %{REQUEST_URI} ^/listing/ [NC]
# is it NOT a file and NOT a directory?
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# all true? Redirect to the homepage
RewriteRule .* / [L,R=301]
This way you can specify a certain URL structure for the pages which tend to turn into 404s. Any 404s outside of this rule will still serve a 404 code and show your 404 page, so you can fix those manually, but the pages which tend to disappear can all be redirected back to the homepage if they’re not found.
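As an aside, if you’d rather have permanently deleted listings drop out of the index entirely than funnel to the homepage, the same pair of file/directory conditions can serve a 410 Gone instead. A sketch of that variant, assuming the same /listing/ prefix:

```apache
RewriteCond %{REQUEST_URI} ^/listing/ [NC]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# [G] answers with 410 Gone, telling crawlers the page was deliberately removed
RewriteRule .* - [G]
```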
You could still implement your 301s for important pages, or simply recreate a page if it’s worth doing so, but you will have dealt with a large chunk of your non-existent pages.
It’s a big job and those missing pages are only part of it, but this should help you sift through all of the data and get to the important bits – you can mark a lot of URLs as fixed and give your attention to the important pages which need some work.
Hope that helps,
Tom