60,000 404 errors
-
Do 404 errors on a large scale really matter? I've just become aware that I now have over 60,000 and was wondering whether the community thinks I should address them by putting 301 redirects in place.
Thanks
-
Hi there
Check your sitemap and update your internal links. This usually takes care of a major portion of the problem.
From there, check your backlinks and make sure you update those that are relevant to your site and remove those that are not.
I would take a look at the 404 list - see which pages you could easily redirect to relevant pages the user would still find valuable, and serve a custom 404 page for the others. You could also return a 410 for pages that are gone for good, but I would use that sparingly, because it's rarely necessary.
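For illustration, here's a minimal sketch of how those three responses could be wired up, assuming a small Flask app; the routes, redirect targets, and retired URLs are hypothetical placeholders, not anything from your actual 404 list:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical examples; real mappings would come from your own 404 list.
REDIRECTS = {
    "/old-widgets": "/widgets",      # a close live equivalent exists
    "/summer-sale-2014": "/sale",
}
GONE = {"/discontinued-brand"}       # permanently removed, no replacement

@app.route("/<path:path>")
def catch_all(path):
    url = "/" + path
    if url in REDIRECTS:
        return redirect(REDIRECTS[url], code=301)  # permanent redirect
    if url in GONE:
        return "This page has been permanently removed.", 410
    return "Sorry, we couldn't find that page.", 404  # custom 404 body

if __name__ == "__main__":
    app.run()
```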
Lastly, here's a great resource from Matt Cutts on SEW that helps SEOs handle eCommerce 404s.
Hope this helps! Good luck!
-
I had similar issues on large e-commerce websites, where these pages were not in navigation, but were in Google's index, so Webmaster Tools reported tens of thousands of 404s.
Keeping this many 301 redirects would have put a large load on the server, so we made sure that navigation and site search didn't link to these pages, and Google later removed them from its index.
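If you do end up needing a large redirect set, one way to keep the server load flat is a single in-memory lookup table rather than tens of thousands of individual rewrite rules. A rough sketch, assuming a hypothetical redirects.csv of old-URL,new-URL pairs (the filename and format are made up for illustration):

```python
import csv

def load_redirect_map(path):
    """Load old-URL -> new-URL pairs into a dict; lookup cost stays O(1)
    whether the map holds 60 entries or 60,000."""
    with open(path, newline="") as f:
        return {row[0]: row[1] for row in csv.reader(f) if len(row) >= 2}

REDIRECTS = load_redirect_map("redirects.csv")  # hypothetical filename

def resolve(url):
    """Return the 301 target for a retired URL, or None to fall through to a 404."""
    return REDIRECTS.get(url)
```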
On the other hand, Peter is right: it depends on the ratio of live to broken pages. If you have 100k pages and 60k of them return 404, Google will most likely scale back crawling of your website and the indexing of new pages. You need a very dodgy website to trigger a Panda penalty, though. 404s happen all the time, and Google is quite patient while website admins fix them.
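As a rough way to keep an eye on that live-to-broken ratio, you can tally status codes straight from an access log. A quick sketch, assuming a common-log-format file where the status code is the second-to-last field (adjust the index for your server's log format):

```python
from collections import Counter

def status_ratio(log_path):
    """Tally HTTP status codes from an access log and report the share of 404s."""
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            fields = line.split()
            if len(fields) >= 2:
                counts[fields[-2]] += 1  # status code in common log format
    total = sum(counts.values())
    return counts, (counts["404"] / total if total else 0.0)

counts, ratio = status_ratio("access.log")  # placeholder path
print(f"404s make up {ratio:.1%} of {sum(counts.values())} responses")
```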
-
Yes - large-scale 404s can trigger Panda:
http://themoralconcept.net/pandalist.html
But don't 301 them. Just fix the links on the source pages where clicking leads to a 404 - replace them with links to pages similar to the missing ones.
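To find those source pages, a simple crawler that checks each internal link's status code does the job. A minimal sketch using the third-party requests library and Python's built-in HTML parser; the start URL is a placeholder:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import requests  # third-party: pip install requests

class LinkParser(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(page_url):
    """Fetch one page and return its internal links that respond 404."""
    parser = LinkParser()
    parser.feed(requests.get(page_url, timeout=10).text)
    site = urlparse(page_url).netloc
    broken = []
    for href in parser.links:
        url = urljoin(page_url, href)
        if urlparse(url).netloc != site:
            continue  # only check internal links
        if requests.head(url, allow_redirects=True, timeout=10).status_code == 404:
            broken.append(url)
    return broken

print(find_broken_links("https://example.com/"))  # placeholder start URL
```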
Related Questions
-
Subdomain 403 error
Hi everyone, a crawler from our SEO tool detects a 403 error on a link from our main domain to a couple of subdomains. However, these subdomains are perfectly accessible. What could be the problem? Is this error caused by the server, the crawl bot, or something else? I would love to hear your thoughts. Jens
Technical SEO | WeAreDigital_BE
-
Error got removed by request
Hey there, I have a website which is doing well on Google, but the last 2-3 times I've checked I am getting an error in Google Search Console: "removed by request". Here is a screenshot: http://prntscr.com/iva5y0 My question is: I have not requested that Google remove my homepage URL, so 1. Why am I getting this warning again and again? 2. Will this harm my site's current traffic and ranking status? 3. What steps do I need to take to stop this warning from coming back? Thanks in advance, and please help out with this query.
Technical SEO | innovative.rohit
-
404 Hurricane Update Page After?
Hi all, I am wondering if anyone could help me decide how to handle a page I plan on removing but could possibly use later on. A perfect example: let's say a company in Florida posted a page about the store's hours and possible closing due to an incoming hurricane. Once the hurricane passes and the store reopens, should I 404 that page, given that another hurricane could come later? The URL is www.company.com/hurricane, so this is a URL we would want to use again. I guess we could instead 410 each page and name them www.company.com/hurricane-irma and www.company.com/hurricane-jose for each new hurricane. I am just wondering what the best practice is for a situation like this. Thanks for the help!
Technical SEO | aua
-
404 errors
Hi, I am getting these showing up in WMT crawl errors; any help would be very much appreciated:

| 1 | ?ecaped_fragment=Meditation-find-peace-within/csso/55991bd90cf2efdf74ec3f60 | 404 | 12/5/15 |
| 2 | mobile/?escaped_fragment= | 404 | 10/26/15 |
| 3 | ?escaped_fragment=Tips-for-a-balanced-lifestyle/csso/1 | 404 | 12/1/15 |
| 4 | ?escaped_fragment=My-favorite-yoga-spot/csso/5598e2130cf2585ebcde3b9a | 404 | 12/1/15 |
| 5 | ?escaped_fragment=blog/c19s6 | 404 | 11/29/15 |
| 6 | ?escaped_fragment=blog/c19s6/Tag/yoga | 404 | 11/30/15 |
| 7 | ?escaped_fragment=Inhale-exhale-and-once-again/csso/2 | 404 | 11/27/15 |
| 8 | ?escaped_fragment=classes/covl | 404 | 10/29/15 |
| 9 | m/?escaped_fragment= | 404 | 10/26/15 |
| 10 | ?escaped_fragment=blog/c19s6/Page/1 | 404 | 11/30/15 |

Technical SEO | ReSEOlve
-
50,000 pages or a page with parameters
I have a site with about 12k pages on a topic... each of these pages could use another several pages to go into deeper detail about the topic. So, for SEO purposes, would it be better to have something like 50,000 new pages, one for each subtopic, or one page that I would pass parameters to, with the page built on the fly in code-behind? The drawback to the one page with parameters is that the URLs wouldn't be static, though the effort to implement would be minimal. I am also not sure how Google would index a single page with parameters. The drawback to the 50k-page model is the dev effort, and possibly committing a faux pas by unleashing so many links to my internal pages. I might also have to mix .aspx with .html because my project can't be that large. Has anyone here ever had this sort of choice to make? Is there a third way I am not considering?
Technical SEO | Banknotes
-
Google Plus Places Error
We have a large number of clients, and when we update their Google Plus Places listings, the ad is still presented as active; however, it is not live on Google, and when you click to view the listing we get the message 'We currently do not support this location'. I have researched this and found many people are having this issue, but no solutions as of yet. Can anyone shed some light on this? Some of our clients are not thrilled at the moment. Thanks, Jon
Technical SEO | Jon_bangonline
-
Duplicate Content Errors
Ok, old fat-client developer, new to SEO, so I apologize if this is obvious. I have 4 errors in one of my campaigns: two are duplicate content and two are duplicate title. Here is the duplicate title error:

Rare Currency And Old Paper Money Values and Information. http://www.antiquebanknotes.com/
Rare Currency And Old Paper Money Values and Information. http://www.antiquebanknotes.com/Default.aspx

So, my question is: what do I need to do to make this right? They are the same page. In my page load for Default.aspx I have this: this.Title = "Rare Currency And Old Paper Money Values and Information."; and it occurs only once.
Technical SEO | Banknotes
-
404-like content
A site that I look after is returning lots of soft 404 responses for pages that are not 404s at all but unique content pages. The following page is an example: http://www.professionalindemnitynow.com/medical-malpractice-insurance-clinics This page returns a 200 response code and has unique content, but is not getting indexed. Any ideas? To add further information that may well impact your answer, let me explain how this classic ASP website performs the SEO-friendly URL mapping: all pages within the custom CMS have a unique ID, which is referenced with an ?intID=xx parameter. The custom 404.asp file receives a request, looks up the ID to find matching content in the CMS, and then Server.Transfers the visitor to the correct page. Like I said, the response codes are set up correctly, as far as Firebug can tell me. Any thoughts would be most appreciated.
Technical SEO | eseyo2