1,300,000 404s
-
Just moved a WordPress site over to a new host and reskinned it. I found out after the fact that the site had been hacked - the database itself is clean.
I did notice at first that a lot of 404s were being generated, so I set up a script to capture the URLs and return a 410 Gone response (a sketch of that kind of capture script is at the end of this post). The plan was then to submit them to have them removed from the index, thinking there was a manageable number.
But when I looked at Google Webmaster Tools, there were over 1,300,000 404 errors - see attachment. My puny attempt to solve this problem clearly needs more of an industrial-size solution.
My question is: what would be the best way to deal with this? Not all of the pages are indexed in Google - it reports 637 indexed, but you can only see about 150 of them. Bing is another story, reporting over 2,700 pages indexed while only about 200 are visible.
How will this affect future rankings? The pages do not rank well now, which I found was due to very slow page load speed and, of course, the hacks.
The link profile, as seen in Google, is OK, and there are no messages in Google Webmaster Tools.
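For reference, a minimal sketch of the kind of 404-capture step described above. This is an assumption about the approach rather than the poster's actual script, and the log path and format are hypothetical - it assumes a standard combined-format access log and extracts the distinct 404 URLs so they can be triaged in bulk:

```python
import re
from collections import Counter

# Hypothetical log location; adjust for the host in question.
# Assumes the common Apache/nginx combined log format.
LOG_PATH = "/var/log/apache2/access.log"

# Capture the request path and the HTTP status code from each line.
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and match.group(2) == "404":
            counts[match.group(1)] += 1

# Distinct 404 URLs, most-requested first, for deciding which should
# be redirected, served as 410 Gone, or simply left to 404.
with open("404-urls.txt", "w", encoding="utf-8") as out:
    for path, hits in counts.most_common():
        out.write(f"{path}\t{hits}\n")
```

With 1.3 million errors, sorting by hit count like this at least shows whether a handful of URL patterns account for most of the volume.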
-
Agree with that. One of our sites has 10 million 404 errors, as we deal with a lot of changing content across tens of millions of pages. It doesn't look like the increase in 404 errors caused any trouble.
-
According to Google's John Mueller:
"404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. It doesn’t matter if there are 100 or 10 million, they won’t harm your site’s ranking"
Related Questions
-
Disavow file with more than 100,000 lines
I received a huge amount of spammy links (most of them with a spam score of 100). Currently my disavow file is around 85,000 lines, but I have at least 100,000 more domains that I should add. All of my entries are domains - I don't have any individual backlink URLs in the file. My problem is that Google doesn't accept a disavow file larger than 2MB and shows this message: "File too big: Maximum file size is 100,000 lines and 2MB". What should I do now?
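Not an official workaround, but one thing that often shrinks a file like this before the limit becomes an issue is deduplication: normalising everything to domain: entries and collapsing www. variants removes near-duplicates. A rough sketch, with hypothetical file names (requires Python 3.9+ for removeprefix):

```python
# Merge an existing disavow file with a new list of spam domains,
# normalise everything to "domain:" entries, and drop duplicates.
entries = set()
for path in ("disavow-current.txt", "new-spam-domains.txt"):
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blank lines and comments
            host = line.removeprefix("domain:").strip().lower()
            host = host.removeprefix("www.")  # collapse www. variants
            entries.add(f"domain:{host}")

with open("disavow-merged.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(entries)) + "\n")

print(f"{len(entries)} unique domain entries")
```

If the deduplicated list still exceeds the limit, the remaining move is prioritising the worst domains, since Google keeps only one disavow file per property and splitting into multiple files is not an option.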
Technical SEO | sforoughi0
-
Home has DA 50 but Subpages have Page Authority of 1
Hello, we already asked this before but there was no answer, so we would be happy for any information. How can it be that our subpages all have a PA of 1 when the home page has a DA of 50? Technical specifics: the mega menu opens on click only; category pages don't exist (home/i-do-not-exist-as-page-category/PA-1-subpage); all subpages have a high number of links to resources (over 200); and the subpages have been crawled and online for some time. What would be the most obvious cause for the low PA? Would the external link profile be the main reason? Thanks in advance - I would be happy to answer your questions.
Technical SEO | brainfruit0
-
60,000 404 errors
Do 404 errors on a large scale really matter? I'm aware that I now have over 60,000 of them and was wondering whether the community thinks I should address them by putting 301 redirects in place. Thanks.
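If redirects do turn out to be worth it at that scale, they are normally generated from a mapping rather than written one by one. A minimal sketch, assuming a hypothetical CSV of old-path,new-path pairs and an Apache setup:

```python
import csv

# Hypothetical mapping file: one "old_path,new_path" pair per row.
with open("redirect-map.csv", newline="", encoding="utf-8") as src, \
        open("redirects.conf", "w", encoding="utf-8") as out:
    for old_path, new_path in csv.reader(src):
        # Apache mod_alias syntax; nginx would use
        # "rewrite ^/old$ /new permanent;" instead.
        out.write(f"Redirect 301 {old_path} {new_path}\n")
```

The generated file can then be Include-d from the vhost config, which keeps 60,000 rules out of .htaccess, where they would be re-parsed on every request.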
Technical SEO | the-gate-films0
-
50,000 pages or a page with parameters
I have a site with about 12k pages on a topic... each of these pages could use another several pages to go into deeper detail. So I am wondering: for SEO purposes, would it be better to have something like 50,000 new pages, one for each subtopic, or one page that I pass parameters to, with the content built on the fly in the code-behind? The drawback to the one page with parameters is that the URL would not be static, although the effort to implement would be minimal; I am also not sure how Google would index a single page with parameters. The drawback to the 50k-pages model is the dev effort, and possibly committing some faux pas by unleashing so many links to my internal pages. I might also have to mix ASPX with HTML, because my project can't be that large. Has anyone here ever had this sort of choice to make? Is there a third way I am not considering?
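On the third-way question: a common middle ground is to keep the single dynamic handler but expose each subtopic under its own static-looking path, so every subtopic still gets a distinct, parameter-free URL to index while the dev effort stays close to the one-page option. A minimal sketch of the routing idea (names and data are hypothetical; ASP.NET offers the same pattern via URL routing/rewriting):

```python
from wsgiref.simple_server import make_server

# Hypothetical subtopic content, keyed by URL slug.
SUBTOPICS = {
    "subtopic-one": "Deeper-detail content for subtopic one ...",
    "subtopic-two": "Deeper-detail content for subtopic two ...",
}

def app(environ, start_response):
    # One handler serves every subtopic, but each gets its own
    # crawlable path (/topic/<slug>) instead of a ?id= query string.
    path = environ.get("PATH_INFO", "")
    if path.startswith("/topic/"):
        body = SUBTOPICS.get(path[len("/topic/"):])
        if body is not None:
            start_response("200 OK",
                           [("Content-Type", "text/html; charset=utf-8")])
            return [body.encode("utf-8")]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```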
Technical SEO | Banknotes0
-
Beating big brands for rankings on Google page 1 post Panda & Penguin
Hi all. Having followed lots of SEOmoz guidelines we have read here, along with standard SEO ideas, we seem to no longer be able to rank for our core keywords - and certainly not ahead of the big brands. We're a small eCommerce company and historically ranked in Google positions 1-4 for many of our keywords (a year or two ago), but we are nowhere near that any more.

We always write unique content for our products, usually around 300-400 words per product, and we include our keywords in the title, meta description, and H1 tags. We include buyers' guides and setup articles on the site, and generally have a reasonable amount of good-quality, always uniquely written content. Recently we have concentrated on getting page load speed above average; Google Webmaster Tools page speed gives us around 80-90 out of 100. We build links and always have - most recently weighted towards content-for-links to gain purely incoming links (although in the early days, from 2005, we did swap links with other webmasters and write and publish on article sites, etc.). Product category pages have an intro piece of text that includes the key phrases for that page, placed as close to the body tag as possible.

From what I understand, a Panda or Penguin hit shows up as an overnight drop, but we have not seen this - more a gradual decline over the last year or two (although there was a bit of a downward blip on Panda update 20). Now we're lucky to be on page 2 for what were our main keywords and phrases, such as "portable DVD players" or "portable DVD player". Ahead of us, every page 1 position is a big national brand. They don't have great info for these keywords from what we can see, and certainly don't give as much info as we do. For the phrase "portable DVD player", our portable DVD accessories page ranks better than our actual portable DVD player category page, which we also can't understand. This is our portable DVD category page: http://www.3wisemonkeys.co.uk/portable-dvd-players-car

Currently we're starting to produce two-minute product demo videos for as many of our product detail pages as we can, and we plan to host them on something such as Vimeo so the content stays unique to our site (rather than YouTube), giving us a different format of unique content on many product detail pages to improve rankings (and, ideally, conversion rates at the same time).

So I am hoping someone out there can point us in the right direction and shed some light on our declining positions. Are we doing, or have we done, something wrong - or is it that in these post-Panda/Penguin days it is extremely difficult for a small business to beat the big brands, because Google believes they are what everyone wants to see when shopping? Thanks for any comments and/or help.
Technical SEO | jasef0
-
Hit hard by EMD update, used to be #1 now not in top 50, what can I do?
We have what I think is a pretty good site: unique articles, a few widgets, lots of reviews, and decent enough bounce rate and user time (60% and 2:15), built on Drupal. Previous updates haven't touched us, and an almost identical duplicate of the site (the same site, completely different content) targeting a different but related EMD is unaffected, which provides a control. I have seen some discussion suggesting this has to do with link profiles. We did pay some backlinkers to link to us - much more so on the site that has dropped, and quite a few for a partial-match keyword - so I suspect this is a lot of the issue. If we try to delete these backlinks, will it make the situation better or worse? I have also noticed some duplicate content warnings in SEOmoz that weren't there previously. Any ideas?
Technical SEO | btrr690
-
Website of only circa 20 pages drawing 1,000s of errors?
Hi, one of the websites I run is getting thousands of errors for duplicate titles/content even though there are only approximately 20 pages. SEOmoz seems to be finding pages that have duplicated themselves: for example, a blog page (/blog) is appearing as /blog/blog, then /blog/blog/blog, and so on. Can anyone shed some light on why this is occurring? Thanks.
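The /blog/blog/blog pattern is the classic symptom of a relative link: if a template links to blog/ (no leading slash) rather than /blog/, the href resolves against the current page's directory and compounds on every hop the crawler takes. A quick illustration of the resolution behaviour:

```python
from urllib.parse import urljoin

# A link written as href="blog/" resolves relative to the current
# page's directory, so each crawl step appends the segment again.
page = "https://example.com/blog/"
for _ in range(3):
    page = urljoin(page, "blog/")
    print(page)
# https://example.com/blog/blog/
# https://example.com/blog/blog/blog/
# https://example.com/blog/blog/blog/blog/
```

Switching the template to root-relative or absolute hrefs (with a canonical tag as a backstop) usually stops the recursion.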
Technical SEO | TheCarnage0
-
Drupal 1.5 Issue: Taxonomy
Hi there, I have a domain built in Drupal 1.5. We managed to redirect all nodes to their actual SEF URLs. The one issue we have now is redirecting the taxonomy URLs to the SEF URLs. The obvious answer is to add manual 301 redirects to the .htaccess file, but this will be a long process, as there are over 500 URLs affected. Is there a better way to do this automatically within Drupal? Your thoughts and ideas are welcome.
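Not Drupal-specific, but one alternative to 500 hand-written redirect lines is Apache's RewriteMap: keep the old-path/new-path pairs in a plain text map file and cover them all with a single rule (note that RewriteMap must live in the server or vhost config, not .htaccess). A sketch that builds the map from a hypothetical CSV export of taxonomy paths and their SEF aliases:

```python
import csv

# Hypothetical input rows: "taxonomy/term/12,category/widgets",
# e.g. exported from Drupal's url_alias table.
with open("taxonomy-aliases.csv", newline="", encoding="utf-8") as src, \
        open("redirects.map", "w", encoding="utf-8") as out:
    for old_path, alias in csv.reader(src):
        out.write(f"{old_path} {alias}\n")

# A single rule in the vhost config then covers every entry:
#   RewriteEngine On
#   RewriteMap taxmap txt:/path/to/redirects.map
#   RewriteCond ${taxmap:$1|NONE} !NONE
#   RewriteRule ^/?(taxonomy/term/\d+)$ /${taxmap:$1} [R=301,L]
```

Within Drupal itself, contributed modules in the path_redirect family have historically automated the same thing, which may be closer to what you are after.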
Technical SEO | stefanok0