1,300,000 404s
-
Just moved a WordPress site over to a new host and re-skinned it. I found out after the fact that the site had been hacked, though the database is clean.
I did notice early on that a lot of 404s were being generated, so I set up a script to capture those requests and return a 410 Gone response. The plan was then to submit the URLs to have them removed from the index, thinking there was a manageable number of them.
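For context, a minimal sketch of that kind of rule at the server level, assuming an Apache host (not confirmed in the thread) and using a hypothetical "spam-path" placeholder for whatever URL pattern the hack generated:
# Sketch only: mark hack-generated URLs as permanently gone (410)
# "spam-path" is a placeholder - substitute the real pattern of the hacked URLs
RewriteEngine On
RewriteRule ^spam-path/ - [G,L]
A 410 is generally treated as a more definitive "gone for good" signal than a plain 404.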
But when I looked at Google Webmaster Tools, there were over 1,300,000 404 errors - see attachment. My puny attempt at solving this problem seems to need more of an industrial-size solution.
My question is: what would be the best way to deal with this? Not all of the pages are indexed in Google - only 637 are reported as indexed, but you can only see about 150 in the index. Bing is another story, reporting over 2,700 pages indexed but only about 200 visible.
How is this affecting any future rankings? The pages do not rank well as it is, which I found out was due to very slow page load speed and, of course, the hacks.
The link profile in Google looks OK, and there are no messages in Google Webmaster Tools.
-
Agree with that - one of our sites has 10 million 404 errors, as we deal with a lot of changing content across tens of millions of pages. It doesn't look like the increase in 404 errors caused any trouble.
-
According to Google's John Mueller:
"404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. It doesn’t matter if there are 100 or 10 million, they won’t harm your site’s ranking"
Related Questions
-
Google displaying "Items 1-9" before the description in the Search Results
We see our pages coming up in Google with the category page/product numbers in front of our descriptions. For example: "Items 1 - 24 of 86" (and then the description follows). Our website is Magento-based. Is there a fix for this that anyone knows of? Is there a method of stopping Google from adding this onto the front of our meta description?
Technical SEO | DutchG
-
Could using our homepage's Google +1s site-wide harm our website?
Hello Moz! We currently have the number of Google +1's for our homepage displaying on all pages of our website. Could this be viewed as black hat/manipulative by Google, and result in harming our website? Thanks in advance!
Technical SEO | TheDude
-
301 Redirect for 3 Domains into 1 New Domain
So I wanted a quick sanity check on the htaccess syntax for migrating 3 domains into 1 new domain. For example, we're migrating 3 sites - abc.com, def.com and ghi.com - all into 1 new site on ghi.com. Here's the htaccess we're placing on the root of ghi.com:
redirect 301 http://www.abc.com/wines.html http://www.ghi.com/wines
redirect 301 http://www.def.com/trade.html http://www.ghi.com/trade
redirect 301 http://www.ghi.com/winery-tours.html http://www.ghi.com/visit/taste
On the DNS side of things, we're parking abc.com and def.com on the ghi.com server. I'm not seeing examples of htaccess files for this scenario, and none that use any domain info on the "from" side of the redirect 301 syntax. Any suggestions before we pull the trigger? Thanks!
Technical SEO | cmaseattle
-
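For reference, the plain Redirect directive matches only URL paths, so it can't tell which of the parked domains a request arrived on. A sketch of a domain-aware alternative with mod_rewrite, assuming Apache and reusing the example paths from the question (the exact old-to-new mappings are only illustrative):
RewriteEngine On
# Requests that arrive for abc.com: send the old wines page to its new home on ghi.com
RewriteCond %{HTTP_HOST} ^(www\.)?abc\.com$ [NC]
RewriteRule ^wines\.html$ http://www.ghi.com/wines [R=301,L]
# Requests that arrive for def.com
RewriteCond %{HTTP_HOST} ^(www\.)?def\.com$ [NC]
RewriteRule ^trade\.html$ http://www.ghi.com/trade [R=301,L]
# An old ghi.com URL that moved within the same domain
RewriteRule ^winery-tours\.html$ http://www.ghi.com/visit/taste [R=301,L]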
Webmaster Tools finding phantom 404s?
We recently (three months ago now!) switched a site over from .co.uk to .com, and all old URLs are redirecting to the new site. However, Google Webmaster Tools is flagging up hundreds of 404s from the old site and yet doesn't report where the links were found, i.e. in the 'Linked From' tab there is no data, and the old links are not in the sitemap. SEOmoz crawls do not report any 404s. Any ideas?
Technical SEO | Switch_Digital
-
I have 15,000 pages. How do I have the Google bot crawl all the pages?
I have 15,000 pages. How do I get Googlebot to crawl all of them? My site is 7 years old, but only about 3,500 pages are being crawled.
Technical SEO | Ishimoto
-
What is the most likely reason we aren't ranking #1 for our keyword?
So we are targeting a keyword and we are ranking 2nd for it. Another company is ranking number 1. What is the best element to target for us to improve into position number one?
Page Authority: them 41, us 40
mozRank: them 5.52, us 3.38
mozTrust: them 5.86, us 5.58
mT/mR: them 1.1, us 1.4
Total links: them 6571, us 68
Internal links: them 1138, us 1
External links: them 5431, us 63
Followed links: them 6569, us 64
Nofollowed links: them 2, us 4
Linking root domains: them 25, us 41
Broad keyword usage in page title: them YES, us YES
KW in domain: them no, us partial
Exact anchor text links: them 161, us 21
% of links with exact anchor text: them 2%, us 30%
Linking root domains with exact anchor text: them 2, us 11
Domain Authority: them 41, us 40
Domain mozRank: them 3.7, us 4.5
Domain mozTrust: them 3.8, us 4.5
External links to domain: them 22574, us 217
Linking root domains: them 50, us 48
Linking C-blocks: them 46, us 42
Tweets: them 1, us 12
FB shares: them 6, us 26
Technical SEO | Benj25
-
How to Submit XML Site Map with more than 300 Subdomains?
Hi,
I am creating sitemaps for a site which has more than 500 subdomains. Page counts vary from 20 to 500 across the subdomains, and more will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain and reference it from a separate robots.txt file, e.g. http://windows7.iyogi.com/robots.txt, with an XML sitemap for that subdomain at http://windows7.iyogi.com/sitemap.xml.gz. Currently my website has only 1 robots.txt file for the main domain and subdomains. Please tell me: shall I create a separate robots.txt and XML sitemap file for each subdomain, or just 1 file? Creating a separate XML sitemap for each subdomain seems unfeasible, as we have to verify each one in GWT separately. Is there any automatic way, and do I have to ping separately if I add new pages to a subdomain? Please advise me.
Technical SEO | vaibhav45
-
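As a point of reference, the sitemaps protocol does let each subdomain point to its own sitemap from its own robots.txt, since robots.txt is always served per host. A minimal sketch, using a hypothetical sub1.example.com placeholder because the poster's domain isn't named:
# robots.txt served at http://sub1.example.com/robots.txt (hypothetical subdomain)
User-agent: *
Disallow:
Sitemap: http://sub1.example.com/sitemap.xml.gz
Each subdomain serves its own robots.txt, so a file like this would need to exist on every subdomain.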
Still no pagerank after 1 year
I am working on a site, www.progazon.ca. It's been up for almost a year now (11 months) and still has no PageRank. Is there something I can do? Does it change anything? Thanks.
Technical SEO | martinLachapelle