404 not found page appears as a 200 success in Fetch as Google. What can we do to correct this?
-
We have received messages in Google webmaster tools that there is an increase in soft 404 errors.
When we check the URLs they send to the 404 not found page:
For example, http://www.geographics.com/images/01904_S.jpg
redirects to http://www.geographics.com/404.shtml.
When we used Fetch as Google, here is what we got:
#1 Server Response: http://www.geographics.com/404.shtml
HTTP/1.1 200 OK Date: Thu, 26 Sep 2013 14:26:59 GMT
What is wrong and what shall we do? The soft 404 errors are mainly for images that no longer exist on the server.
Thanks!
-
Unfortunately, it doesn't work that way, Alex. Unless the server returns an actual 404 response code in the header, search engines will not treat the page as an error or remove it from their index. Even though the content of the page may look like an error page, the HTTP response code in the header is the only thing that determines whether the engines treat it as a 404 error.
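To illustrate the distinction, here's a small self-contained sketch (Python standard library only, with hypothetical paths): a local test server returns the exact same "error page" body for every request, but only the status code in the header separates a real 404 from a soft 404.

```python
import http.server
import threading
import urllib.error
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # The body looks like an error page in both cases; only the
        # status code tells a crawler whether the page really exists.
        body = b"<h1>Page not found</h1>"
        status = 404 if self.path == "/missing" else 200
        self.send_response(status)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

# Start the test server on a free local port.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def status_of(path):
    """Return the HTTP status code the server sends for `path`."""
    try:
        with urllib.request.urlopen(f"http://127.0.0.1:{port}{path}") as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

soft = status_of("/soft-404")  # 200: looks like an error page, indexed anyway
hard = status_of("/missing")   # 404: identical body, treated as gone
print(soft, hard)
server.shutdown()
```

The two responses have identical bodies, yet a crawler would index the first and drop the second.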
Paul
-
Your site is not handling "page not found" errors correctly, Madlena. Currently, when a user requests a URL that doesn't exist, your site sends a 302 (temporary) redirect to an error page, but that error page responds with a 200 status in its header, which means "page was found correctly".
You need to have your site's developer change that behaviour so that when a non-existent URL is requested, the server immediately returns a page that has a 404 not found header response (and no 302 redirect). This is necessary even if the page itself looks like a 404 error page.
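As a sketch of what that change might look like, assuming the site runs on Apache (the .shtml extension suggests it does, but your developer should confirm and adapt this):

```apache
# Serve the custom error page *as* the 404 response, not via a redirect.
# A local path (no scheme or hostname) makes Apache return the page body
# with a 404 status; a full URL here would trigger a 302 redirect instead,
# which is exactly the behaviour causing the soft 404s.
ErrorDocument 404 /404.shtml
```

With this in place, a request for a missing image returns the error page content directly under a 404 status, with no redirect in between.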
ONLY a 404 response code IN THE HEADER tells Google that the page doesn't actually exist and that the URL should be dropped from the index. As Google themselves state: it's not enough to just create a page that displays a 404 message. You also need to return the correct 404 or 410 HTTP response code.
Once this is configured properly, it's easier to keep track of these 404 errors, and you can fix them by manually 301-redirecting each one to a suitable replacement page if one exists. (That may not apply to missing images, but it can be very useful in other cases.)
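As a sketch of that manual fix, again assuming Apache and using purely hypothetical paths, each confirmed 404 that has a good replacement can be mapped with a permanent redirect:

```apache
# Map a confirmed-dead URL to its best replacement (example paths only).
# Redirect accepts a status keyword or code; 301 tells search engines
# the move is permanent, so link equity transfers to the new URL.
Redirect 301 /old-page.html /new-page.html
```

URLs with no sensible replacement should simply be left to return 404 (or 410) so they drop out of the index.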
Hope that helps? If anything's not clear, just holler!
Paul
-
There's nothing to worry about. All Google is saying is that the image no longer exists for that specific link, "http://www.geographics.com/images/01904_S.jpg". When Google tries to fetch the broken URL, it gets redirected to the 404 page, and the fetch of that page succeeds. In other words, Google detects a 404 page and that page itself fetches fine; the actual URL it was trying (the broken image), however, is not there. Google will automatically remove the 404s over time, and there should be no negative impact from this.