Webmaster Tool Errors
-
Do Google Webmaster Tools errors affect a site's search engine rankings?
-
I think my connection timed out and this was posted twice, oops.
This is my original message:
Do you mean crawl errors in Google Webmaster Tools? The most serious of these are the 404 "not found" errors. If you have a lot of them, it means some of your content is not accessible to Googlebot, and that will affect rankings: if Google can't find your content, it can't index it. A large number of "not found" pages also leads to a poor user experience, frustration and a higher bounce rate, which in turn impacts your website's rankings. Remember that Googlebot only has a limited amount of time to crawl and index your site; if much of that time is spent crawling and reporting errors, fewer of your pages will be indexed, and therefore fewer pages will rank for terms in Google, once again hurting your rankings. Finally, if links point to pages that now return 404 and those pages haven't been 301 redirected to tell Google the new permanent location of the content, you are potentially losing link juice.
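Where you know a missing page's content now lives elsewhere, a minimal .htaccess sketch for preserving that link juice (assuming an Apache server with mod_alias; the paths below are purely hypothetical) looks roughly like this:

    # Permanently redirect a removed page to its replacement (example paths only)
    Redirect 301 /old-page.html /new-page.html
    # Or map an entire retired section to its new home
    RedirectMatch 301 ^/old-category/(.*)$ /new-category/$1

Pages that genuinely have no replacement are usually better left returning a 404 than redirected somewhere irrelevant.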
Timed out errors indicate that the pages listed took too long for Google to load, so it gave up. Once again, check the pages listed here and see if you can optimize them; since the Panda update, site speed has become a ranking factor.
Unreachable errors are the result of internal server and DNS problems. These are obviously serious, as Google is having trouble accessing parts of your site, and unless you actually intend to deny Google access to these pages, the errors are preventing the listed pages from being indexed.
Good crawlability is an essential factor in both user experience and search engine rankings, so take advantage of this excellent tool to improve yours where possible.
Related Questions
-
Missing xml tag error
Our XML sitemap is divided into many smaller XML sitemaps so we have fewer products per sitemap, making it easier to identify errors. A couple of weeks ago we changed our XML sitemap by reordering some of the products. However, this has left some old XML sitemaps without any data, and they no longer appear in our XML sitemap. But Google is still requesting these sitemaps since they once existed, and it reports errors because it can't locate them. Should we 404 those XML sitemaps, or is there a better way to handle this?
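As an aside, one way to signal this unambiguously (sketched here as an assumption, with invented filenames, for an Apache server) is to remove the retired sitemaps from Webmaster Tools and let the old files return a clear "gone" status:

    # Retired sitemap files that no longer exist; tell crawlers they are permanently gone
    Redirect gone /sitemap-products-07.xml
    Redirect gone /sitemap-products-08.xml

Google will typically stop reporting errors for a sitemap once it consistently returns 404 or 410 and is no longer listed in the sitemap index or Webmaster Tools.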
Technical SEO | | ang0 -
404 Error Pages being picked up as duplicate content
Hi, I recently noticed an increase in duplicate content, but all of the pages are 404 error pages. For instance, Moz Site Crawl says this page: https://www.allconnect.com/sc-internet/internet.html has 43 duplicates, and all the duplicates are also 404 pages (https://www.allconnect.com/Coxstatic.html, for instance, is a duplicate of that page). Looking for insight on how to fix this issue: do I add a rel=canonical tag to these 60 error pages that points to the original error page? Thanks!
Technical SEO | | kfallconnect0 -
Soft 404 errors
Hello everyone, I recently removed some pages and made a custom 404 page by putting "ErrorDocument 404 http://www.site.com/404.htm" in the .htaccess file, but WMT now reports soft 404 errors. How do I do this properly? Thanks
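For context: when ErrorDocument is given a full http:// URL, Apache redirects the visitor to that page, which is then served with a 200 status, and a "page not found" that returns 200 is exactly what Webmaster Tools flags as a soft 404. A minimal sketch of the usual fix (assuming 404.htm sits in the web root) is to use a local path instead, so the custom page is served with a real 404 status:

    # Local path (no scheme or host): Apache serves the custom page while keeping the 404 status
    ErrorDocument 404 /404.htm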
Technical SEO | | jwdl0 -
600+ 404 Errors: Best Practice for Redirects?
Hi all, I've just checked my GWMT profile for one of my clients' sites and found that there are currently over 600 404 error notifications! This is not that surprising, given that we very recently redesigned and launched their new corporate site, which previously had a ton of "junk" legacy pages. I was wondering whether it would be efficient, SEO-wise, to simply apply a 301 redirect from each 404 page to our root to solve this issue? If not, what would be a good solution? Thanks in advance for all your great advice!
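A rough sketch of the usual approach (all paths below are invented): rather than one blanket redirect to the root, which Google often treats as a soft 404, map each legacy URL or URL pattern to its closest new equivalent in .htaccess, and let genuine junk pages keep returning 404:

    # Map retired legacy URLs to their nearest new equivalents (hypothetical paths)
    Redirect 301 /old-services.html /services/
    RedirectMatch 301 ^/legacy-news/(.*)$ /blog/
    # Anything with no sensible equivalent is left to return 404 rather than redirected to the root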
Technical SEO | | G2W1 -
403 error
Hey guys, I know that a 403 is not a terrible thing, but is it worthwhile fixing? If so, what is the best way to approach it? Cheers
Technical SEO | | Adamshowbiz0 -
Should we redirect 404 errors seen in Webmaster Tools with ... (dot, dot, dot)?
Lately I have seen lots of 404 errors showing in Webmaster Tools that are not really links; many of them come from spammy pages (I did not put them there). One of the most common types is links that end in ... (dot, dot, dot). The appearance of the link is being reported from pages like this: http://www.the-pick.com/00_fahrenheit,2.html. For example, a link like this would show up in Webmaster Tools as a 404 error: http://www.ehow.com/how_2352088_easily-... Are these worth redirecting? So far I have redirected some of them and found that it was not helpful and possibly harmful. Anyone else had the same experience? I am also getting lots of partial URLs showing up from pages that reference my site, where the URL is cut off and the link is not active. Does Google really count these as links? Is redirecting a link from a spammy page acknowledging acceptance, and could it count against you?
Technical SEO | | KentH0 -
Google Crawler Error / restricting crawling
Hi. On a Magento instance we manage there is an advanced search. As part of the ongoing enhancement of the instance, we altered the advanced search options so there are fewer, more relevant options. The issue is that Google has crawled and catalogued the advanced search with the now-removed options in the query string, and it keeps crawling these out-of-date searches. The stale searches now return a 500 error, and Google is attempting to crawl them twice a day. I have implemented the following to stop this:

1. Requested the URL be removed via Webmaster Tools, selecting the directory option using the URI http://www.domian.com/catalogsearch/advanced/result/
2. Added Disallow rules to robots.txt:
   Disallow: /catalogsearch/advanced/result/*
   Disallow: /catalogsearch/advanced/result/
3. Added rel="nofollow" to the links on the site pointing to the advanced search.

Below is a list of the links it is crawling or attempting to crawl, 12 links crawled twice a day, each resulting in a 500 status. Can anything else be done?

http://www.domain.com/catalogsearch/advanced/result/?bust_line=94&category=55&color_layered=128&csize[0]=0&fabric=92&inventry_status=97&length=0&price=5%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=115&category=55&color_layered=130&csize[0]=0&fabric=0&inventry_status=97&length=116&price=3%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=94&category=55&color_layered=126&csize[0]=0&fabric=92&inventry_status=97&length=0&price=5%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=137&csize[0]=0&fabric=93&inventry_status=96&length=0&price=8%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=142&csize[0]=0&fabric=93&inventry_status=96&length=0&price=4%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=137&csize[0]=0&fabric=93&inventry_status=96&length=0&price=5%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=142&csize[0]=0&fabric=93&inventry_status=96&length=0&price=5%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=135&csize[0]=0&fabric=93&inventry_status=96&length=0&price=5%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=128&csize[0]=0&fabric=93&inventry_status=96&length=0&price=5%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=127&csize[0]=0&fabric=93&inventry_status=96&length=0&price=4%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=127&csize[0]=0&fabric=93&inventry_status=96&length=0&price=3%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=128&csize[0]=0&fabric=93&inventry_status=96&length=0&price=10%2C10
http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=122&csize[0]=0&fabric=93&inventry_status=96&length=0&price=8%2C10
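One further option, sketched purely as an assumption (the matched parameter is only an illustration, and this only helps if the path is not also blocked in robots.txt, since Google has to be able to crawl a URL to see its status): answer the stale advanced-search URLs with 410 Gone via mod_rewrite instead of letting them hit the application and fail with a 500:

    # Hypothetical .htaccess sketch: return 410 Gone for stale advanced-search URLs
    RewriteEngine On
    # Match query strings that still carry a removed search option (illustrative parameter)
    RewriteCond %{QUERY_STRING} (^|&)inventry_status= [NC]
    RewriteRule ^catalogsearch/advanced/result/ - [G,L]

A 410 also stops crawl budget being wasted on server errors, which ties back to the crawl-budget point in the answer above.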
Technical SEO | | Flipmedia1120 -
Broken Inner Links - Tool Recommendations?
Do you have any recommendations for tools that scan an entire website and report broken inner links? I run several UGC-centered websites, and broken inner (and external) links are an issue. Since these websites are several hundred thousand pages large, I am not really all that excited about running software on my desktop (Xenu Link Sleuth, for example). Any online solutions you could recommend would be great!
Technical SEO | | uderic0