Crawl Errors In Webmaster Tools
-
Hi Guys,
I've searched the web for an answer on the importance of crawl errors in Webmaster Tools but keep coming up with different answers.
I have been working on a client's site for the last two months (and have just completed one month of link building); however, it seems I have inherited issues I wasn't aware of from the previous guy who did the site.
The site is currently on page 6 for the keyphrase 'boiler spares', with a keyword-rich domain and a good on-page plan. Over the last couple of weeks it has been as high as page 4, only to be pushed back to page 8, and it has now settled at page 6.
The only issue I can seem to find with the site in Webmaster Tools is crawl errors. Here are the stats:
In sitemaps: 123
Not found: 2,079
Restricted by robots.txt: 1
Unreachable: 2
I have read that the crawl errors Google reports for ecommerce sites can often be false alarms; however, these Not Found errors are being linked to from pages within the site.
How have others solved the issue of crawl errors on ecommerce sites? Could this be the reason for the bouncing around in the rankings, or is it just a competitive niche and I need to be patient?
Kind Regards
Neil
-
That's a bit of a pain.
If anything, tell them to set up a custom, informative 404 error page that at least directs visitors somewhere else on the site instead of letting them bail.
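On an Apache server that's usually a one-liner in .htaccess. Just a sketch, and the path to the error page is a placeholder:

# Serve a custom 404 page for any URL that can't be found
# (assumes a page has been created at /404.html; adjust the path to suit)
ErrorDocument 404 /404.html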
-
Hi Kieran,
Thank you so much for your answer. The issue is I don't have access to the site admin and can only suggest changes.
I'll suggest what you have put above, push ahead with the link building, and see what happens. I've told the client it's a competitive niche, especially this time of year (my boiler always seems to pack in just before winter). The site does seem to have settled around page 6, so you may be right.
I just didn't want to rule out the possibility of a penalty from Google.
Kind Regards
Neil
-
Not all errors are bad, but if, for instance, you are getting 404 errors and the pages genuinely don't exist anymore, then you should set up 301 redirects in your .htaccess to point Google to the right pages. Doing this will sort out a lot of the errors. But if you are getting 404s and the pages are actually there, that is a bigger problem.
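Something like this in .htaccess is usually all it takes for each dead URL (the paths and domain below are just examples, not your client's actual pages):

# Permanently redirect a product page that no longer exists
# to its closest replacement (example URLs only)
Redirect 301 /old-boiler-part.html http://www.example.com/spares/replacement-part.html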
Restricted by robots.txt is not an error; it is telling you that you have asked for something not to be crawled. Check the file to make sure it is restricting the right things.
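For reference, that one restricted URL just means it matched a Disallow rule, something along these lines (a made-up robots.txt, not your client's actual file):

# Block crawlers from basket/admin areas but allow everything else
User-agent: *
Disallow: /checkout/
Disallow: /admin/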
How often are you pushing your sitemap out to Google? If you are adding pages as actively as your posts suggest, I would think about submitting a new one or automating its creation.
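It's also worth pointing crawlers at the sitemap from robots.txt and resubmitting it in Webmaster Tools whenever it changes. Assuming the sitemap sits at the domain root (placeholder URL below):

# Tell crawlers where the XML sitemap lives
Sitemap: http://www.example.com/sitemap.xml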
If you are actively working on the content of the site, there can often be this level of bouncing in the SERPs. If you continue with the content, I wouldn't worry too much about the errors for the moment; just do the housekeeping above. There are very few completely spotless sites out there, and Google Webmaster Tools, and even the SEOmoz tools here, will always report some level of errors.
Hope this helps,
Kieran