18 404 errors on pages that are actually fine.
-
Hi,
I just used the campaign tool to look for errors on my site, and it appears that the SEOmoz crawler finds 18 404 errors on pages that are fine in my book.
I do use URL rewriting on those pages, but navigation is fine.
Some of the pages are:
http://cassplumbingtampabay.com/about-us
http://cassplumbingtampabay.com/commercial-services
http://cassplumbingtampabay.com/drain-cleaning-repair
...
Does anybody know what's going on?
-
Hi Alex,
You're welcome, I'm very happy to help. I've been there too, where a client does not want to change something that you know is for the better. It is crazy that they pay us and then ignore our advice.
If it's with GoDaddy, I can almost guarantee it's a hosting issue. They can easily have somebody doing something that eats up the rest of the resources on the server at the time, and the server mistakenly returns 404s.
For hosting non-WordPress sites I love Firehost, Datapipe, and Pair Networks; for WordPress, WP Engine, Web Synthesis, ZippyKid, Press Labs, and Pagely.
If your client ever changes his mind, which I understand will probably never happen, and you're ever looking for a good hosting service, I've had the best of luck with all of those.
Good talking to you.
Sincerely,
Thomas
-
Wow! I didn't expect so much help!
I did run the Internet Marketing Ninjas tool and it found a bunch of errors, but it appeared it didn't really know how to crawl my links.
Thank you for the Screaming Frog tool, which I did not know about and which seems to work well. Almost everything looks fine with it.
I did have some major changes on the site in the past due to a webmaster change.
Regarding the hosting, it's with GoDaddy. I am not too happy with it, but the site belongs to a customer and he loves it and refuses to change it. (I also have no control over the design or content...)
Anyway all is good.
I surely appreciate all your help. Thank you again.
-
Hi Alex,
You have nothing to worry about. I foolishly did a much more expansive search than was needed to figure out this issue: you are simply the victim of a glitch with Rogerbot.
I ran the correct tool on Internet Marketing Ninjas: your 404s are 200s.
I verified this using Screaming Frog as well as the correct Internet Marketing Ninjas tool (link below). I would suggest using that link to check any suspect URLs; it gives you a very large amount of information.
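If you ever want to spot-check a few suspect URLs yourself without a crawler, a couple of lines of Python will show you the status code the server actually sends. This is just a sketch; the `classify` helper and the example URL in the comment are illustrative, not part of any tool mentioned here.

```python
# Spot-check the status code a server really sends for a URL (a sketch).
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def classify(code):
    """Rough bucket for an HTTP status code."""
    if 200 <= code < 300:
        return "OK"
    if 300 <= code < 400:
        return "redirect"
    if code == 404:
        return "not found"
    return "other error"

def check(url):
    """Issue a HEAD request and return the status code the server sends."""
    try:
        return urlopen(Request(url, method="HEAD"), timeout=10).status
    except HTTPError as e:
        return e.code  # 4xx/5xx responses still carry their status code

# e.g. check("http://cassplumbingtampabay.com/about-us") should come back 200
# if the page really is fine, regardless of what any one crawler reports.
```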
-
hi Alex,
I believe you might be having some trouble with the bot actually being able to crawl the site properly.
Screaming Frog showed that your site had no issues when it came to your about-us link.
As you can see, it gives you a great deal of information on the website, and in addition it links to some other great tools. However, I believe it's either an issue with the hosting or something with Rogerbot got messed up, because you're in the clear on this one. I would have to manually put in the others to check them, as Screaming Frog only crawls up to 500 URLs on a free account.
| URL | http://cassplumbingtampabay.com/about-us |
| Status Code | 200 |
| Status | OK |
| Type | text/html; charset=UTF-8 |
| Size | 21551 |
| Title | About Us - Cass Plumbing - Tampa |
| Level | 1 |
| In Links | 28 |
| Out Links | 55 |

I will continue to look at your site and evaluate it with other tools.
If I can give you a tip, you might want to try Raven Tools; they give you 30 days free, no credit card needed. It incorporates a lot of excellent SEO tools and would be a good way to check your site to make sure there are no serious issues, in addition to what we have done.
Please understand I think SEOmoz is the best, but any tool can have a problem. Given the conflicting information, I would double-check with Raven Tools.
Sincerely,
Thomas
-
Hi Alex,
I was thinking the exact same thing, that it would be a glitch, until I just did a second test with some software that I know is accurate. I encourage you to type your URL into it, and to double-check with the Screaming Frog SEO Spider as well.
It appears to me that there was a big change in your website at some point?
From the information I've gathered using Internet Marketing Ninjas' free tool (they are a company recommended by SEOmoz and highly regarded for their quality tools), it shows quite a few 404s.
However, one thing that just crossed my mind is that you could be having server issues. Is your web host a quality host? You may have site structure issues as well. I don't mean to freak you out, and I'm sorry if I'm heading in that direction, but let's look at all the information and try to figure this out.
http://www.internetmarketingninjas.com/seo-tools/google-sitemap-generator/
This is another tool, used by SEOs like Distilled, and I trust it quite a bit. Here is the link; you can download it for Windows, Mac and Linux, it's free to check up to 500 pages, and it is a very valuable tool in my opinion:
http://www.screamingfrog.co.uk/seo-spider/
So you know right now: it shows these stats at the top, and I can also see all the links it's flagging with 404s and other errors.
This result is from Internet Marketing Ninjas, below:
Internal Pages: 362
External Links: 29
Internal Redirects: 8
External Redirects: 6
Internal Errors: 274
External Errors: 1

I hope this sheds some light on things. Please let me know: have you had any site work done recently?
Sincerely,
Thomas
-
Thank you for your help, it does look fine to me too; I was just wondering if somehow the URL rewriting I did wasn't good for search engines.
Here is a screenshot.
Maybe it's just a minor SEOmoz glitch.
Thank you again.
Alex
-
I could tell you much better if you would post a screenshot; would that be possible?
I have gone to your first link, and you are correct, there is no issue with that link.
There isn't even a trailing slash on it.
If you're using WordPress, do you have the force-trailing-slash option on?
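In case it's useful, "forcing the trailing slash" just means every page URL is normalized to one canonical form ending in a slash. A rough sketch of that normalization in Python (illustrative only; WordPress does this itself via its permalink settings and canonical redirects):

```python
from urllib.parse import urlsplit, urlunsplit

def force_trailing_slash(url):
    """Append a trailing slash to the path unless it looks like a file name."""
    parts = urlsplit(url)
    path = parts.path or "/"
    last = path.rsplit("/", 1)[-1]
    # Skip anything with an extension (e.g. /logo.png) so files stay untouched.
    if not path.endswith("/") and "." not in last:
        path = path + "/"
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))
```

With this in place, /about-us and /about-us/ resolve to the same canonical URL instead of looking like two different pages to a crawler.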
I will look at your source code as well.
Hope I've been of help,
Tom
Related Questions
-
Page Optimization Score is 100%
I have added a page for a particular product, sodium bisulfate, and its Page Optimization Score is totally green, 100%. What should I do now, just sit back and wait for results? http://waqaschemicals.com/site/sodium-bisulphate-bisulfate-from-turkey/ Using Moz, I don't really understand what line of action I have to follow to bring good results for the keyword "sodium bisulfate". How much time does Google take to get a 100%-optimized page onto the first page? I'm not good at SEO, but I'm learning step by step. What are the best things to do for good SEO?
Moz Pro | waqaspuri
-
Htaccess and robots.txt and 902 error
Hi, this is my first question in here, and I truly hope someone will be able to help. It's quite a detailed problem and I'd love to be able to fix it with your kind help. It concerns htaccess files, robots.txt files, and 902 errors.

In October I created a WordPress website from what was previously a non-WordPress site, which was quite dated. I had built the new site on a subdomain I created on the existing site, so that the live site could remain live whilst I worked on the subdomain. The site I built on the subdomain is now live, but I am concerned about the existence of the old htaccess and robots.txt files, and I wonder if I should just delete the old ones and leave only the new ones on the new site. I created new htaccess and robots.txt files on the new site and have left the old htaccess files there. Just to mention that all the old content files still sit on the server in a folder called 'old files', so I am assuming that these aren't affecting matters. I access the htaccess and robots.txt files by clicking on 'public_html' via FTP.

I did a Moz crawl and was astonished to see a 902 network error saying that it wasn't possible to crawl the site, but then Moz alerted me later on to say that the report was ready. I see 641 crawl errors (449 medium priority | 192 high priority | zero low priority). Please see the attached image. Each of the errors seems to have status code 200; this seems to apply mainly to the images on each of the pages, e.g. domain.com/imagename. The new website is built on the 907 Theme, which has some page sections on the home page, and parallax sections on the home page and throughout the site. To my knowledge the content and the images on the pages are not duplicated, because I have made each page as unique and original as possible. The report says 190 pages have been duplicated, so I have no clue how this can be or how to approach fixing it.

Since October, when the new site was launched, approx 50% of incoming traffic has dropped off at the home page, and that is still the case, but the site still continues to get new traffic according to Google Analytics. However, Bing, Yahoo, and Google show a low level of indexing and exposure, which may be indicative of the search engines having difficulty crawling the site. In Google Webmaster Tools, the screen reports no crawl errors. W3TC is a WordPress caching plugin which I installed just a few days ago to speed up page loads, so I am not querying anything here about W3TC unless someone spots that it might be a problem, but like I said, there have been problems with traffic dropping off when visitors arrive on the home page. The Yoast SEO plugin is being used. I have included the contents of the htaccess and robots.txt files below. The pages on the subdomain are pointing to the live domain, as was explained to me by the person who did the site migration. I'd like the site to be free of pages and files that shouldn't be there, and I feel that the site needs a clean-up, as well as knowing whether the robots.txt and htaccess files that are included in the old site should actually be there or should be deleted. OK, here goes with the information in the files. Site 1) refers to the current website, site 2) refers to the subdomain, and site 3) refers to the folder that contains all the old files from the old non-WordPress file structure.

1) htaccess on the current site:

# BEGIN W3TC Browser Cache
<IfModule mod_deflate.c>
<IfModule mod_headers.c>
Header append Vary User-Agent env=!dont-vary
</IfModule>
<IfModule mod_filter.c>
AddOutputFilterByType DEFLATE text/css text/x-component application/x-javascript application/javascript text/javascript text/x-js text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon application/json
<IfModule mod_mime.c>
# DEFLATE by extension
AddOutputFilter DEFLATE js css htm html xml
</IfModule>
</IfModule>
</IfModule>
# END W3TC Browser Cache

# BEGIN W3TC CDN
<FilesMatch "\.(ttf|ttc|otf|eot|woff|font.css)$">
<IfModule mod_headers.c>
Header set Access-Control-Allow-Origin "*"
</IfModule>
</FilesMatch>
# END W3TC CDN

# BEGIN W3TC Page Cache core
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule .* - [E=W3TC_ENC:_gzip]
RewriteCond %{HTTP_COOKIE} w3tc_preview [NC]
RewriteRule .* - [E=W3TC_PREVIEW:_preview]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{REQUEST_URI} /$
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|w3tc_logged_out|wordpress_logged_in|wptouch_switch_toggle) [NC]
RewriteCond "%{DOCUMENT_ROOT}/wp-content/cache/page_enhanced/%{HTTP_HOST}/%{REQUEST_URI}/_index%{ENV:W3TC_PREVIEW}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/wp-content/cache/page_enhanced/%{HTTP_HOST}/%{REQUEST_URI}/_index%{ENV:W3TC_PREVIEW}.html%{ENV:W3TC_ENC}" [L]
</IfModule>
# END W3TC Page Cache core

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

(I have 7 301 redirects in place pointing old page URLs to the new page URLs.)

# Force non-www:
RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.domain\.co\.uk [NC]
RewriteRule ^(.*)$ http://domain.co.uk/$1 [L,R=301]

1) robots.txt on the current site:

User-agent: *
Disallow:

Sitemap: http://domain.co.uk/sitemap_index.xml

2) htaccess in the subdomain folder:

# Switch rewrite engine off in case this was installed under HostPay.
RewriteEngine Off
SetEnv DEFAULT_PHP_VERSION 53
DirectoryIndex index.cgi index.php

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /WPnewsiteDee/
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /subdomain/index.php [L]
</IfModule>
# END WordPress

2) robots.txt in the subdomain folder:

(this robots.txt file is empty)

3) htaccess in the Old Site folder:

Deny from all

3) robots.txt in the Old Site folder:

User-agent: *
Disallow: /

I have tried to be thorough, so please excuse the length of my message. I really hope one of you great people in the Moz community can help me with a solution. I have SEO knowledge and I love SEO, but I have not come across this before and I really don't know where to start with this one. Best regards to you all, and thank you for reading this.

(attached: moz-site-crawl-report-image_zpsirfaelgm.jpg)
Moz Pro | SEOguy1
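The "Force non-www" rewrite rules quoted in the htaccess above can be hard to reason about. Here is the same canonicalization expressed in Python, a sketch of the intent only (it does not reproduce Apache's actual matching, and `domain.co.uk` is the placeholder from the question):

```python
from urllib.parse import urlsplit, urlunsplit

def strip_www(url):
    """Mirror the intent of the force-non-www rewrite: www.host -> host."""
    parts = urlsplit(url)
    host = parts.netloc
    if host.lower().startswith("www."):
        host = host[4:]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))
```

The point of the rule is the same as the point of this function: both hostname spellings collapse to a single canonical one, so search engines don't see two copies of the site.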
Site Crawl Error
In the Moz crawl errors, this message appears: MOST COMMON ISSUES: "1 Search Engine Blocked by robots.txt, Error Code 612: Error response for robots.txt". I asked the help staff, but they crawled again and nothing changed. There's only a robots.xml (not .txt) in the root of my web page. It contains:

User-agent: *
Allow: /
Allow: /sitemap.htm

Anyone, please help me? Thank you.
Moz Pro | nopsts
-
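One note on the robots.xml question above: crawlers only ever request the literal filename /robots.txt at the site root, so a file named robots.xml is simply ignored. Python's stdlib parser makes it easy to see what a crawler does with whatever rules it finds; a sketch (the `allowed` helper and the example URL are illustrative):

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_lines, url, agent="*"):
    """Parse robots.txt rules (given as a list of lines) and test whether
    the given user agent may crawl the given URL."""
    rp = RobotFileParser()
    rp.parse(robots_lines)
    return rp.can_fetch(agent, url)

# An explicit "Allow: /" (or no rules at all) means everything is crawlable;
# "Disallow: /" blocks the whole site.
```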
Find pages containing broken links.
Hi everyone, for each internal broken link I need to find all the pages that contain it. In the SEOmoz report there is only one referrer link for each broken link, but Google Webmaster Tools indicates that the dead link is present on many pages of the site. Is there a way to get this data, in a CSV report, with SEOmoz or other software? Thanks.
Moz Pro | wwmind
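For the broken-links question above: in the absence of a tool export, building the "which pages contain this dead link" map yourself takes only a few lines with the stdlib HTML parser. A sketch (a real crawl would fetch each page's HTML first; here `pages` is assumed to be a dict of already-downloaded HTML):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def pages_linking_to(pages, broken_url):
    """pages: {page_url: html}; return the pages whose HTML links to broken_url."""
    hits = []
    for page_url, html in pages.items():
        parser = LinkCollector()
        parser.feed(html)
        if broken_url in parser.links:
            hits.append(page_url)
    return hits
```

Writing the resulting list out with the `csv` module would give exactly the CSV report the question asks for.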
One page per campaign?
Not quite sure if I read correctly, but is it correct that one campaign tracks only one page of my site? So if I wanted to track something like a services page, would that require a second campaign?
Moz Pro | GroundFloorSEO
Redirecting duplicate .asp pages??
Hi all, I have a bit of a problem with duplicate content on our website. The CMS has been creating identical duplicate pages depending on which menu route a user takes to get to a product (i.e. via the side menu button or the top menu bar). Anyway, the web design company we use is sorting it out going forward and creating 301 redirects on the duplicate pages. My question is, some of the duplicates take different forms. E.g. for the home page:

www.<my domain>.co.uk
www.<my domain>.co.uk/index.html
www.<my domain>.co.uk/index.asp

Now I understand the 'index.html' page should be redirected, but does the 'index.asp' need to be redirected also? What makes this more confusing is that when I run the SEOMoz diagnostics report (which brought my attention to the duplicate content issue in the first place, thanks SEOMoz), not all the .asp pages are identified as duplicates. For example, the above 'index.asp' page is identified as a duplicate, but 'contact-us.asp' is not highlighted as a duplicate of 'contact-us.html'. I'm a bit new to all this (I'm not an IT specialist), so any clarification anyone can give would be appreciated. Thanks, Gareth
Moz Pro | gdavies09031977
Status 404-pages
Hi all, one of my websites was crawled by SEOmoz this week. The crawl showed me 3 errors: 1 missing title and 2 client errors (4XX). One of these client errors is the 404 page itself! What's your suggestion about this error? Should a 404 page return the 404 HTTP status? I'd like to hear your opinion on this one! Thanks all!
Moz Pro | Partouter
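On the status question above: yes, a custom "page not found" page should still send the 404 status code; a friendly error page that returns 200 is what crawlers call a "soft 404" and can end up indexed. A minimal WSGI sketch of the right behavior (the `PAGES` dict and body text are hypothetical):

```python
# Minimal WSGI app: a friendly error page still sends status 404,
# so crawlers don't treat it as a live page (a "soft 404").
PAGES = {"/": "<h1>Home</h1>"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in PAGES:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [PAGES[path].encode()]
    # Unknown path: human-friendly body, machine-correct status.
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [b"<h1>Sorry, that page does not exist.</h1>"]
```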
On-Page Summary (Report Cards) automation?
Hi everyone, under the "On-Page" tab which shows your report cards, is there a way of getting it to grade your entire site? One of my sites is only ~20 pages, so it's no big deal to manually enter each URL and set each one to update weekly. But what if I have a site that has ~1,000 pages and I want to optimise each and every page for my main keyword using the report cards feature? Thanks in advance! 🙂 Ash
Moz Pro | AshSEO2011