www vs non-www - Crawl Error 902
-
I have just taken over admin of my company website and have been confronted with crawl error 902 on the existing Moz campaign, which has been running for years. It seems to be an intermittent problem. I have searched and tried many of the suggested solutions, and none of them seem to help.
The campaign is currently set up with the URL http://companywebsite.co.uk. When I tried to do a manual Moz crawl using this URL I got an error message. I changed the crawl URL to http://www.companywebsite.co.uk and the crawl went off without a hitch; I'm currently waiting on the results. From testing, I now know that if I go to the non-www version of my company's website, nothing happens: it never loads. But if I go to the www version, it loads right away.
I know that for SEO you only want one of these URLs so you don't have duplicate content, but I thought the non-www version should redirect to the www version, not just be completely missing.
I tried to set up a new campaign with the default URL being the www version, but Moz automatically changed it to the non-www version. It seems I cannot set up a new campaign that automatically crawls the www version.
Does it sound like I'm on the right path to finding the cause? Or can somebody else offer up a solution?
Many thanks,
Ben.
-
Glad to hear that's all fixed! Though I will say that's a very slow response time for any development / hosting company; typically I would expect a maximum response time of 8 hours. We try to keep ours under 2, heh.
But yes, glad it's all working for you now.
-
The problem was as you anticipated. After spending a few days chasing the party who actually looks after it, I'm pleased to say I checked this morning and all is working as expected.
Thanks very much for your help, Toby!
-
Thanks Toby, I've emailed them. I expect a reply to be a couple of days away (that's what it normally takes them). Thanks for the help thus far, and I'll message back when they reply!
-
In that case, that would probably be the best place to start. If you want any evidence of the missing A record, here's a DNS checking tool (it currently throws an error because it can't find an A record).
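If you'd rather check from your own machine, here is a minimal sketch (Python, standard library only) that does the same kind of lookup for the two hostnames discussed in this thread:

import socket

# Compare DNS resolution for the non-www and www hostnames.
# If the non-www lookup raises socket.gaierror, nothing is answering for it
# at the DNS level (i.e. no A record), which matches what the checker reports.
for host in ("atp-instrumentation.co.uk", "www.atp-instrumentation.co.uk"):
    try:
        print(f"{host} -> {socket.gethostbyname(host)}")
    except socket.gaierror as err:
        print(f"{host} -> lookup failed: {err}")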
Let me know what they say.
-
I don't have access to the DNS; my access is limited to the Magento CMS and FTP access to the root folder.
I guess it's a case of popping an email off to the guys who built the site and control the web space, and getting them to address the issue?
-
OK, so the file is working; that's a good start!
Looks like we need to go back a step in the request process, then. Do you have access to the DNS settings for the http://atp-instrumentation.co.uk domain? If so, could you please check whether there is an A record set for it? (It looks to be registered through Civica UK Ltd - Whois Report.)
What I suspect might be the case is that you're missing an A record for atp-instrumentation.co.uk, but that there is one set for www.atp-instrumentation.co.uk.
I've run a couple of tests against the domain's DNS and I get nothing back for the non-www address, which suggests that requests aren't even making it as far as your servers.
To set the A record, you'll be looking for something in your control panel called 'DNS settings' or maybe 'Host Records'. In there you should see either an option for A record settings, or perhaps a dropdown with record types like A, AAAA, CNAME, etc. You need to:
- Select 'A'
- In the domain box, type: atp-instrumentation.co.uk
- In the IP box, type: 82.118.110.42
Hopefully that makes sense. If you're at all unsure, let me know and I'll do what I can to help more specifically. Domain control panels are so different for each provider that it's difficult to give direct instructions without knowing what your panel looks like.
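Once the record has been added, you can keep an eye on it with a rough check like the sketch below (Python, standard library). The IP is just the one suggested above, so confirm it with whoever manages the hosting, and bear in mind DNS changes can take a while to propagate.

import socket

EXPECTED_IP = "82.118.110.42"  # the address suggested above - confirm with your host

try:
    resolved = socket.gethostbyname("atp-instrumentation.co.uk")
    if resolved == EXPECTED_IP:
        print(f"A record is live and pointing at {resolved}")
    else:
        print(f"Resolves, but to an unexpected address: {resolved}")
except socket.gaierror:
    print("Still no A record - the change may not have propagated yet")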
-
I added the line and it did indeed break the website; I got an internal server error, etc.
So it seems the file is working.
-
Just a note here: the redirects do seem to be working, so it looks like the .htaccess file IS being loaded.
Please check anyway; it's possible that the redirects are coming from another location if they have been set elsewhere as well. From a (very quick) look, the file seems to be formatted correctly, so there's no obvious reason for the www redirect not to be working...
-
Hmm, OK.
Before we do anything else, we need to make sure that the .htaccess file is actually being loaded. To do that, we need to break it for a second.
At the top of the .htaccess file, put something like this (the specifics don't matter here):
THISisInValid666
What we're trying to do is put some invalid text into the .htaccess file so that the site breaks when it loads; the idea is that this confirms the .htaccess file is actually being used. So if you put that in and the site throws a 500 error (when navigating to it with or without www), we know that any changes we make to the file should take effect.
If the site continues to load without issue, then we know the .htaccess file isn't in use and we need to look at the server configuration directly (specifically the AllowOverride settings).
Once you have confirmed whether or not it breaks, remove the line again.
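If it's easier than checking in a browser, here is a rough sketch (Python, standard library) that requests both versions of the site and prints the status codes. With the invalid line in place, you'd expect a 500 from any hostname that is actually reaching the server:

import urllib.request
import urllib.error

for url in ("http://www.atp-instrumentation.co.uk/", "http://atp-instrumentation.co.uk/"):
    try:
        status = urllib.request.urlopen(url, timeout=10).getcode()
    except urllib.error.HTTPError as err:
        status = err.code  # e.g. 500 while the invalid line is in place
    except urllib.error.URLError as err:
        status = f"no response ({err.reason})"  # e.g. the non-www host not resolving
    print(url, "->", status)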
-
Thanks Toby, here is the entire .htaccess file with your fix implemented. It doesn't seem to have worked; if I go to http://atp-instrumentation.co.uk with no www, it still fails to load.
Edited out to shorten convo
-
You're correct. You can make it a little more generic, though. Without seeing all of your .htaccess file, try this:
Replace:

RewriteCond %{HTTP_HOST} ^companyname.co.uk [NC]
RewriteRule ^(.*)$ http://www.companyname.co.uk/$1 [L,R=301]

With:

RewriteCond %{HTTP_HOST} !^www. [NC]
RewriteRule ^ http%{ENV:protossl}://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

This is what could be called a wildcard redirect, in that a direct copy-paste should work for you with no need to edit (you don't have to manually add in the correct domain name).
What it does:
- First, the condition checks to see if the requested URL has a www in it.
- If it does not, it then runs the rule; otherwise it ignores it.
- The rule first checks for http or https, then adds the www. followed by the domain and TLD, and finally appends the URI (/somepage/page, for example).
- The L in square brackets means do not process any further rules in the .htaccess file.
- The R=301 means that it will be a 301 (permanent) redirect.
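Once the rule is in place (and assuming the non-www hostname actually resolves), a quick way to sanity-check it without a browser is to make a single request and look at the status code and Location header. A rough sketch in Python:

import http.client

# Make one request to the non-www host without following redirects,
# then inspect the response. A working rule should return a 301 with a
# Location header pointing at the www version of the same path.
conn = http.client.HTTPConnection("atp-instrumentation.co.uk", timeout=10)
conn.request("GET", "/somepage/page")
response = conn.getresponse()
print("Status:  ", response.status)                 # expect 301
print("Location:", response.getheader("Location"))  # expect the www.atp-instrumentation.co.uk URL
conn.close()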
If that still doesn't work for you, paste up your full .htaccess file, or send it to me directly if you'd rather, and I'll take another look.
-
Thanks Highland,
How do I go about changing this? I believe it's to do with the .htaccess file.
The website was developed in Magento by an external company who monitor it. Looking in the root folder I can see the .htaccess file, but it contains a lot of lines of code and rewrites that I don't fully understand.
These are the lines I think could be relevant so far:
############################################
## enable rewrites
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^companyname.co.uk [NC]
RewriteRule ^(.*)$ http://www.companyname.co.uk/$1 [L,R=301]

Then there are lots of category pages, etc., that are 301 redirected, followed by:

RewriteRule ^home http://www.companyname.co.uk/ [R=301,L]

Then some more redirects for pages.
I know this is specific, but is this editable a different way in Magento? Thanks for any help offered; I know this is getting more technical.
-
You're 100% right. One version should 301 redirect to the other. While there are some SEO reasons for this (mainly around duplicate content), the best reason is that it's just less confusing for end users to only have one URL to use. If your non-www has trouble loading, I would say you need a 301 to the www version.
Moz restricts campaigns to the root domain (i.e. domain.com) and crawls accordingly. I have some set up with www.domain.com and some with just domain.com. The 301 to the www forces the issue, but Moz is smart enough to crawl the proper pages.