www vs non-www - Crawl Error 902
-
I have just taken over admin of my company website and I have been confronted with crawl error 902 on the existing campaign that has been running for years in Moz. This seems like an intermittent problem. I have searched and tried many of the other solutions, and none of them seem to help.
The campaign is currently set up with the URL http://companywebsite.co.uk. When I tried to do a Moz manual crawl using this URL I got an error message. I changed the link to crawl to http://www.companywebsite.co.uk and the crawl went off without a hitch; I'm currently waiting on the results. From testing I now know that if I go to the non-www version of my company's website then nothing happens; it never loads. But if I go to the www version then it loads right away.
I know for SEO you only want one of these URLs so you don't have duplicate content. But I thought the non-www should redirect to the www version, not just be completely missing.
I tried to set up a new campaign with the default URL being the www version, but Moz automatically changed it to the non-www version. It seems I cannot set up a new campaign with it automatically crawling the www version.
Does it sound like I'm on the right path to finding the cause? Or can somebody else offer up a solution?
Many thanks,
Ben.
-
Glad to hear that's all fixed! Though I will say that's a very slow response time for any development / hosting company; typically I would expect a maximum response time of 8 hours. We try to keep it under 2, heh.
But yes, glad that's working for you now.
-
The problem was as you anticipated. After spending a few days chasing the party who actually looks after it, I'm pleased to say I checked this morning and all is working as expected.
Thanks very much for your help, Toby!
-
Thanks Toby, I've emailed them. I expect a reply to be a couple of days away (that's what it normally takes them). Thanks for the help thus far and I'll message back when they do!
-
In that case, that would probably be the best place to start. If you want any evidence of the missing A record, here's a DNS checking tool (it currently throws an error because it can't find an A record).
Let me know what they say.
-
I don't have access to the DNS; my access is limited to the Magento CMS and FTP access to the root folder.
I guess it's a case of popping an email off to the guys who built the site and control the web space, and getting them to address the issue?
-
Ok, so the file is working; that's a good start!
Looks like we need to go back a step in the request process, then. Do you have access to the DNS settings for the http://atp-instrumentation.co.uk domain? If so, could you please check whether there is an A record set for it? (It looks to be registered through Civica UK Ltd; see the Whois report.)
What I suspect might be the case is that you're missing an A record for atp-instrumentation.co.uk, but that there is one set for www.atp-instrumentation.co.uk.
I've run a couple of tests against the domain's DNS and I get nothing back for the non-www address, which is what suggests that we're not even making it as far as your servers.
To set the A record, you'll be looking for something in your control panel for 'DNS settings' or maybe 'Host Records'. You should see in there either an option to select A record settings, or perhaps a dropdown with things like A, AAAA, CNAME etc. You need to:
- select 'A'
- In the domain box type: atp-instrumentation.co.uk
- In the IP box, type: 82.118.110.42
Hopefully that makes sense. If you're at all unsure, let me know and I'll do what I can to help more specifically. Domain control panels are so different for each provider that it's difficult to provide direct instructions without knowing what your panel looks like.
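If you'd like a quick way to double-check from your own machine, here's a small sketch using only Python's standard library. It compares how the www and non-www hostnames resolve (the helper function name is mine; this just performs an ordinary IPv4 lookup, which is what an A record serves):

```python
import socket

def resolve_a_record(hostname):
    """Return a sorted list of IPv4 addresses for hostname, or [] if nothing resolves."""
    try:
        infos = socket.getaddrinfo(
            hostname, 80, family=socket.AF_INET, type=socket.SOCK_STREAM
        )
        # getaddrinfo returns (family, type, proto, canonname, sockaddr) tuples;
        # sockaddr[0] is the IP address string.
        return sorted({info[4][0] for info in infos})
    except socket.gaierror:
        # Name did not resolve, which is what a missing A record looks like.
        return []

if __name__ == "__main__":
    for host in ("atp-instrumentation.co.uk", "www.atp-instrumentation.co.uk"):
        ips = resolve_a_record(host)
        print(host, "->", ips or "no A record found")
```

If the non-www name prints "no A record found" while the www name returns an IP, that matches the missing-A-record theory above.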
-
I added the line and it did indeed break the website; I got an internal server error etc.
So it seems the file is working.
-
Just a note here: the redirects do seem to be working, so it looks like the .htaccess file IS being loaded.
Please check anyway; it's possible that the redirects are coming from another location if they have been set elsewhere as well. From a (very quick) look, the file seems to be formatted correctly, so there's no obvious reason for the www redirects not to be working...
-
Hmm, ok.
Before I do anything else, we need to make sure that the .htaccess file is actually being loaded. To do that, we need to break it for a second.
At the top of the .htaccess file, put something like (the specifics don't matter here):
THISisInValid666
What we're trying to do is put some invalid text into the .htaccess file so that the site breaks when it loads, which confirms that the file is actually being used. So if you put that in and the site throws a 500 error (when navigating to it with or without www), we can be confident that changes we make to the file will take effect.
If the site continues to load without issue, then we know the .htaccess file isn't in use and we need to look at the server configuration directly (specifically the AllowOverride setting).
Once you have confirmed whether or not it breaks, remove the line again.
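If you want to check both hostnames in one go rather than clicking around a browser, here's a hedged sketch using only Python's standard library (the helper names are mine, not part of any tool mentioned here). A 5xx status on both the www and non-www URLs confirms the invalid line was parsed:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def fetch_status(url, timeout=10):
    """Return the HTTP status code for url, or None if the host is unreachable."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code   # urlopen raises on 4xx/5xx, but the code is still available
    except URLError:
        return None       # DNS failure or connection refused

def htaccess_breaks_site(status):
    """True if the status indicates the invalid .htaccess line was parsed (server error)."""
    return status is not None and 500 <= status < 600
```

Usage would look something like `htaccess_breaks_site(fetch_status("http://www.atp-instrumentation.co.uk/"))`; note that for the non-www URL you'd expect `fetch_status` to return None here, since it isn't resolving at all.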
-
Thanks Toby, here is the entire .htaccess file with your fix implemented. It doesn't seem to have worked; if I go to
http://atp-instrumentation.co.uk with no www, it still fails to load.
Edited out to shorten convo
-
You're correct, though you can make it a little more generic. Without seeing all of your .htaccess file, try this.
Replace:
RewriteCond %{HTTP_HOST} ^companyname.co.uk [NC]
RewriteRule ^(.*)$ http://www.companyname.co.uk/$1 [L,R=301]
With:
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ http%{ENV:protossl}://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
This is what could be called a wildcard redirect, in that a direct copy-paste should work for you with no need to edit (you don't have to manually add in the correct domain name).
What it does:
- First it checks whether the requested host has a www in it.
- If it does -not-, it runs the rule; otherwise it ignores it.
- The rule first picks http or https, then adds in the www. followed by the domain and TLD, finally adding the URI (/somepage/page, for example).
- The L in square brackets means "do not process anything else in the .htaccess file".
- The R=301 means that it will be a 301 (permanent) redirect.
If that still doesn't work for you, paste up your full .htaccess file, or you can send it to me directly if you'd rather, and I'll take another look.
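Purely as a sanity check of the logic described above, here's a hedged Python sketch that mimics what the wildcard rule computes (the function name is mine; Apache itself does this inside mod_rewrite, not in Python):

```python
def wildcard_redirect(host, uri, https=False):
    """Mimic the wildcard rule: if the Host header lacks a leading 'www.',
    return the 301 target URL; otherwise return None (no redirect issued)."""
    if host.startswith("www."):
        return None  # the RewriteCond !^www\. fails, so the rule is skipped
    scheme = "https" if https else "http"  # stands in for http%{ENV:protossl}
    return f"{scheme}://www.{host}{uri}"
```

For example, `wildcard_redirect("companyname.co.uk", "/somepage/page")` yields `http://www.companyname.co.uk/somepage/page`, while any already-www host passes through untouched, which is exactly the behaviour you want from the .htaccess rule.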
-
Thanks Highland,
How do I go about changing this? I believe it's to do with the .htaccess file.
The website was developed in Magento by an external company who monitor it. Looking in the root folder I can see the .htaccess file, but it contains a lot of lines of code and rewrites that I don't fully understand.
These are the lines I think could be relevant so far.
############################################
## enable rewrites
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^companyname.co.uk [NC]
RewriteRule ^(.*)$ http://www.companyname.co.uk/$1 [L,R=301]
Then there are lots of category pages etc. that are 301 redirected, followed by:
RewriteRule ^home http://www.companyname.co.uk/ [R=301,L]
Then some more redirects for pages.
I know this is specific, but is this editable a different way in Magento? Thanks for any help offered; I know this is getting more technical.
-
You're 100% right. You should have one 301 redirect to the other. While there are some SEO reasons for this (mainly with duplicate content), the best reason is that it's just less confusing to end users to only have one URL to use. If your non-www has trouble loading I would say you need a 301 to the www version.
Moz tracks the top-level domain (i.e. domain.com) and crawls accordingly. I have some campaigns set up with www.domain.com and some with just domain.com. The 301 to the www version forces the issue, but Moz is smart enough to crawl the proper pages.