Why can no tool crawl this site?
-
I am trying to perform a crawl analysis on a client's website at https://www.bravosolution.com
I have tried to crawl it with IIS for SEO, Screaming Frog and Xenu, and not one of them makes it further than the home page of the site. There is nothing I can see in the robots.txt that is blocking these agents.
As far as I can see, Google is able to crawl the site, although the client has noticed a significant drop in organic traffic.
Any advice would be very welcome.
Regards
Danny
-
I would look into finding a method to redirect via your server rather than with JavaScript. This will ensure that bots can properly crawl your site.
I would also add hreflang tags which should help Google with the multiple language versions of the site.
Also, in the short term, you may want to add a link or a delayed meta refresh in case someone has JavaScript disabled or is using a script-blocking extension. That way they at least see something instead of a blank page.
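To make the server-side suggestion above concrete, here is a minimal sketch of the idea: decide the destination on the server from the visitor's country and answer with an HTTP 302, so crawlers that do not execute JavaScript still get a followable Location header. Only /cms/us appears in this thread; the other country paths and the function name are hypothetical illustrations.

```python
# Sketch of a server-side geo redirect (in place of a JavaScript one).
# Only "/cms/us" is taken from the thread; the other mappings are made up.
COUNTRY_PATHS = {
    "US": "/cms/us",   # seen in the thread
    "GB": "/cms/uk",   # hypothetical example
    "IT": "/cms/it",   # hypothetical example
}
DEFAULT_PATH = "/cms/us"

def redirect_for(country_code):
    """Return an HTTP status and headers for a server-side geo redirect.

    A 302 with a Location header is followed by virtually every crawler,
    unlike a window.location assignment buried in JavaScript.
    """
    path = COUNTRY_PATHS.get(country_code.upper(), DEFAULT_PATH)
    return 302, {"Location": path}
```

However the mapping is implemented, the key point is that the redirect lives in the HTTP response, not in script the client must execute.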
-
Really helpful and much appreciated - many thanks!
Danny
-
Yes, that's what I said, CleverPhD. I just couldn't type that fast today.
Only joking! Thanks for expanding on the subject.
-
To expand on Dean's point.
If you look at the source code on https://www.bravosolution.com/ you get a bunch of JavaScript. It is basically looking at the user's location and then sending them to the appropriate version of your website based on country. This is why here in the US we are sent to https://www.bravosolution.com/cms/us
Many spiders/tools are poor at (or do not attempt) executing JavaScript while crawling (Googlebot itself was not very good at this until recently), so they get stuck when they hit your home page.
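The "stuck at the home page" symptom can be spotted with a rough heuristic: the server answers 200 with no Location header, and the only navigation on the page is a JavaScript location assignment. The function name and marker strings below are illustrative, not from any particular crawler.

```python
# Heuristic check for a JavaScript-only redirect page.
# Marker strings are common JS navigation idioms, not an exhaustive list.
JS_REDIRECT_MARKERS = ("window.location", "document.location", "location.href")

def looks_like_js_redirect(status, location_header, body):
    """True if the page probably relies on JS (not HTTP) to redirect."""
    if status in (301, 302, 303, 307, 308) and location_header:
        return False  # a real HTTP redirect; any crawler can follow this
    return status == 200 and any(m in body for m in JS_REDIRECT_MARKERS)
```

A page that trips this check is effectively a dead end for any tool that fetches HTML without running scripts.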
If you want to evaluate any of your localized sites, just run those URLs through the various tools (Screaming Frog etc.). You would then ask, "Well, how do I know that my main https://www.bravosolution.com is working properly for SEO?" I don't have as much background in how to optimize for international SEO, but you can do several things to start with.
-
Google anything having to do with Aleyda Solis and international SEO. She posts a lot of stuff here at Moz and is pretty sharp on this stuff. There may be a more appropriate way to redirect international visitors from your main page than how you are currently doing it.
-
Run your home page through Google Webmaster Tools under Crawl > Fetch as Google and see what the page looks like to Googlebot.
-
Double-check your robots.txt to make sure you are not blocking any folders that contain a JavaScript library. From the code I saw, you do not appear to reference any external libraries, but if you are dependent on JS to redirect visitors, it would be worth having your developer check things.
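That robots.txt check can be done directly with Python's standard-library parser: feed it the rules and ask whether a given script path may be fetched. The disallowed folder and URLs below are made-up examples, not bravosolution.com's actual file.

```python
# Check whether robots.txt rules block a JavaScript asset.
# The rules and URLs here are hypothetical examples.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private-js/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A JS library under a disallowed folder would be invisible to Googlebot:
print(rp.can_fetch("Googlebot", "https://www.example.com/private-js/app.js"))
# A normal localized page is fine:
print(rp.can_fetch("Googlebot", "https://www.example.com/cms/us"))
```

If a script that performs the redirect lives in a disallowed folder, Googlebot cannot execute it even when it is otherwise capable of rendering the page.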
-
As with everything on what to do, it all depends. If all of your local country sites are independently ranked and successful, this main website may or may not be doing you any favors currently if it is just a pass-through with no domain authority to start with. Spend time on step #1 to see if there is anything else worth doing.
Cheers!
-
-
Yes, it should redirect you to the correct country version based on your IP, but I still can't crawl the site from the home page.
-
Cheers Bryan - much appreciated. It's driving me crazy!
-
Hi Danny,
Have you looked at the site via http://web-sniffer.net/
It would appear that the home page is just a JavaScript redirect.
I was redirected to https://www.bravosolution.com/cms/us, which could then be seen via Screaming Frog.
The reason for my (default) redirect is given by web-sniffer as:
DEFAULT CORPORATE if ( path == '' ) { path = '/cms/us
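The fragment above is truncated JavaScript captured by web-sniffer. As a rough paraphrase of what it appears to do (the logic is guessed from the fragment, not confirmed against the live site): with no explicit path, fall back to the default corporate destination.

```python
# Guessed paraphrase of the truncated snippet's apparent logic:
# an empty path falls through to the default "/cms/us" destination.
def resolve_default(path):
    if path == "":
        path = "/cms/us"
    return path
```

If that is what the script does, every visitor (and every non-JS tool) landing on the bare domain is wholly dependent on this client-side logic to reach any content.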
-
Interesting. I verified the robots file and tried running through Screaming Frog... nothing. I will dig into this with my dev team to try and get you an answer ASAP.