Direct traffic is up 2100% (due to a bot/crawler I believe)
-
Hi,
The direct traffic to my website www.webgain.dk has increased by over 2100% recently. I can see that most of it is from the US (my target audience is in Denmark and the website is in Danish).
What can I do about this? All this traffic gives my website a bounce rate of 99.91% for direct traffic. I believe it is some sort of bot/crawler.
-
Already done. They also included the tip in their newsletter for beta-testers.
-
You might want to let them know about this, so they can add a note to their documentation and future users know what is going on before panicking.
-
Follow-up: I have fixed this now. It was a monitoring tool by Digicure, where I have signed up to be a beta tester. Their platform checks the website like a normal visitor from servers around the world (in my test case, Denmark and California), so it looked like normal direct traffic in my data. I excluded their stated server IP addresses in my Google Analytics filters and that helped. Thanks again, guys, for the help.
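For anyone setting up the same kind of filter: a Google Analytics IP-exclusion filter takes a single regular expression, so a couple of monitoring-server addresses can be covered in one pattern. A minimal sketch in Python of what such a pattern matches (the IPs are placeholders, not Digicure's real addresses):

```python
import re

# Hypothetical monitoring-server IPs (placeholders only).
# Anchor the pattern with ^ and $ so 198.51.100.14 does not
# also match a real visitor at 198.51.100.140.
MONITOR_IP_PATTERN = re.compile(r"^(198\.51\.100\.14|203\.0\.113\.7)$")

def is_monitoring_ip(ip: str) -> bool:
    """Return True if the visitor IP belongs to the monitoring service."""
    return MONITOR_IP_PATTERN.match(ip) is not None

print(is_monitoring_ip("198.51.100.14"))   # the monitoring server -> True
print(is_monitoring_ip("198.51.100.140"))  # a different visitor -> False
```

The same anchored regex can be pasted straight into a GA "Exclude traffic from the IP addresses" custom filter.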
-
Thank you for all the great advice. I will follow it and see how it works.
-
If you are running WordPress, also check which page or pages are being accessed. I have had bots nail my wp-login like that before. If that is the case, harden your installation; one thing I have found that stopped it was setting a deny rule in the .htaccess for wp-login / wp-admin.
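A minimal sketch of that kind of deny rule, assuming Apache 2.4 and a default WordPress install (the allowlisted IP is a placeholder for your own address):

```apache
# In the site root .htaccess: block wp-login.php for everyone
# except your own IP (replace 203.0.113.50 with your address).
<Files "wp-login.php">
    Require ip 203.0.113.50
</Files>
```

For /wp-admin/ the same `Require ip` line can go in a separate .htaccess inside that directory; note that admin-ajax.php lives under /wp-admin/ and some plugins call it from the front end, so test before locking it down.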
-
I was having the same problem (for me it seemed to be Bing's Ads bot). I used the guide below and it seems to filter out most of the bot visits.
-
I would check the service providers first, just to know for sure whether the visits are all coming from the same provider. You can check this in the Audience > Technology > Network report on the left side of Google Analytics. If you see the same network and browsers being used, I would use a filter (only if you're 100% sure that it's bot traffic) to get them completely out of your Google Analytics view.
-
It's weird that the bot is accepting cookies, but with a bounce rate that high, I agree it's probably something automated (though it could be people who were looking for something else or were directed there accidentally by an email or an app). You can look through your logs to see IP addresses and then do as Atakala says and block the traffic if you're worried about bandwidth. You can also just filter it out in GA by excluding US traffic (if you're worried about analytics being messed up).
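For example, a quick pass over an access log will show whether a handful of IPs account for most of the hits. A rough sketch, assuming the common Apache/Nginx combined log format where the client IP is the first field:

```python
from collections import Counter

def top_ips(log_lines, n=5):
    """Count hits per client IP; in combined log format the IP is the first field."""
    hits = Counter(line.split(" ", 1)[0] for line in log_lines if line.strip())
    return hits.most_common(n)

# Illustrative log lines, not real traffic.
sample = [
    '203.0.113.7 - - [10/May/2015:13:55:36 +0000] "GET / HTTP/1.1" 200 512',
    '203.0.113.7 - - [10/May/2015:13:56:01 +0000] "GET / HTTP/1.1" 200 512',
    '198.51.100.9 - - [10/May/2015:14:02:17 +0000] "GET /about HTTP/1.1" 200 1024',
]
print(top_ips(sample))  # [('203.0.113.7', 2), ('198.51.100.9', 1)]
```

If one or two IPs (or one narrow range) dominate, that's a strong sign it's a bot rather than scattered human visitors, and those are the addresses to block or filter.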
-
It's probably AWS, Amazon's rentable cloud service, which people often use to run crawlers.
If it costs you too much, then try to ban it.
Related Questions
-
Traffic drop after hreflang tags added
We operate one company with two websites, each serving a different location: one targeting EU customers and the other targeting US customers.

thespacecollective.com (EU customers)
thespacecollective.com/us/ (US customers)

We have always had canonical tags in place, but we added the following hreflang tags two weeks ago (apparently this is best practice): EU site (thespacecollective.com), US site (thespacecollective.com/us/). Literally the same day we added the above hreflang tags, our traffic dropped off a cliff (we have lost around 70-80% on the EU site and, after a minor recovery, 50% on the US site). Now, my first instinct is to remove the tags entirely and go back to just using canonicals, but if this is truly best practice, that could do more damage than good. This is the only change that has been made in recent weeks regarding SEO. Is there something obvious that I am missing? Because it looks correct to me.
International SEO | moon-boots
-
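For reference on the question above: the poster's actual tags aren't shown, but a reciprocal hreflang pair for a setup like this would typically look something like the following (illustrative only; hreflang has no "EU" region code, so the EU page is usually tagged with plain `en` and often doubles as the `x-default`). Both pages must carry the full set so they point at each other:

```html
<!-- Identical block on thespacecollective.com and thespacecollective.com/us/ -->
<link rel="alternate" hreflang="en" href="https://thespacecollective.com/" />
<link rel="alternate" hreflang="en-US" href="https://thespacecollective.com/us/" />
<link rel="alternate" hreflang="x-default" href="https://thespacecollective.com/" />
```

Common mistakes that cause exactly this kind of trouble are non-reciprocal tags (one page missing the return link) or hreflang URLs that disagree with the canonical tags, so those are the first things to audit before removing the tags.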
Getting accurate Geo Location traffic stats in Google Analytics - HELP
One of our clients services the US and the UK, but having looked at the reports over an extended period of time, we can see that the vast majority of traffic is coming from the US. For example, our last report for March indicated that there were over 3,000 users in the US but only 6 in the UK. We know that Google Analytics works out a user's location based on where their IP is located rather than their physical location, which means the data needs to be taken with a pinch of salt, as it won't always represent what you expect. That being said, we know that the traffic figures for Europe are largely inaccurate and would like to get some more accurate stats to report on. Is there a way to do so within Google Analytics?
International SEO | Wagada
-
Why is Google not indexing each country/language subfolder?
Hi folks, We use Magento 2 for multi-country shops (it's a multistore). The URL: www.avarcas.com. In the first days, Google indexed the proper URL for each country: avarcas.com/uk, avarcas.com/de ... Some days later, all the countries are just indexing / (the root). I correctly set the subfolders in Webmaster Tools. What's happening? Thanks
International SEO | administratorwibee
-
/en-us/ Outranking Root Domain and other hreflang errors
I'm working with a new site that has a few regional sites in subdirectories (/en-us/, /en-au/, etc.) and just noticed that some of our interior pages (ourdomain.com/en-us/interior-page1/) are outranking the equivalent ourdomain.com/interior-page1. This only occurs in some SERPs, while others correctly display the non-regional result. I was told we have hreflang tags implemented correctly in the meta information of each of our pages, but I have yet to research deeply. Should we even have an /en-us/ version when our root domain is the default version, in English, and targeted primarily at the US? Any help would be appreciated, as I am a little lost. Cheers, Andrew
International SEO | AndyMitty
-
Multi country targeting for listing site, ccTLD, sub domain or .com/folder?
Hi, I know this has been covered in a few questions, but I've seen nothing recent that takes into account changes Google may have applied. We would like to target multiple English-speaking countries with a new project, and I'm a little unsure whether ccTLDs, subdomains, or subfolders are the best way to publish country-specific information. Can anyone shed some light on this?
International SEO | Mulith
-
Http://us.burberry.com/: Big traffic change for top URL (error 593f1ceb2d67)
Please forgive duplicating this question on the SEOmoz & Webmaster Tools forums, but I'm hoping to hit both audiences with this question...

A few days ago I noticed that our US homepage (us.burberry.com) had dropped from PR5 to PR0, and the page has been deindexed by Google. After checking Webmaster Tools, I also received the following message:

"http://us.burberry.com/: Big traffic change for top URL. April 2, 2012. Search results clicks for http://us.burberry.com/ have decreased significantly. Message ID: 593f1ceb2d67."

We're not doing any link building at all (we've enough on-site issues to deal with). The only changes I have made are adding Google Analytics to the website, uploading sitemaps via Webmaster Tools (it's not linked to from robots.txt yet), and setting the burberry.com and www.burberry.com geo-location settings to 'unlisted' (we want uk.burberry.com appearing in the UK results, us.burberry.com appearing in the US results, etc., rather than www.burberry.com). I've reversed the geo-location settings, but I doubt this would have caused this.

We have duplicate copies of our homepage (such as us.burberry.com/store//) from typos in inbound links (and bad programming that allows them to work rather than 404'ing), but I don't think any of this is new. What I don't understand is (a) why this is happening now and (b) why this is just affecting our US homepage. We have ~40 different duplicates of the homepage (us, uk, ca, pt, ro, sk, etc.), so why is the US site being affected and not the others? Does anyone know if this is due to an algorithm change by Google or something else altogether?

Background: Our website www.burberry.com has 46 subdomains, such as uk.burberry.com, ca.burberry.com, and us.burberry.com. There is a lot of duplicate content on each subdomain (including basic things like tracking parameters in URLs) and across subdomains (uk.burberry.com/store & us.burberry.com/store are exactly the same). There's very little text on the site (it's nearly all images), as well as poor redirects, inaccessible content (AJAX/Flash), and a whole host of basic SEO things that aren't being done correctly. I've joined the company in the last few months and have started addressing these issues, but I've got a LOT of work to do yet.

One thing we have in our favour is a link profile that is as clean and natural as they come - there was only ever one link-building campaign performed (which was before my time), and I had all of those links removed as soon as I joined the company.

Any help would be greatly appreciated! Thanks for your time. Dean Rowe

Edit: us.burberry.com 301 redirects to us.burberry.com/store/, as explained on the Webmaster Tools forum, but I don't believe this is the cause, as it's the same across all subdomains.
International SEO | FashionLux
-
Optimizing terms with accents/tildes in Spanish
Hello all, quick question. We are optimizing for a keyword that includes an accent in Spanish. Is it better to use the accented or unaccented form (i.e. inglés vs. ingles)? Also, is there any distinction between accents (á, é, í...) and the eñe (ñ) in terms of strategy/best practices? Does this accent issue have a big impact on ranking?
International SEO | CuriosityMedia
-
IP Redirection vs. cloaking: no clear directives from Google
Hi there, Here is our situation: we need to force an IP redirection to www.domain.com for our US users, and at the same time we have different country-specific subfolders, each with its own language, such as www.domain.com/fr. Our fear is that by forcing a redirection for US IPs, we will prevent Googlebot (which crawls from a US IP) from crawling our country-specific subfolders. I didn't find any clear directives from Google representatives on the matter. In this video, Matt Cutts says it's always better to show Googlebot the same content as your users: http://www.youtube.com/watch?v=GFf1gwr6HJw&noredirect=1, but on the other hand, in another video he says "Google basically crawls from one IP address range worldwide because (they) have one index worldwide. (They) don't build different indices, one for each country". This seems like a contradiction to me... Thank you for your help!! Matteo
International SEO | H-FARM
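On the crawling worry in that last question: one common approach is to exempt verified Googlebot from the geo-redirect. Google's documented verification is a reverse-DNS lookup on the visiting IP followed by a forward lookup, checking that the hostname ends in googlebot.com or google.com. A sketch, with the hostname check factored out so it can be tested without network access (the IPs and hostnames in the examples are illustrative):

```python
import socket

GOOGLEBOT_DOMAINS = (".googlebot.com", ".google.com")

def hostname_is_googlebot(hostname: str) -> bool:
    """True if a reverse-DNS hostname belongs to Google's crawler domains."""
    return hostname.rstrip(".").endswith(GOOGLEBOT_DOMAINS)

def verify_googlebot(ip: str) -> bool:
    """Reverse lookup, check the domain, then forward-confirm that the
    hostname resolves back to the same IP (Google's documented steps)."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
        if not hostname_is_googlebot(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        return False

print(hostname_is_googlebot("crawl-66-249-66-1.googlebot.com"))            # True
print(hostname_is_googlebot("ec2-203-0-113-7.compute-1.amazonaws.com"))    # False
```

Visitors that pass this check would be served the page as-is, while unverified US IPs get the redirect; a spoofed User-Agent alone never passes, because the suffix check runs on the DNS-derived hostname, not on anything the client sends.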