Direct traffic is up 2100% (due to a bot/crawler I believe)
-
Hi,
The direct traffic to the website www.webgain.dk has increased by over 2,100% recently. I can see that most of it is from the US (my target audience is in Denmark and the website is in Danish).
What can I do about this? All this traffic gives my website a bounce rate of 99.91% for direct traffic. I believe it is some sort of bot/crawler.
-
Already done. They also included the tip in their newsletter for beta-testers.
-
You might want to let them know about this so they can add it to their documentation; that way future users will know what's going on before panicking.
-
Follow-up: I have fixed this now. It was a monitoring tool by Digicure, which I have signed up to beta test. Their platform checks the website like a normal visitor from servers around the world (in my test case Denmark and California), so it looked like normal direct traffic in my data. I excluded their stated server IP addresses in my Google Analytics filters and that helped. Thanks again, guys, for the help.
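For anyone who hits the same thing: the exclusion is just a custom Exclude filter on the IP Address field with a regex listing the monitoring servers' addresses. A rough sketch with placeholder IPs (not Digicure's real ones):

    Filter type:    Custom > Exclude
    Filter field:   IP Address
    Filter pattern: ^(203\.0\.113\.10|198\.51\.100\.25)$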
-
Thank you for all your great advice. I will follow it and see how it works.
-
If you are running WordPress, also check which page or pages are being accessed. I have had bots hammer my wp-login like that before. If that is the case, harden your installation; one thing I have found that stopped it was setting a deny rule in the .htaccess for wp-login / wp-admin.
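For reference, a rough sketch of that kind of rule, assuming Apache 2.4 (older versions use Order/Deny/Allow instead) and with 203.0.113.7 standing in for your own static IP:

    # In the site root .htaccess: lock wp-login.php down to your own IP
    <Files wp-login.php>
        Require ip 203.0.113.7
    </Files>

    # For /wp-admin, add a separate .htaccess inside that directory containing:
    # Require ip 203.0.113.7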
-
I was having the same problem (for me it seemed to be Bing's Ads bot). I used the guide below and it seems to filter out most of the bot visits.
-
I would check the service providers first, just to know for sure they're all coming from the same provider. You can check this in the Audience > Technology > Network report on the left side of Google Analytics. If you see the same network and browsers being used, I would use a filter (only if you're really determined / 100% sure that it's bot traffic) to get them completely out of your Google Analytics view.
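If the Network report does point to one provider, the exclude filter normally goes on the ISP Organization field (the filter counterpart of the Service Provider dimension in that report). A sketch with hypothetical provider names; match whatever names the report actually shows:

    Filter type:    Custom > Exclude
    Filter field:   ISP Organization
    Filter pattern: ^(amazon technologies inc\.|microsoft corporation)$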
-
It's weird that the bot is accepting cookies, but with a bounce rate that high, I agree it's probably something automated (though it could be people who were looking for something else or were sent there accidentally by an email or an app). You can look through your logs to see the IP addresses and then, as Atakala says, block the traffic if you're worried about bandwidth. You can also just filter it out in GA by excluding US traffic (if you're worried about your analytics being skewed).
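If you do dig through the server logs, here is a quick way to surface the noisiest IPs, assuming a standard Apache/Nginx access log with the client IP in the first column (the log path will vary by host):

    awk '{print $1}' access.log | sort | uniq -c | sort -rn | head -20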
-
It's probably AWS, Amazon's rentable cloud service that a lot of crawlers run from.
If it costs you too much, then try to ban it.
Related Questions
-
International SEO - Alternatives to Automatic IP Redirect
Hello, when doing international SEO I've read that it's not good practice to automatically redirect users to the correct part of the website based on their IP address. But what alternatives are there? Let's say you're targeting the US and the UK through multiregional SEO. What can you do to ensure that users from the US go to the US sub-directory and that users from the UK go to the UK sub-directory? Moz's international SEO guide says: "If you choose to try to guess at the user's language preference when they enter your site, you can use the browser's language setting or the IP address and ask the user to confirm the choice. Using JavaScript to do this will ensure that Googlebot does not get confused. Pair this with a good XML sitemap and the user can have a great interaction. Plus, the search engines will be able to crawl and index all of your translated content." Can anyone explain this further? Any help would be much appreciated! Thanks in advance
International SEO | SEOCT
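Regarding the JavaScript approach quoted above, a minimal sketch of "detect, then ask the user to confirm" rather than forcing a redirect; the /us/ and /uk/ paths and the wording are made up, so adapt them to your own subdirectory structure:

    // Rough sketch only: suggest the likely locale, but let the user decide.
    (function () {
      var lang = (navigator.language || "").toLowerCase(); // e.g. "en-us", "en-gb"
      var suggested = lang === "en-us" ? "/us/" : "/uk/";   // hypothetical subfolders
      if (window.location.pathname.indexOf(suggested) !== 0) {
        var banner = document.createElement("div");
        banner.textContent = "It looks like you might want our " +
          (suggested === "/us/" ? "US" : "UK") + " site: ";
        var link = document.createElement("a");
        link.href = suggested;
        link.textContent = "take me there";
        banner.appendChild(link);
        document.body.appendChild(banner); // a visible prompt, never an automatic redirect
      }
    })();

-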
Why is Google not indexing each country/language subfolder in the rankings?
Hi folks, We use Magento 2 for our multi-country shops (it's a multistore). The URL: www.avarcas.com. In the first days Google indexed the proper URL for each country: avarcas.com/uk, avarcas.com/de ... Some days later, all the countries are just indexing / (the root). I set the subfolders correctly in Webmaster Tools. What's happening? Thanks
International SEO | administratorwibee
-
Redirected traffic and SEO problem
Hi all, I have a bit of a search engine predicament and I can't find the answer anywhere. It's a bit of a complicated one, so please bear with me 🙂 ... I'm a freelance copywriter; I recently started the business and I've also recently moved to New Zealand. As such, I'm looking for business back in the UK (as that's where my network is), but also locally in NZ. I've purchased both the .co.uk and .co.nz domain names (http://www.inspirecontent.co.uk and http://www.inspirecontent.co.nz). The way the domain provider/host has set these up is for one to redirect to the other. Currently, if someone visits www.inspirecontent.co.nz it redirects to the UK site. That's less than ideal for me, because I don't want NZ traffic (i.e. potential leads) to think I'm a UK-based business. My questions are as follows: 1. Will the redirect to the UK domain prevent me from appearing in NZ search (i.e. if someone searches via google.co.nz)? I'm really struggling to rank at the moment; I'm working on more content, but if the redirect is a problem then I need to know about it so that I can find a workaround. 2. Any suggestions on the best approach to the workaround? It would be great if the URLs didn't change! So that you wind up on the UK site from the UK, and if you're from NZ, you land on and stay on the NZ domain, but I'm not sure how to achieve that. One option, I think, would be to have two different websites, hosted separately, but I hear that duplicated content is bad for SEO? Thanks all in advance. Kind regards
International SEO | Andrea_howey
-
International Confusion between .com and .com/us
A question regarding international SEO: we are seeing cases for many sites that meet these criteria:
- international sites where www.example.com/ redirects to a country site based on IP (e.g. www.example.com/ 301s to www.example.com/us)
- there is a desktop + mobile site (www.example.com + m.example.com)
The issue we see is that Google shows www.example.com/ in US search results instead of www.example.com/us. Since the .com/ redirects, there is no mobile version, and www.example.com/ also shows up in mobile SERPs instead of m.example.com/us. My questions are: 1. If www.example.com/ is redirecting users and Googlebot, why is Googlebot caching it with the content of www.example.com/us? 2. Why is www.example.com/ showing up in SERPs instead of www.example.com/us? 3. How can we help Google display www.example.com/us and m.example.com/us in SERPs instead of www.example.com/? Thanks!!
International SEO | FranFerrara
-
Duplicate Page Content due to Language and Currency
Hi folks, hoping someone can help me out please. I have a site that I'd like to rank in France and the UK, but I'm getting a stack of duplicate content errors due to English and French pages and GBP and EUR prices. Below is an example of how the home page is duplicated:
http://www.site.com/?sl=en?sl=fr
http://www.site.com/?sl=fr?sl=fr
http://www.site.com
http://www.site.com/?currency=GBP?sl=fr
http://www.site.com/?currency=GBP?sl=en
http://www.site.com/?sl=fr?sl=en
http://www.site.com/?currency=EUR?sl=fr
http://www.site.com/?currency=EUR?sl=en
http://www.site.com/?currency=EUR
http://www.site.com/?sl=en&currency=EUR
http://www.site.com/?sl=en&currency=GBP
http://www.site.com/?sl=en
http://www.site.com/?currency=GBP
http://www.site.com/?sl=en?sl=en
Each page has code that updates according to the page you are on. How do I simplify this, and what's the correct approach?
International SEO | Marketing_Today
-
Setting up IP Filters in Google Analytics - IPs ending with 0/24
Hi everyone, your help would be much appreciated with the following: I am trying to set up IP filters for our Google Analytics account to exclude internal traffic. We are located in multiple locations and each location has multiple IP addresses. The IP addresses we have end either in 0/24, which apparently means they cover a range from 0 to 255, or in 128/25. I have tried entering the IP addresses in different formats in the GA filter but they are apparently not valid. Example of one setup I tried: 1**.\2**.\8*.([0-256]). I have gone through Google's filter setup guide, but I must be doing something wrong, probably in how I set up the IPs ending with 0/24 and 128/25. If anyone could help me with how to set up these IP filters in Google Analytics, that would be great. The IP addresses look like the following (digits changed):
Location 1:
174.177.179.0/25
174.177.179.128/25
Location 2:
196.222.87.0/24
194.59.197.0/24
Thank you so much for your help, L.
International SEO | AlphaDigital2
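For reference, a custom Exclude filter on the IP Address field in GA takes a regular expression, so a CIDR range is written out over the last octet: 0/24 covers .0-.255, 128/25 covers .128-.255 and 0/25 covers .0-.127. A sketch using the masked 174.177.179.x prefix from the question as a placeholder:

    ^174\.177\.179\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])$   (the whole /24: .0-.255)
    ^174\.177\.179\.(12[89]|1[3-9][0-9]|2[0-4][0-9]|25[0-5])$             (the /25 starting at .128)

-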
International SEO Subfolders / user journey etc
Hi, according to all the resources I can find on Moz and elsewhere regarding international SEO, say in the context of having duplicate US & UK versions of a site, it's best to have subfolders, i.e. domain.com/en-gb/ & domain.com/en-us/. However, when it comes to the user journey and promoting a web address, it seems a bit weird to say "visit us at domain.com/en-us/"!? And what happens if someone just enters domain.com from the US or UK? My client wants to use an IP sniffer, but I've read that's bad practice and that you should use the above style of country/language code instead. I'm confused about both the user journey and the experience in the case of multiple subfolders. Any advice much appreciated? Cheers, Dan
International SEO | Dan-Lawrence
-
Multiple Regional Domains - such as .co.uk / .de etc for one brand
Hello, we are in the process of building version 2 of our site. Currently we have only one domain (i.e. xxxxx.com). Our target audience is distributed among various regions and speaks different languages, and we would like to know which will benefit us more: a) having one root domain and then folders based on automatic IP detection, so that, for example, a customer opening the website in Japan would see the domain as www.xxxx.com/jp; or b) having different domains, so in the above case it would be www.xxxx.co.jp. The content on the site will be different based on regional demand, so of course the language will be Japanese and the content will also be aligned with the Japanese community. We plan to start with 5 different markets (UK/US/AU, Japan, China, Germany, Spanish-speaking countries). We would appreciate it if you could suggest the best route to achieve the best results. Thank you, SK
International SEO | sidkumar