Direct traffic is up 2100% (due to a bot/crawler I believe)
-
Hi,
The direct traffic to my website, www.webgain.dk, has increased by over 2100% recently. I can see that most of it comes from the US (my target audience is in Denmark and the website is in Danish).
What can I do about this? All this traffic gives my website a bounce rate of 99.91% for direct traffic. I believe it is some sort of bot/crawler.
-
Already done. They also included the tip in their newsletter for beta-testers.
-
You might want to let them know about this, so they can add it to their documentation and future users know what is up before panicking.
-
Follow-up: I have fixed this now. It was a monitoring tool by Digicure, where I have signed up as a beta tester. Their platform checks the website like a normal visitor from servers around the world (in my test case, Denmark and California), so it looked like normal direct traffic in my data. I excluded their stated server IP addresses in my Google Analytics filters and that helped. Thanks again, guys, for the help.
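For anyone fixing the same thing: the GA exclude filter accepts a regular expression, so several server IPs can go into one pattern. A quick sketch for sanity-checking such a pattern before saving the filter (the addresses here are placeholders, not Digicure's actual servers):

```shell
# Placeholder IPs of the monitoring service; substitute the addresses
# the vendor publishes before using this in a real GA exclude filter.
pattern='^(198\.51\.100\.7|203\.0\.113\.9)$'

# Check that the pattern matches the intended addresses and nothing else.
echo "198.51.100.7"  | grep -Eq "$pattern" && echo "excluded"
echo "198.51.100.99" | grep -Eq "$pattern" || echo "kept"
```

Testing the regex like this first avoids accidentally filtering out real visitors whose addresses merely start with the same octets.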
-
Thank you all for your great advice. I will follow it and see how it works.
-
If you are running WordPress, also check which page or pages are being accessed. I have had bots hammer my wp-login like that before. If that is the case, harden your installation; one thing I have found that stopped it was setting a deny rule in the .htaccess for wp-login / wp-admin.
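A minimal sketch of that kind of deny rule, assuming Apache 2.4 syntax and a single trusted login address (the IP below is a placeholder you would replace with your own):

```apache
# Block wp-login.php for everyone except a trusted IP (placeholder address).
<Files "wp-login.php">
    Require ip 203.0.113.10
</Files>
```

wp-admin is a directory rather than a file, so it takes a similar `Require` block in its own .htaccess; test from a second connection before relying on it, so you don't lock yourself out.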
-
I was having the same problem (for me it seemed to be Bing's ads bot). I used the guide below and it seems to filter out most of the bot visits.
-
I would check the service providers first, just to know for sure whether they're all coming from the same provider. You can check this in the Audience > Technology > Network report on the left side of Google Analytics. If you see the same network and browser being used, I would use a filter (only if you're really determined / 100% sure that it's bot traffic) to get them completely out of your Google Analytics view.
-
It's weird that the bot is accepting cookies, but with a bounce rate that high, I agree it's probably something automated (though it could be people who were looking for something else, or who were directed there accidentally by an email or an app). You can look through your logs to see the IP addresses and then do as Atakala says and block the traffic, if you're worried about bandwidth. You can also just filter it out in GA by excluding US traffic (if you're worried about your analytics being messed up).
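If you do dig through the logs, a quick sketch of counting hits per client IP, assuming the common combined log format where the IP is the first field (the sample entries below are made up):

```shell
# Create a tiny sample access log; in the combined log format the
# client IP is the first whitespace-separated field on each line.
cat > /tmp/access.log <<'EOF'
198.51.100.7 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 512
198.51.100.7 - - [01/Jan/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 512
203.0.113.9 - - [01/Jan/2024:00:00:03 +0000] "GET /about HTTP/1.1" 200 256
EOF

# Count requests per IP and list the heaviest hitters first.
awk '{print $1}' /tmp/access.log | sort | uniq -c | sort -rn | head -n 10
```

An address dominating that list with thousands of requests and a single user agent is a strong sign of a bot; point the same one-liner at your real log path instead of the sample file.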
-
It's probably AWS, Amazon's rentable infrastructure, which people often use to run crawlers.
If it costs you too much, then try to ban it.
Related Questions
-
Targeting/Optimising for US English in addition to British English (hreflang tags)
Hi, I wonder if anyone can help? We have an e-commerce website based in the UK. We sell to customers worldwide. After the UK, the US is our second-biggest market. We are English-language only (written in British English); we do not have any geo-targeted language versions of our website. However, we are successful in selling to customers around the world on a regular basis. We have developers working on a new site due to launch in Winter 2021. This will include a properly managed site migration from our .net to a .com domain, with associated redirects etc. Management are keen to increase sales/conversions in the US before the new site launches. They have requested that we create a US-optimised version of the site, maintaining broadly the same content but dynamically replacing keywords. Example (clothing is not really what we sell):
- Replacing references to "trainers" with "sneakers"
- Replacing references to "jumpers" with "sweaters"
- Replacing the UK phone number with a US phone number

It seems the wrong time to implement a major overhaul of URL structure, considering the planned migration from .net to .com in the not-too-distant future. For example, I'm not keen to move British English content onto https://www.example.com/en-gb. Would this be a viable solution?
1. hreflang: non-US visitors directed to the existing URL structure (including en-gb customers): https://www.example.com/
2. hreflang: US-language version of the site: https://www.example.com/en-us/

As the UK is our biggest market, it is really important that we don't negatively affect sales. We have extremely good visibility in SERPs for a wide range of high-value, well-converting keywords. In terms of hreflang tags, would something like this work? Do we need to make reference to en-gb being on https://www.example.com/? This seems a bit of a half-way house. I recognise that there are also issues around the URL structure, which is optimised for British English/international English keywords rather than US English, e.g. https://www.example.com/clothing/trainers vs. https://example.com/clothing/sneakers. Any advice/insight/guidance would be welcome. Thanks.
International SEO | IronBeetle
-
Getting accurate Geo Location traffic stats in Google Analytics - HELP
One of our clients services the US and the UK, but having looked at the report over an extended period of time, we can see that the vast majority of traffic is coming from the US; e.g., our last report for March indicated that there were over 3,000 users in the US but only 6 in the UK. We know that Google Analytics works out a user's location based on where their IP is located, not their physical location, and that this means the data needs to be taken with a pinch of salt, as it won't always represent what you expect. That being said, we know that the traffic figures for Europe are largely inaccurate and would like to get some more accurate stats to report on. Is there a way to do so within Google Analytics?
International SEO | Wagada
-
International website sharing with .com/.au/.uk
I have a small business in the United States and would like to copy our main website for my international partners. My website is a .com. I think that their domains will end in their country codes: .au and .uk. We are open to using different domains. We plan to share blog articles and other content, but do not wish to be penalized for duplication. I have tried to read articles on this topic, but am unfamiliar with a lot of the terms. Is there any way to do this simply? Many thanks, Steph
International SEO | essential_steph
-
Redirect to 'default' or English (/en) version of site?
Hi Moz Community! I'm trying to work through a thorny internationalization issue with the 'default' and English versions of our site. We have an international set-up of: www.domain.com (in English), www.domain.com/en, www.domain.com/en-gb, www.domain.com/fr-fr, www.domain.com/de-de, and so on. All the canonicals and hreflangs are set up, except the English-language version is giving me pause. If you visit www.domain.com, all of the internal links on that page (due to the current way our CMS works) point to www.domain.com/en/ versions of the pages. Content is identical between the two versions. The canonical on, say, www.domain.com/en/products points to www.domain.com/products. It feels like we're pulling in two different directions with our internationalization signals: links go one way, the canonical goes another. Three options I can see:
1. Remove the /en/ version of the site. 301 all the /en versions of pages to /, and update the hreflangs to point EN-language users to the / version.
2. Redirect the / version of the site to /en. The reverse of the above.
3. Keep both the /en and the / versions, and update the links on the / version so that visitors follow links that don't take them to the /en site.

It feels like the /en version of the site is redundant and potentially sending confusing signals to search engines (it's currently a bit of a toss-up as to which version of a page ranks). I'm leaning toward removing the /en version and redirecting to the / version. It would be a big step, as currently (due to the internal linking) about 40% of our traffic goes through the /en path. Anything to be aware of? Any recommendations or advice would be much appreciated.
International SEO | MaxSydenham
-
International SEO Subfolders / user journey etc
Hi, according to all the resources I can find on Moz and elsewhere on international SEO (say, in the context of having duplicate US and UK versions of a site), it's best to have subfolders, i.e. domain.com/en-gb/ and domain.com/en-us/. However, when it comes to the user journey and promoting a web address, it seems a bit weird to say "visit us at domain.com/en-us/"!? And what happens if someone just enters domain.com from the US or UK? My client wants to use an IP sniffer, but I've read that's bad practice and that we should employ the above style of country/language code instead; I'm confused about both the user journey and the experience in the case of multiple subfolders. Any advice much appreciated. Cheers, Dan
International SEO | Dan-Lawrence
-
3 month old site lost almost complete traffic overnight
Hi All, I started an Indian coupon and deal site, http://www.couponspy.in/, around 3 months ago, and traffic increased almost daily. But yesterday my site lost almost all of its traffic. Keywords which ranked 1-5 lost around 4-15 places, and keywords which ranked 6-20 lost around 20-50 places. The Moz Crawl Diagnostics don't indicate any major issues. Has there been a Google Panda update in India? Reasons why my site has been affected? Please help!!!! 😉 I have seen the same traffic decrease on other coupon start-ups, e.g. https://www.cuponation.in/. Did we all make the same mistake? Any guesses?
International SEO | ParvatiSingh
-
Best URL structure for Multinational/Multilingual websites
Hi, I am wondering what the best URL format is when a website targets several countries in several languages (without owning the local domains, only a .com, and ideally using sub-folders rather than sub-domains). As an example, to target a hotel in Sweden (Google.se), are there any must-have indicators in the URL to target the relevant countries, such as hotelsite.com/se/hotel-name? Would this represent the language? Or is it the location of the product? To clarify a bit, I would like to target around 10 countries, with the product pages each having 2 languages (the local language + English). I'm considering using the following format: hotelsite.com/en/hotel-name (for English) and hotelsite.com/se/hotel-name (for the Swedish content of that same product), then using rel="alternate" hreflang="se-SV" markup to target the /se/ page for Sweden (Google.se) and rel="alternate" hreflang="en" for the UK, and also geotargeting those /se/ folders in Webmaster Tools. Would this be sufficient? Or does there need to be an indicator of both the location AND the language in the URLs? I mean, would the URLs need to be hotelsite.com/se/hotel-name/se-SV (for Swedish), or can it just be hotelsite.com/se/hotel-name? Any thoughts on best practice would be greatly appreciated.
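For what it's worth, a sketch of what that markup usually looks like; note that hreflang codes are ordered language-REGION, so Swedish for Sweden would be sv-se rather than se-SV (the URLs follow the hypothetical structure described above):

```html
<!-- Placed in the <head> of hotelsite.com/en/hotel-name,
     and mirrored identically on the /se/ version of the page. -->
<link rel="alternate" hreflang="en" href="https://hotelsite.com/en/hotel-name" />
<link rel="alternate" hreflang="sv-se" href="https://hotelsite.com/se/hotel-name" />
<link rel="alternate" hreflang="x-default" href="https://hotelsite.com/en/hotel-name" />
```

The x-default entry tells search engines which version to serve to users who match neither language; the same set of link tags must appear on every alternate version for the annotations to be valid.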
International SEO | pikka
-
Global SEO - How quickly/aggressively should one expand into multiple countries?
SITUATION: Our client is a global company lacking a global presence, so naturally the idea is to perform international/global SEO in each country. For benchmarking purposes, our plan is to focus on a select number of keywords (i.e. 8-15) for each country and begin link building within each respective country. All SEO effort (i.e. link building) will be for sub-folders (i.e. www.client.com/subfolder/) on the same top-level domain. Note, each country may have multiple languages, so each language will be broken out as its own unique SEO campaign with its very own strategy and link-building efforts. For example: Mexico has 2 languages (English & Spanish) and will be considered 2 separate campaigns. PROBLEM: The client wants to be extremely aggressive and perform SEO in 3 new countries every month. This amounts to 36 new countries/SEO campaigns per year. Assuming each country has 2 languages, we are looking at 6 SEO campaigns per month, or 72 per year. Our concern is that since all SEO effort will be performed on the same top-level domain, we may be growing too fast and the search engines may consider the addition of these new pages and links to be too 'spammy'. We'd love to hear some feedback or personal experience on what might be considered a 'safe' or 'healthy' expansion into different countries. Thanks!
International SEO | ByteLaunch