Direct traffic is up 2,100% (due to a bot/crawler, I believe)
-
Hi,
Direct traffic to my website, www.webgain.dk, has increased by over 2,100% recently. I can see that most of it comes from the US, even though my target audience is in Denmark and the website is in Danish.
What can I do about this? All this traffic gives my website a bounce rate of 99.91% for direct visits. I believe it is some sort of bot/crawler.
-
Already done. They also included the tip in their newsletter for beta testers.
-
You might want to let them know about this, so they can add it to their documentation and future users know what is going on before panicking.
-
Follow-up: I have fixed this now. It was a monitoring tool from Digicure, where I have signed up as a beta tester. Their platform checks the website like a normal visitor from servers around the world (in my test case, Denmark and California), so it looked like normal direct traffic in my data. I excluded their published server IP addresses in my Google Analytics filters, and that helped. Thanks again for the help, guys.
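For anyone doing the same: a Google Analytics "Exclude IP" filter accepts a regular expression, so a list of monitoring-server addresses can be collapsed into one pattern. A minimal sketch in Python (the IPs below are placeholders from the documentation ranges, not Digicure's actual addresses):

```python
import re

# Hypothetical monitoring-server IPs; substitute the addresses your tool publishes.
monitor_ips = ["203.0.113.10", "203.0.113.11", "198.51.100.7"]

# Escape the dots so "203.0.113.10" does not also match "203a0b113c10",
# and anchor the pattern so partial matches are rejected.
pattern = "^(" + "|".join(re.escape(ip) for ip in monitor_ips) + ")$"

def is_monitor(ip: str) -> bool:
    """True if the visiting IP belongs to the monitoring service."""
    return re.match(pattern, ip) is not None

print(pattern)                       # paste this into the GA filter field
print(is_monitor("203.0.113.10"))   # True: excluded
print(is_monitor("192.0.2.55"))     # False: a real visitor
```

The printed pattern is what goes into the GA filter's regex field.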
-
Thank you for all the great advice. I will follow it and see how it works.
-
If you are running WordPress, also check which page or pages are being accessed. I have had bots hammer my wp-login like that before. If that is the case, harden your installation; one thing I have found that stopped it was setting a deny rule in .htaccess for wp-login / wp-admin.
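For reference, the .htaccess rule mentioned above might look something like this on Apache 2.4 (a sketch; the IP shown is a placeholder for your own address):

```apache
# Allow wp-login.php only from your own IP (placeholder shown);
# everyone else gets a 403 instead of hammering the login page.
<Files wp-login.php>
    Require ip 203.0.113.50
</Files>
```

A `<Files>` block does not cover the /wp-admin/ directory; for that, a similar `Require` rule can go in an .htaccess file inside wp-admin itself.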
-
I was having the same problem (for me it seemed to be Bing's Ads bot). I used the guide below and it seems to filter out most of the bot visits.
-
I would check the service providers first, just to know for sure whether it is all coming from the same provider. You can check this by visiting the Audience > Technology > Network report on the left side of Google Analytics. If you see the same network and browsers being used, I would use a filter (only if you're 100% sure it's bot traffic) to get it completely out of your Google Analytics view.
-
It's weird that the bot is accepting cookies, but with a bounce rate that high, I agree it's probably something automated (though it could be people who were looking for something else, or who were directed there accidentally by an email or an app). You can look through your logs to see the IP addresses, and then do as Atakala says and block the traffic if you're worried about bandwidth. You can also just filter it out in GA by excluding US traffic (if you're worried about your analytics being skewed).
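Pulling the offending IP addresses out of a raw access log can be as simple as counting the first field of each line. A sketch assuming the common Apache/Nginx combined log format (the sample lines and IPs are fabricated):

```python
from collections import Counter

# A few sample access-log lines in combined log format, supplied inline;
# in practice you would read your real access.log instead.
log = """\
203.0.113.10 - - [01/Sep/2016:10:00:01 +0000] "GET / HTTP/1.1" 200 5123
203.0.113.10 - - [01/Sep/2016:10:00:02 +0000] "GET / HTTP/1.1" 200 5123
198.51.100.7 - - [01/Sep/2016:10:00:03 +0000] "GET /about HTTP/1.1" 200 2210
203.0.113.10 - - [01/Sep/2016:10:00:04 +0000] "GET / HTTP/1.1" 200 5123
"""

# In the combined log format the client IP is the first whitespace-separated field.
counts = Counter(line.split()[0] for line in log.splitlines() if line.strip())

# The heaviest hitters are the candidates to investigate or block.
for ip, hits in counts.most_common():
    print(ip, hits)
```

Any IP with an implausibly high hit count (and a 100% bounce pattern) is worth a reverse lookup before you block or filter it.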
-
It's probably AWS, Amazon's cloud platform, which anyone can rent to run crawlers.
If it costs you too much, then try to ban it.
Related Questions
-
Almost zero traffic outside Finland
Hello, I am becoming a bit clueless about our business website. Our site is doing really well in Finland and in Finnish. Even though our business is fairly new, we have been able to pass many of our competitors in search after only about a year of operating. What confuses and worries me, though, is that our English content is not ranking at all. The aim of the English content is to be general and to reach audiences worldwide. But as you can see in the attached image, we are doing really badly, for example in the UK, which is one of our main markets. I've been doing active keyword research, building high-quality natural links, and writing long, keyword-rich content on our blog, but our rankings still don't seem to change outside Finland. I would be interested to know what I am doing wrong and what the right steps would be to start improving the situation.
International SEO | tuomashaapala
-
Multiregional / Multilingual SEO - What do you do when there is no equivalent page?
Hello, we're building out a small number of pages for the US in a subfolder, .com/us. The idea is to show US-specific pages to users in that location. However, there are also a number of pages we will not be creating for the US, as they're not relevant there. I plan to geo-target the US folder to instruct the search engines that this subfolder should appear in US SERPs, but since geo-targeting isn't an exact science, there is a chance that US visitors may land on the non-US pages, which could give them a bad user experience. What should we do when a US user lands on a non-US page that has no US equivalent? Any help would be much appreciated!
International SEO | SEOCT
-
Shall I automatically redirect international visitors from www.domain.com to e.g. www.domain.com/es? What is best SEO practice?
We have chosen the one-domain approach, with our international site having different language versions in subdirectories of the main domain: www.domain.com/es, www.domain.com/it, etc. What is the SEO best practice for implementing the international index page? I see the following options:
1) Entering www.domain.com displays, without redirection, the index page in the user's language (e.g. based on IP or browser settings) at www.domain.com. Example: www.booking.com
2) Entering www.domain.com always shows the English index page. Additionally, one could display a message in the header, if the IP is from another country, with a link to the other language version. Example: www.apple.com
3) Entering www.domain.com always redirects automatically to the country-specific subdirectory based on IP. Example: www.samsung.com
Any thoughts/suggestions on what may be the best solution from an SEO perspective? For a user, I believe options 1) and 3) are preferable.
International SEO | lcourse
-
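The browser-based language detection in option 1 is usually driven by the Accept-Language request header. A simplified sketch in Python, using the /es and /it subdirectories from the question (it ignores quality values, which a production implementation should honor):

```python
def pick_language(accept_language: str, supported=("es", "it"), default="en"):
    """Very simplified Accept-Language parsing: walk the entries in order,
    ignore ;q= quality values, and return the first supported language."""
    for part in accept_language.split(","):
        lang = part.split(";")[0].strip().lower()
        primary = lang.split("-")[0]  # "es-ES" -> "es"
        if primary in supported:
            return primary
    return default

# The chosen code becomes the subdirectory, e.g. www.domain.com/<lang>/
print(pick_language("es-ES,es;q=0.9,en;q=0.8"))  # es
print(pick_language("da-DK,da;q=0.9"))           # en (fallback)
```

Whichever option is chosen, the bare www.domain.com should still be reachable by search-engine crawlers, which typically send no locale-specific headers.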
Huge increase in US direct visits to a UK site, why?
Hi all, my UK website usually gets around 10,000 direct visits ("Direct" in Analytics) per month, but for August this shot up to 24,000! The majority of these direct visits seem to be coming from the US, and as a result the bounce rate is through the roof at 84%. Why would my UK-based site suddenly receive huge numbers of US visits? Any ideas?
International SEO | MarkHincks
-
Redirect the main site to keyword-rich subfolder / specific page for SEO
Hi, I have two questions. Question 1: Is it worthwhile to redirect the main site to a keyword-rich subfolder / specific page for SEO? For example, my company's webpage is www.example.com. Would it make sense to 301-redirect the main site to www.example.com/service-one-in-certain-city? I am asking because I have learned that it is important for SEO to have keywords in the URL, and I was thinking we could do this to include the most important keywords in the subfolder / specific URL. What are the pros and cons of this? Should I create folders or pages just for the sake of keywords? Question 2: Most companies show their main URL as www.example.com when you access their domain. However, some multi-language sites show e.g. www.example.com/en or www.example.com/en/main when you type the domain into your browser. I understand it is common practice to use subdomains or folders to separate language versions. My question is about subfolders: is it better to show only the subfolder (www.example.com/en), or should the specific page's URL with keywords also appear after the subfolder (www.example.com/en/main or www.example.com/en/service-one-in-certain-city)? I don't really understand why some companies show only the subfolder of a language page and some show the page's URL after it. Thanks in advance, Sam
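For what it's worth, if someone did decide to send the bare domain to a specific page (question 1), the 301 itself is a one-liner in .htaccess on Apache. A sketch using the poster's example URL; whether the redirect is a good idea for SEO is a separate question:

```apache
# 301-redirect only the root URL, leaving all other paths untouched.
RedirectMatch 301 ^/$ /service-one-in-certain-city
```

`RedirectMatch` anchors on the path with a regex, so only an exact `/` request is redirected.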
International SEO | Awaraman
-
Best URL structure for multinational/multilingual websites
Hi, I am wondering what the best URL format is when a website targets several countries in several languages (without owning the local domains, only a .com, and ideally using subfolders rather than subdomains). As an example, to target a hotel in Sweden (Google.se), are there any must-have indicators in the URL to target the relevant countries, such as hotelsite.com/se/hotel-name? Would this represent the language, or the location of the product? To clarify a bit: I would like to target around 10 countries, with each product page in two languages (the local language + English). I'm considering the following format: hotelsite.com/en/hotel-name (for English) and hotelsite.com/se/hotel-name (for the Swedish content of the same product), then using rel="alternate" hreflang="sv-SE" markup to target the /se/ page at Sweden (Google.se) and rel="alternate" hreflang="en" for the UK, and also geo-targeting those /se/ folders in Webmaster Tools. Would this be sufficient? Or does the URL need to indicate both the location AND the language? I mean, would the URLs need to be hotelsite.com/se/hotel-name/sv-SE (for Swedish), or can they just be hotelsite.com/se/hotel-name? Any thoughts on best practice would be greatly appreciated.
International SEO | pikka
-
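One detail worth noting: hreflang codes are language-first, then region, so Swedish content aimed at Sweden is sv-SE (se-SV is not a valid code). Each page must also list all of its alternates, including itself. A sketch using the poster's hypothetical URLs:

```html
<!-- On hotelsite.com/en/hotel-name (the same block appears on the /se/ page) -->
<link rel="alternate" hreflang="en" href="https://hotelsite.com/en/hotel-name" />
<link rel="alternate" hreflang="sv-se" href="https://hotelsite.com/se/hotel-name" />
<link rel="alternate" hreflang="x-default" href="https://hotelsite.com/en/hotel-name" />
```

The x-default entry tells search engines which version to show users who match neither language.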
Site Spider / Crawler / Scraper Software
Short of coding up your own web crawler, does anyone know of, or have experience with, a good piece of software to run through all the pages on a single domain (and potentially on linked domains one hop away)? It could be either server- or desktop-based. Useful capabilities would include: scraping (XPath parameters), clicks from the homepage (site architecture), HTTP headers, multi-threading, use of proxies, a robots.txt compliance option, and CSV output. Anything else you can think of... Perhaps an opportunity for an additional SEOmoz tool here, since they do it already! Cheers!
Note: I've had a look at:
Nutch: http://nutch.apache.org/
Heritrix: https://webarchive.jira.com/wiki/display/Heritrix/Heritrix
Scrapy: http://doc.scrapy.org/en/latest/intro/overview.html
Mozenda (does scraping but doesn't appear extensible)
Any experience/preferences with these or others?
International SEO | AlexThomas
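On the robots.txt compliance point: Python's standard library already handles the parsing, so a home-grown crawler can check each URL before fetching it. A sketch with an inline robots.txt so it runs offline; in practice you would fetch the file from the target domain first (example.com and MyCrawler/0.1 are placeholders):

```python
from urllib import robotparser

# A sample robots.txt, supplied inline so the sketch runs without network access.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /wp-login.php
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check each candidate URL against the rules before crawling it.
for url in ("https://example.com/", "https://example.com/admin/secret"):
    print(url, rp.can_fetch("MyCrawler/0.1", url))
```

Scrapy has this built in (the ROBOTSTXT_OBEY setting), but the standard-library parser is enough if you roll your own fetch loop.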