Finding out why you see a sudden spike in direct traffic sessions in your GA reports can be a bit tricky due to the lack of source information. However, we can make some safe assumptions by looking at the other metrics. The easiest way to analyze direct traffic is to group the data into two categories:
1. Relevant Traffic
-Relevant direct traffic comes from the loyal readers/visitors who go straight to your website by typing in your URL or who have your site bookmarked. This is the ideal scenario all of us would love to achieve. This type of traffic has a low bounce rate and spends a considerable amount of time on your website (good avg. time on page, multiple pages per session).
2. Irrelevant Traffic
-Internal traffic that is not being filtered, especially if you recently did heavy testing on the site. To avoid this, set up IP filters for you and your team, or block internal traffic with GTM and a cookie (a quick way to sanity-check your IP ranges is sketched below).
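If you go the IP-filter route, it helps to confirm which addresses actually count as internal before trusting the filter. Here is a minimal Python sketch of that check; the CIDR ranges are placeholders, not real office IPs:

```python
# Minimal sketch: check whether a visitor IP falls inside your internal
# ranges before relying on a GA IP filter. The ranges below are
# placeholder documentation addresses; swap in your team's real ones.
import ipaddress

INTERNAL_RANGES = [
    ipaddress.ip_network(r)
    for r in ("203.0.113.0/24", "198.51.100.42/32")  # placeholders
]

def is_internal(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in INTERNAL_RANGES)

print(is_internal("203.0.113.17"))  # True  -> should be excluded by the filter
print(is_internal("8.8.8.8"))       # False -> a real visitor
```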
-Bot direct traffic, which is the most common scenario and also the most complex to solve. Real users and bots can share some characteristics, so it is important to narrow down the traffic that comes only from spiders before filtering or segmenting it out.
Common characteristics of bot traffic:
- A sudden spike in direct visits.
- Default Channel Grouping: Direct
- Landing Page: most of the time it is your home page, usually represented by a forward slash (/) or /index.html
- Bounce Rate: usually very high, close to 100%
- Avg. Session Duration: very low, close to 0 seconds
- Pageviews: an average of 1 per session
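If you export this report to CSV, you can flag the rows that match this fingerprint programmatically. A rough Python/pandas sketch follows; the file name and column names are assumptions based on a typical GA export, so adjust them to whatever your export actually contains:

```python
# Rough sketch: flag rows matching the bot fingerprint above in a GA CSV
# export. File and column names are assumptions; Bounce Rate is assumed
# to be a numeric 0-1 fraction rather than a "95.00%" string.
import pandas as pd

df = pd.read_csv("ga_sessions_export.csv")  # hypothetical export file

bot_like = (
    (df["Default Channel Grouping"] == "Direct")
    & (df["Landing Page"].isin(["/", "/index.html"]))
    & (df["Bounce Rate"] >= 0.95)          # near-100% bounce
    & (df["Avg. Session Duration"] <= 1)   # near-zero seconds
    & (df["Pages / Session"] <= 1)
)

print(f"{bot_like.mean():.0%} of exported rows look bot-like")
print(df[bot_like].head())
```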
To find the bot trail, go to the Direct traffic report in Analytics, select the home page (/), and start adding different secondary dimensions to look for common patterns (see the API sketch after this list). The more patterns you find, the better!
Dimensions recommended to check:
- Browser/Browser version
- Operating system / OS version
- Browser size
- Service Provider (ISP) or Network Domain
- City
- Flash version
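If you prefer to pull this data programmatically, the same report can be requested through the Google Analytics Reporting API v4 (for Universal Analytics views). A minimal sketch, assuming a service-account key file and a placeholder view ID; swap the dimensions list to cycle through the checks above:

```python
# Sketch: pull Direct sessions landing on "/" broken down by browser and
# network domain via the Reporting API v4 (Universal Analytics).
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account-key.json",  # placeholder key file
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "XXXXXXXX",  # placeholder: your GA view ID
        "dateRanges": [{"startDate": "7daysAgo", "endDate": "today"}],
        "metrics": [
            {"expression": "ga:sessions"},
            {"expression": "ga:bounceRate"},
        ],
        # Swap in ga:operatingSystem, ga:browserSize, ga:city,
        # ga:flashVersion, etc. to check the other dimensions.
        "dimensions": [{"name": "ga:browser"}, {"name": "ga:networkDomain"}],
        # Direct traffic has medium "(none)"; restrict to the home page.
        "filtersExpression": "ga:medium==(none);ga:landingPagePath==/",
        "orderBys": [{"fieldName": "ga:sessions", "sortOrder": "DESCENDING"}],
    }]
}).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"], row["metrics"][0]["values"])
```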
Once you find one or more patterns from the previous step, you can use them to create an advanced segment to exclude this traffic. This way you are able to analyze true data that isn't skewed by bots and get an actual representation of your visitors' behavior.
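You would build the advanced segment itself in the GA interface, but you can apply the same exclusion to an export to sanity-check how much of the spike the pattern explains. A small illustrative sketch, with placeholder pattern values:

```python
# Sketch: exclude traffic matching a discovered pattern (here a
# hypothetical ISP domain and browser version -- placeholders, not real
# bot signatures) and compare headline numbers before and after.
import pandas as pd

df = pd.read_csv("ga_direct_report.csv")  # hypothetical export with secondary dimensions

bot_pattern = (
    (df["Network Domain"] == "suspicious-isp.example")
    & (df["Browser Version"] == "0.0")
)

clean = df[~bot_pattern]
print("Sessions before:", df["Sessions"].sum())
print("Sessions after excluding the pattern:", clean["Sessions"].sum())
```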
There are thousands of bots crawling the web for different purposes; there are good and bad bots. In extreme cases you will need to block them at the server level; hosting providers are usually very helpful with this kind of thing (a rough idea of what that looks like is sketched below).
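For illustration only, here is what a crude application-level block could look like as a Python WSGI middleware. In practice your host will usually block at the web server or firewall level instead, and the user-agent strings below are placeholders:

```python
# Illustrative sketch only: a tiny WSGI middleware returning 403 for
# user agents on a blocklist. The substrings are placeholders, not
# signatures of real bots.
BLOCKED_UA_SUBSTRINGS = ("BadBot", "scrapy-test")  # placeholders

def block_bots(app):
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        if any(s.lower() in ua.lower() for s in BLOCKED_UA_SUBSTRINGS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return app(environ, start_response)
    return middleware
```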
I hope this helps!