Direct To Site Traffic Decline 2
-
This is an update on a post I made a few weeks ago. I noticed a significant drop in direct traffic this year, specifically from Chrome 43.0. I wanted to include data to get a deeper perspective, so I have included data on the first 15 weeks of 2016 and 2017. It seems like a spam bot, but I would like to hear other opinions. Thank you!
-
It definitely looks like you've got yourself some bot traffic, my friend. Here are a few examples that stand out to me the most:
- Over 99% of visitors to your website are considered first-time visitors.
- Your Bounce Rate is at 99%.
- Pages per Session is barely over 1.
- Avg. Session Duration is 2 seconds.
All of the points listed above are tell-tale signs of bot traffic. 99% new visitors for direct traffic doesn't make much sense: it would mean almost every user is finding you for the first time by typing in your exact URL. A 99% Bounce Rate also doesn't make much sense unless the page the users are landing on isn't working. Additionally, the average user stays on the page for only 2 seconds and then exits the site. Obviously, this isn't realistic. To protect yourself against some of the bot traffic, make sure "Exclude all hits from known bots and spiders" is checked in the "View Settings" of your Google Analytics.
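To make the checklist above concrete, here is a minimal sketch of the same logic in Python. The segment dictionary and its field names are hypothetical (not a real Google Analytics export format); the thresholds mirror the numbers discussed above:

```python
# Minimal sketch of the bot-traffic checklist above. The segment dict and its
# field names are hypothetical, not a real Google Analytics export format.

def looks_like_bot_traffic(segment):
    """Return True when several tell-tale bot signals occur together."""
    signals = [
        # ~all visitors are "new" -- implausible for direct traffic
        segment["new_users"] / max(segment["users"], 1) >= 0.99,
        segment["bounce_rate"] >= 0.99,            # nearly every visit bounces
        segment["pages_per_session"] < 1.1,        # barely more than one page
        segment["avg_session_duration_sec"] <= 2,  # gone within seconds
    ]
    # Any single signal can be legitimate; three or more together point to bots.
    return sum(signals) >= 3

# Numbers roughly matching the question's Chrome 43.0 direct-traffic segment:
chrome_43_direct = {
    "users": 10_000,
    "new_users": 9_950,
    "bounce_rate": 0.99,
    "pages_per_session": 1.02,
    "avg_session_duration_sec": 2,
}
print(looks_like_bot_traffic(chrome_43_direct))  # prints True
```

The point of requiring several signals at once is that each one alone has an innocent explanation (a viral landing page can legitimately bounce hard, for instance), but all of them together rarely describe human visitors.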
Hope this helps some!
Related Questions
-
What is the best strategy to SEO Discontinued Products on Ecommerce Sites?
RebelsMarket.com is a marketplace for alternative fashion. We have hundreds of sellers who have listed thousands of products. Over 90% of the items do not generate any sales, and about 40% of the products have been on the website for over 3 years. We want to clean up the catalog and remove all the listings older than 2 years that do not generate any sales. What is the best practice for removing thousands of listings from an ecommerce site? Do we 404 these products and show similar items? Your help and thoughts are much appreciated.
White Hat / Black Hat SEO | | JimJ3 -
Whitehat site suffering from drastic & negative Keyword/Phrase Shifts out of the blue!
I am the developer for a fairly active website in the education sector that offers around 30 courses, publishes to its blog a few times a week, and maintains social profiles. The blog doesn't have comments enabled, and the typical visitor is looking for lessons or a course. Over the past year we have actively developed the site to keep it up to date, fast, and following modern best practices: SSL certificates, quality content, relevant and high-authority backlinks, etc. Around a month ago we were hit by quite a large drop in our ranked keywords/phrases, which shocked us somewhat. We attributed it to Google's algorithm change muddying the waters, as it did settle up a couple of weeks later. However, this week we have been hit again by another large change, dropping almost 100 keywords, some by very large positions. My question is quite simple (I wish)... What gives? I don't expect to see drops this large when we're not doing anything negative, and I'm not sure it's an algorithm change, as my other clients on Moz don't seem to have suffered. So it's either isolated to this target area, or it's an issue with something occurring to or on the site?
White Hat / Black Hat SEO | | snowflake740 -
How do I know for sure if my site has been slapped?
I'm new to this SEO business, and I focus on inbound marketing. My client's site is www.SubconsciousMind.com. Just a few weeks ago it was showing in the top search results for several major keywords. Now it has disappeared altogether, and competitors with very little SEO (meta tags not set up properly, etc.) are showing instead. So I know there has to be an opportunity. Some obvious things:
White Hat / Black Hat SEO | | SoundsLikeJoy
There aren't a lot of links, and the majority seem to be bad (spammy link farms)
Social media is set up by a robot
Articles are poorly written, obviously ONLY for SEO My client hired an SEO company awhile back to get results, not understanding black hat vs. white hat, and it worked for several years. Now it is really hurting her. The sites linking to her don't have any contact info to request that the links be removed. I've read I can use the disavow tool. I asked her if she got anything from Google about "being slapped". She doesn't even receive emails for her site, because she trusted someone else to set it all up. Should I rebuild from scratch? Any recommendations? We are running AdWords now as a quick fix.
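For reference, the disavow file mentioned above is just a plain-text list that you upload through Google's disavow links tool. A minimal sketch (the domains below are made-up examples, not a recommendation of what to disavow):

```text
# Lines starting with "#" are comments.
# Disavow every link from an entire domain:
domain:spammy-link-farm-example.com
domain:bad-directory-example.net
# Or disavow individual URLs:
http://example-blog-network.com/paid-link-page.html
```

Google treats disavow as a strong suggestion rather than a guarantee, and disavowing good links can hurt, so it is usually paired with a manual review of the worst offenders first.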
Why is a site that does all the wrong things dominating?
A competitor of ours is basically dominating the search results despite doing everything you're not supposed to do, including: purchasing links; having content that is thin, templated, and duplicated, adding little value; and owning half a dozen other sites that all link to each other (a link wheel?). We spend a lot of time on our content and on making it as useful as it can be for our visitors. Granted, our site is newer, but we avoid these gray/black hat practices and yet we're not ranking nearly as high. What gives?
White Hat / Black Hat SEO | | Harbor_Compliance0 -
Our site has too many backlinks! How can we do a bad backlink audit?
Webmaster Tools is saying we have close to 24 million links to our site. The site has been around since the mid-90s and has accumulated these links ever since. We also have our own network of sites whose templates link to our main site. I'm fighting to get these links nofollowed, but upper management seems scared to alter this practice. This past year our rankings have dropped significantly, and we suspect it's due to some spammy backlinks, or that we're being penalized for an accidental link-scheme network. 24 million links are too many to check manually for the disavow tool, and the bulk backlink-checking services out there can't even come close. What's an SEO to do?
White Hat / Black Hat SEO | | seoninjaz0 -
Creating duplicate site for testing purpose. Can it hurt original site
Hello, we are soon going to upgrade the CMS to the latest version along with new functionality; the process may take anywhere from 4 to 6 weeks. Since we need to work on the live server, the plan is: take an exact replica of the site and move it to a test domain (but on the live server); block Google, Bing, and Yahoo in robots.txt (User-agent: Google Disallow: /, User-agent: Bing Disallow: /, User-agent: Yahoo Disallow: /); upgrade the CMS and add functionality; test the entire structure, checking URLs with Screaming Frog or Xenu; then move the configuration onto the original domain. The upgrade and new tools may take 1 to 1.5 months. The concern is that, despite blocking Google, Bing & Yahoo through the user-agent disallow, the URLs can still be crawled by the search engines. If so, it may hurt the original site, since the test domain will read as an entire duplicate. Or is there an alternate way around this? Many thanks
White Hat / Black Hat SEO | | Modi1 -
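One note on the robots.txt sketched in that question: the user-agent tokens the major crawlers actually honor are "Googlebot", "Bingbot", and (historically for Yahoo) "Slurp", so "User-agent: Google" matches nothing. A simpler blanket block for a test domain looks like:

```text
# robots.txt for the test domain only -- blocks all compliant crawlers.
# ("User-agent: Google" is not a token Googlebot matches; use Googlebot,
# Bingbot, and Slurp if you must target engines individually.)
User-agent: *
Disallow: /
```

Also worth knowing: robots.txt only blocks crawling, not indexing, so a disallowed test URL can still end up indexed if other pages link to it. Putting the test domain behind HTTP authentication is the more reliable way to keep a staging copy out of the index entirely.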
Are duplicate item titles harmful to my ecommerce site?
Hello everyone, I have an online shopping site selling, amongst other items, candles. We have lots of different categories within the LED candles category. One route a customer can take is homepage > LED candles > Tealights. Within the tealights category we have 7 different products which vary only in colour. It is necessary to create separate products for each colour, since we have fantastic images for each colour. To target different keywords, at present we have different titles (hence different link texts, different URLs and different H1 tags) for each colour, for example "Battery operated LED candles, amber", "Flameless candles, red" and "LED tealights, blue". I was wondering if different titles to target different keywords is a good idea. Or is it just confusing to the customer, and should I stick with a generic item title which varies only by colour (e.g. "LED battery candles, colour")? If I do the latter, am I at risk of being downranked by Google for duplicating the product titles/link texts/URLs/H1 tags/img ALTs? (The description and photos for each colour are unique.) Sorry if this is a little complicated; please ask and I can clarify anything, because I really want to give the best customer experience but still preserve my Google ranking. I have attached screenshots of the homepage and categories to clarify; feel free to browse the live site too. Thank you so much, Pravin
White Hat / Black Hat SEO | | goforgreen0 -
From page 3 to page 75 on Google. Is my site really so bad?
So, a couple of weeks ago I started my first CPA website, just as an experiment to see how well I could do out of it. My rankings were getting better every day, and I've been producing constant unique content for the site to improve them even more. Two days ago my rankings went straight to the last page of Google for the keyword "acne scar treatment", but Google has not banned me or given my domain a minus penalty. I'm still ranking number 1 for my domain, and they have not dropped the PR, as my keyword page is still in the main index. I'm not even sure what has happened. Am I not allowed to have a CPA website in the search results? The best information I could find on this is: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=76465 But I've been adding new pages with unique content. My site is www.acne-scar-treatment.co Any advice would be appreciated.
White Hat / Black Hat SEO | | tommythecat1