Our organic homepage traffic just spiked from its typical under-20 visitors per weekend to about 820 -- what could be causing this?
-
Website: http://www.myinjuryattorney.com
Our homepage typically receives under 20 organic visitors per weekend, but when I checked traffic this morning it was at a whopping 821 for Saturday and Sunday combined, and it's already at 212 today.
I strongly suspect this is fake traffic: there were about 818 drop-offs after visiting the homepage, an 84.41% bounce rate, and an average session duration of 5 seconds. Our typical metrics (last weekend, for example) were 13 visitors to the homepage, a 38% bounce rate, and an average session duration of 1 minute 26 seconds.
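For what it's worth, the reasoning above -- volume exploding while engagement collapses against a baseline period -- can be sketched as a quick sanity check. The thresholds below are illustrative assumptions, not official spam-detection rules:

```python
# Rough heuristic: a spike looks bot-like when volume explodes while
# engagement collapses relative to a baseline period. Thresholds are
# illustrative assumptions, not anything official.

def looks_like_bot_traffic(visitors, baseline_visitors,
                           bounce_pct, baseline_bounce_pct,
                           avg_session_secs, baseline_session_secs):
    volume_spike = visitors > 10 * max(baseline_visitors, 1)
    bounce_jump = bounce_pct > baseline_bounce_pct + 25   # percentage points
    session_crash = avg_session_secs < baseline_session_secs / 5
    return volume_spike and (bounce_jump or session_crash)

# Numbers from the question: 821 visitors vs a 13-visitor baseline,
# 84.41% bounce vs 38%, 5-second sessions vs 86 seconds.
print(looks_like_bot_traffic(821, 13, 84.41, 38, 5, 86))  # True
```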
Does anyone know who or what could be causing this? Could it be a competitor using negative SEO of some sort? Thank you in advance.
-
Hi Rick, sorry for the hiatus, I have a couple other questions for you.
1. Have you set up conversion tracking? Has there been an increase in conversions?
2. Do you have any campaigns running? Print, broadcast, radio, etc.? Many offline campaigns cause a boost in organic searches for my clients.
-
Hi Brett - I was able to go into this filter and I didn't see anything out of the ordinary.
-
Hi Rick,
Since I haven't seen a response yet, I'm assuming I wasn't clear enough in my explanation, so I went into an unfiltered view for one of my clients, found some ghost spam, and captured it in Skitch so you could see how to get there and examine it yourself on your own website. Hope this helps!
-
Not just yet. Click on the secondary dimension drop-down and type in "hostname", or find it under the Behavior section. You can also look at just Google traffic by clicking on Google first, then setting the hostname as the secondary dimension. It should become apparent at that point if you have a lot of bots spoofing your traffic with a fake source.
-
Hi Brett - thank you! Do I have this set up right? I'm just seeing normal sources from what I can tell. https://www.screencast.com/t/t9VW5tSz
-
Yes, because this filter is based on the hostname. If a bot is spoofing the source but does not have a valid hostname (and most will not), then it will be filtered out by the include filter. Go into your GA data, open the Source/Medium report under Acquisition, and set the secondary dimension to hostname.
If you're seeing something like (not set) next to Google/Organic traffic in the source, then that's spam. I've got some in my unfiltered views as well. From the article I sent you:
"On the other hand, valid traffic will always use a real hostname. In most of the cases, this will be the domain. But it also can also result from paid services, translation services, or any other place where you've inserted GA tracking code."
So just make sure you compile a list of all the valid hostnames for your website and you should be fine.
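As a rough sketch of that check in code -- assuming you've exported the Source/Medium report with hostname as the secondary dimension into (source/medium, hostname, sessions) rows; the hostnames and numbers below are made up for illustration:

```python
# Hypothetical list of every hostname where your GA tracking code
# legitimately runs -- build your own from your unfiltered data.
VALID_HOSTNAMES = {
    "www.myinjuryattorney.com",
    "myinjuryattorney.com",
    "translate.googleusercontent.com",  # e.g. translation services
}

def split_spam(rows):
    """Separate rows with a valid hostname from likely ghost spam."""
    valid, spam = [], []
    for source_medium, hostname, sessions in rows:
        bucket = valid if hostname in VALID_HOSTNAMES else spam
        bucket.append((source_medium, hostname, sessions))
    return valid, spam

rows = [
    ("google / organic", "www.myinjuryattorney.com", 13),
    ("google / organic", "(not set)", 818),  # spoofed source, no real hostname
]
valid, spam = split_spam(rows)
print(spam)  # [('google / organic', '(not set)', 818)]
```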
-
Hi Brett,
Thank you for the info. Would all of this still apply if the traffic is considered organic and not referral?
-
Hi Rick,
Try checking your traffic against the secondary dimension "hostname". If a large number appear to be invalid hostnames, then you've got yourself an answer. This kind of fake traffic, known as ghost spam, can be removed with an include filter. Moz wrote a great guide on how to do this here: https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
If you're at all concerned that the traffic could be ghost spam and you don't have this filter in place, then an easy way of checking is to implement the filter on a test view and see how it impacts your data. Just make sure you create a new view to test it on first, because I once had a client accidentally exclude all of his valid hostnames and lose every last bit of actionable data in that view.
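A minimal sketch of what that include filter does, assuming (per the Moz guide) that GA's hostname include filter takes a regular expression built from your valid hostnames; the hostnames here are placeholders for your own list:

```python
import re

# Placeholder hostnames -- substitute every hostname where your
# tracking code legitimately fires.
valid_hostnames = [
    "www.myinjuryattorney.com",
    "myinjuryattorney.com",
    "translate.googleusercontent.com",
]

# The include filter field takes a regex, so join the hostnames
# with "|" and escape the dots.
include_pattern = "|".join(re.escape(h) for h in valid_hostnames)

def passes_filter(hostname):
    """Return True if the hit's hostname would survive the include filter."""
    return re.search(include_pattern, hostname) is not None

print(passes_filter("www.myinjuryattorney.com"))  # True
print(passes_filter("(not set)"))                 # False -- ghost spam filtered out
```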
Hope this helps!
-
Have you checked the landing pages associated with those keywords? That way you would hopefully be able to see which pages are trending at the moment and driving the extra traffic. A big jump stands out in the graphs, but in the end an extra 800 searches a day is not that much.
-
I noticed a few months ago that this type of spam traffic was showing up not just under referral but also under organic in GA. As far as I'm concerned, it's just another problem plaguing GA/GWMT.
Matt
-
Hi Martijn,
I'm checking now, and for some reason it's not reflecting the high number of visitors. All of the queries also seem normal, and it shows that none has been repeated more than 5 times. There are, however, a ton of different but pretty normal-looking ones appearing. Any additional insight given that info? Thanks!!
-
Hi Matt, thanks for the quick answer! All of this traffic is actually showing up under organic rather than referral.
-
Sounds like you are experiencing "referral spam", which is essentially a spammy way of advertising domains and services. Have you checked the sources in Google Analytics?
Here are a few links to help you understand and fix the issue:
- https://moz.com/blog/how-to-stop-spam-bots-from-ruining-your-analytics-referral-data
- If you have GA: https://support.google.com/analytics/answer/1034842?hl=en
Good Luck,
Matt
-
Hi Rick,
If you've connected Google Search Console to your site, you should be able to see in the Search Analytics data which keywords triggered the traffic. It could always be fake traffic, but sometimes you just get lucky with certain keywords that you suddenly rank for.
Martijn.
Related Questions
-
I lost traffic from my website and the rankings have also gone down... What should I do?
I recently started working on a project (about 20 days in) that averaged 1,800 visitors per day, and the rankings seemed good. When I started, I saw that too many plugins were installed, so I removed the unnecessary ones and kept the important ones, and I began modifying some pages from an SEO perspective. A few days ago I created a backlink on Reddit, which sent so much traffic to the site at once that the server went down, so I had to change servers. Now I see a drastic drop in both rankings and traffic. I'm wondering whether the rankings were affected when I changed servers. Or is there any other way to check why my rankings and traffic went down?
White Hat / Black Hat SEO | HuptechWebseo
Help identifying cause for total rank loss
Hello, Last week I noticed one of my pages decreased in rank for a particular query from #8 to #13. Although I had recently made a few minor edits to the page (added an introductory paragraph and a left-column promo to increase word count), I thought the reason for the decrease was a few newly ranked pages that I hadn't seen before. In an attempt to regain my original position, I tried to optimize the meta title for the singular form of the word. After making this change, I fetched and rendered the page as Google (status = partial) and submitted the page for indexing (URL only, not including on-page links). Almost immediately after submitting, the page dropped from #13 to out of the top 50. I've since changed the meta title back to what it was originally and let Google crawl and index the page on its own, but the page is still not in the top 50. Could the addition of the page description and left-column promos have tipped the scales of keyword stuffing? If I change everything back to the way it was originally, is it reasonable to think I should regain my original position below the new pages? Any insights would be greatly appreciated!
White Hat / Black Hat SEO | jmorehouse
Site De-Indexed except for Homepage
Hi Mozzers,
Our site has suddenly been de-indexed from Google and we don't know why. All pages are de-indexed in Google Webmaster Tools (except for the homepage and sitemap), starting after 7 September. Please see the screenshot attached to show this: 7 Sept 2014 -- 76 pages indexed in Google Webmaster Tools; 28 Sept until current -- 3-4 pages indexed in Google Webmaster Tools, including the homepage and sitemaps. Site is: (removed). As a result, all rankings for child pages have also disappeared in the Moz Pro Rankings Tracker. Only the homepage is still indexed and ranking. It seems like a technical issue is blocking the site. I checked robots.txt, noindex, nofollow, canonicals, and a site crawl for any 404 errors, but can't find anything. The site is online and accessible. No warnings or errors appear in Google Webmaster Tools. One recent change is that we moved from a shared to a dedicated server around 7 Sept (using the same host and location). Prior to the move, our preferred domain was www.domain.com WITH www. However, during the move, the host set our domain as domain.tld WITHOUT the www. Running a site:domain.tld vs site:www.domain.tld command now finds pages indexed under the non-www version, but no longer under the www version. Could this be a cause of the de-indexing? Yesterday we had our host reset the domain to use www again and we resubmitted our sitemap, but there is no change yet to the indexing. What else could be wrong? Any suggestions appreciated. Thanks.
White Hat / Black Hat SEO | emerald
Is this traffic drop due to cutting backlinks or Penguin 2.0? (Graphs attached)
I've attached both graphs of the traffic drop. Our website rankings have been steadily declining since May of 2013. We have mostly return customers, or our drop would have been much more severe. There have never been any warnings in GWT. We cut a bunch (but not all) of our paid links in May of 2013. We didn't have a manual penalty or anything; we just wanted to see what happened if we moved towards being white hat. When our rankings plummeted, we quit cutting links. We currently have about 30% paid links. Penguin 2.0 was May 22, 2013. In looking at these graphs, was it our cutting links that caused the traffic drop, or was it Penguin 2.0? I'm looking for people who have experience diagnosing a "Unique Visits" Google Analytics graph for Penguin, and who know what happens when you cut links. It looks like, in viewing the graphs, that May 23 was the day the big drop happened, but you guys have more experience with this than me. Thank you.
White Hat / Black Hat SEO | BobGW
Do inbound links from forums hurt our traffic?
We have a manual action against us on Google webmaster tools for unnatural links. While evaluating our back links, I noticed that forums with low page rank/domain authority are linking to us. Is this hurting us?
White Hat / Black Hat SEO | imlovinseo
11,000 links from 2 blogs + many bad links = Penguin 2.0. What is the real cause?
Hello, A website has: 1/ 8,000 inbound links from one blog and 3,000 from another. They are clean, good blogs, and the links are NOT marked as nofollow. 2/ Many bad links from directories that have been deindexed or penalized by Google. On the 22nd of May, the website got hurt by Penguin 2.0. The link profile contains many directories and articles. The priority so far has been getting the bad links deindexed; however, should we nofollow the blog links as well? Thanks!
White Hat / Black Hat SEO | antoine.brunel
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site -- Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of a single home page, ~50 "category" pages, and ~425 "book list" pages. (Those 50 and 425 numbers both started out much smaller and grew over time, but have been at those levels for the last year or so as I've focused my time elsewhere.) On Friday, June 15 we made a pretty big change to the site -- we added a page for every author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ...but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116 Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (Ok, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off over the last week, dropping lower and lower every day as if Google realized it was repurposed content from elsewhere on our site...) Here's the problem: For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends.
This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday, and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem... So: 1. Do you think the drop is related to my upping my pagecount 8-fold overnight? 2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that only list one author (which would be the vast majority)? 3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books" it's better they land on a page of ours that pulls together the 4 books we have than on a page that happens to have just one of her books among 5 or 6 others by other authors. What else? Thanks so much, help is very appreciated. Peter
White Hat / Black Hat SEO | petestein1
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉
Any recent discoveries or observations on the "Official Line" of incoming link penalization?
I know this is always a contentious issue, and that the official, or shall we say semi-official, line is that you can't be penalized for incoming links, as you can't control who links to you (aside, of course, from link buying and other stuff that Google feels it can work out). I was wondering if anyone had any recent discoveries or observations on this? Obviously there's the problem that is usually brought up where you could damage a competitor by link building to them with spammy links, etc... hence the half-denial of it being an issue... but has anyone seen or heard anything on it recently, or experienced something relevant?
White Hat / Black Hat SEO | SteveOllington