A campaign ghost keeps returning to my Google Analytics - Help!
-
A couple of campaign tracking links were created on my homepage (leading to internal pages); these were removed a few weeks ago (100% removed from the site).
I understand there is a six-month window, and as long as a user returns (no matter from which source) they will be counted as a session against that campaign.
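For context, the links looked something like this (the path and campaign name here are made up, but that was the format): `https://www.oursite.com/pricing?utm_source=homepage&utm_medium=banner&utm_campaign=spring-promo`.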
Since these campaign links were set up in error, I hoped creating a fresh new view within Google Analytics would stop them appearing.
However, they are still showing as sessions, even in the new view (created after removing the campaign links in question).
Is there any way to stop this happening? I want to be able to report on sessions correctly.
Thanks,
Sam
-
Thanks Kristina,
I set up the filter in the following way:
Filter type - Custom (Search and Replace)
Filter field - Campaign name
Search string - 'nameofmycampaignimremoving'
Replace string - (left blank)
I figured that by entering the campaign name but not replacing it with anything, Google Analytics should now pick this traffic up as direct.
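To sanity-check my own logic, here's a rough sketch of what I understand the filter to be doing (Python, purely illustrative; GA applies this internally at processing time, and the field names are my own shorthand, not Google Analytics internals):

```python
# Illustrative only: a rough model of what a GA "Search and Replace"
# filter on the Campaign name field does to each incoming session.
# Field names ("campaign", "source", "medium") are my own shorthand.

SEARCH_STRING = "nameofmycampaignimremoving"
REPLACE_STRING = ""  # left blank, so the campaign name is wiped

def apply_filter(session):
    """Blank out the offending campaign name on a session record."""
    if SEARCH_STRING in session.get("campaign", ""):
        session["campaign"] = session["campaign"].replace(SEARCH_STRING, REPLACE_STRING)
    return session

session = {"campaign": "nameofmycampaignimremoving", "source": "(direct)", "medium": "(none)"}
print(apply_filter(session))  # campaign is now "", source/medium untouched
```

One thing the sketch makes obvious: only the campaign name is rewritten; the source and medium dimensions are left exactly as they were.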
The campaign is no longer showing in my traffic, so I assume it's worked.
Is there a better way of doing this?
Sam
-
Good luck!
-
Very interesting Kristina.
I think I've figured out what's going on.
Our product is a browser-based CRM hosted on a secure (HTTPS) server. I believe someone previously visited our site, clicked on one of the old campaign links, and is now either returning directly to our /login page or clicking the ? buttons within the CRM, which lead to the support articles on our website. Because the CRM sits on a secure server, I believe the referrer gets stripped, so those visits count as direct traffic rather than referrals.
So, as you mentioned, I will set up a filter to have these campaigns show up as direct.
Hallelujah!
Sam
-
Hi Sam,
First, to be clear, campaigns will be overwritten if visitors come from any other source; it's just direct traffic that the campaign parameter holds on to. Here's Google's direct quote from its article on campaigns and traffic sources:
Existing campaign or traffic source data for a particular user will be overwritten by new campaign or traffic source data, regardless of the configured timeout period.
If you look at the flowchart a bit below that quote, you'll see that Google starts by looking for new campaign data, then looks for traffic source data, then, if it doesn't find either of those, uses existing campaign data.
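If it helps, here's the gist of that flowchart as a little sketch (Python, purely illustrative; the function and field names are mine, not Google's):

```python
# Purely illustrative: the attribution precedence described above.
# Google checks for new campaign data first, then a new traffic
# source, and only falls back to the stored campaign if neither
# is present. Names here are mine, not Google's internals.

def attribute_session(new_campaign, referrer, stored_campaign):
    if new_campaign:            # fresh utm_campaign on the landing URL
        return new_campaign     # overwrites whatever was stored
    if referrer:                # arrived from another site or a search engine
        return f"referral: {referrer}"  # also overwrites the stored campaign
    if stored_campaign:         # a truly direct visit
        return stored_campaign  # the old campaign "ghost" survives here
    return "(direct) / (none)"

# A direct return visit keeps the old campaign alive:
print(attribute_session(None, None, "oldcampaign"))            # -> "oldcampaign"
# A visit from Google search overwrites it:
print(attribute_session(None, "google.com", "oldcampaign"))    # -> "referral: google.com"
```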
That means your theory could still be correct, but only if all of the visits you're still seeing come in as direct visits. You can check this theory with the % New Sessions column: if it's 0%, you're right, these are just returning visitors, and the best I can recommend is that you set up filters to make these show up as "direct." If it's not (and I suspect it's not, because I doubt returning visitors alone would add up to a number large enough for you to be concerned and reach out for help), you've still got some of those campaign URLs floating around in public.
Here's how I'd go looking for them:
- Use a third-party tool like Screaming Frog or DeepCrawl to triple-check that there are no internal links on your site with those old campaign parameters. CMSs can easily miss things like this, so an outside tool that just tries to find everything helps. (There's also a minimal do-it-yourself crawl sketch after this list.)
- Search for the original URLs + parameters in Google to see if any affiliates or coupon sites are using those links.
- Check your old emails - did you ever send out these URLs? It's possible that people are still accessing old emails.
- Was this a campaign that could have been shared in any other way? I know that my company often shares shortened URLs, which redirect to URLs with parameters appended. Have you shared any bit.ly or other aliased URLs that are appending those parameters you've tried to get rid of?
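If you'd rather roll your own check than use the crawlers above, here's a minimal sketch (Python with requests and BeautifulSoup; the start URL and campaign value are placeholders) that walks a site and flags internal links still carrying the old campaign parameter:

```python
# Minimal crawl sketch: find internal links that still carry an old
# campaign parameter. Start URL and campaign value are placeholders.
# Requires: pip install requests beautifulsoup4
from urllib.parse import urljoin, urlparse, parse_qs
import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"          # your homepage
OLD_CAMPAIGN = "nameofmycampaignimremoving"     # the utm_campaign value to hunt for

def crawl(start_url, old_campaign, max_pages=500):
    domain = urlparse(start_url).netloc
    seen, queue, offenders = set(), [start_url], []
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that fail to load
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            parsed = urlparse(link)
            if parsed.netloc != domain:
                continue  # only follow internal links
            params = parse_qs(parsed.query)
            if old_campaign in params.get("utm_campaign", []):
                offenders.append((url, link))  # this page still links with the old tag
            queue.append(link.split("#")[0])
    return offenders

for page, link in crawl(START_URL, OLD_CAMPAIGN):
    print(f"{page} still links to {link}")
```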
I hope this helps! Let me know if you still have any questions, or if anything stumps you along the way.
Best,
Kristina
-
I know it's probably too obvious, but what about just creating a segment that filters out that campaign traffic?
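That's essentially all an exclusion segment does under the hood; here's a toy version (Python, purely illustrative, run over rows you might export from GA, with made-up numbers):

```python
# Toy illustration of what an exclusion segment does: report on
# sessions minus the unwanted campaign. Row data here is made up.
rows = [
    {"campaign": "nameofmycampaignimremoving", "sessions": 120},
    {"campaign": "(not set)", "sessions": 4300},
    {"campaign": "newsletter-june", "sessions": 310},
]

clean = [r for r in rows if r["campaign"] != "nameofmycampaignimremoving"]
print(sum(r["sessions"] for r in clean))  # session total with the ghost excluded
```

The caveat is that a segment only hides the traffic in reports; the underlying data is still collected, unlike a view filter, which changes what gets recorded going forward.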
-
Thanks. Bounce rate is in the low 90s (percent), so high but not 100%, and exit pages also differ.
Interesting note on the tracking code.
Since any visitor who originally clicked on one of the campaign links counts as a session against the old campaign when they revisit, I don't think it's as complicated as people visiting through bookmarks or browser history.
Is there really nothing I can do about these old campaigns coming back to haunt me?
-
Hi there.
A new view wouldn't help anyhow, because it's tied to the same tracking code. My guess is that either users are getting to those pages through bookmarks or browser history, or those links were indexed somehow and now you're getting hit by bots and crawlers.
Go to Campaigns and see what source/medium those sessions are coming from; also check how long those sessions are and the bounce rate. If it looks like it might be crawlers, look into ghost and referral spam filtering. Here's a link on how to implement it: https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
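The core of that approach is a "valid hostname" include filter; as a rough illustration (Python; the hostname pattern is a placeholder you'd replace with your own domains):

```python
# Rough illustration of the "valid hostname" filter idea from the
# linked Moz post: only keep hits whose hostname matches your real
# sites. The pattern below is a placeholder; list your own domains.
import re

VALID_HOSTNAME = re.compile(r"^(www\.)?(example\.com|crm\.example\.com)$")

hits = ["www.example.com", "crm.example.com", "free-seo-offers.xyz"]
for hostname in hits:
    keep = bool(VALID_HOSTNAME.match(hostname))
    print(hostname, "->", "keep" if keep else "ghost spam, filter out")
```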
Hope this helps