A campaign ghost keeps returning to my Google Analytics - Help!
-
A couple of campaign tracking links were created on my homepage (leading to internal pages). These were removed a few weeks ago (100% removed from the site).
I understand there is a six-month window, and that as long as a user returns (no matter from which source) they will be counted as a session against that campaign.
Since these campaign links were set up in error, I hoped creating a fresh new view within Google Analytics would stop them appearing.
However they are still showing as sessions even in the new view (created after removing the campaign links in question).
Is there any way to stop this happening?! I want to be able to report on sessions correctly.
Thanks,
Sam
-
Thanks, Kristina.
I set up the filter in the following way:
Filter type - Custom
Filter subtype - Search and Replace
Filter field - Campaign Name
Search string - 'nameofmycampaignimremoving'
Replace string - (left blank)
I figured that by entering the campaign name as the search string but replacing it with nothing, Google Analytics should now pick this traffic up as direct.
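For reference, a Search and Replace filter is essentially a regex substitution applied to the chosen field. A minimal sketch of that behaviour in Python (the campaign name is just the placeholder from this thread):

```python
import re

def search_and_replace_filter(campaign, search_pattern, replacement=""):
    """Mimic a GA Search and Replace filter on the Campaign Name field.
    An empty replacement blanks the campaign name, so the session no longer
    reports under that campaign."""
    return re.sub(search_pattern, replacement, campaign)

# Blanking the old campaign name:
print(search_and_replace_filter("nameofmycampaignimremoving",
                                "nameofmycampaignimremoving"))  # prints an empty line

# Other campaigns pass through untouched:
print(search_and_replace_filter("summer-sale",
                                "nameofmycampaignimremoving"))  # prints "summer-sale"
```

Note the search string is treated as a pattern, so a name containing regex metacharacters would need escaping.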
The campaign is no longer showing in my traffic, so I assume it's worked.
Is there a better way of doing this?
Sam
-
Good luck!
-
Very interesting, Kristina.
I think I've figured out what's going on.
Our product is a browser-based CRM hosted on a secure server. I believe someone previously visited our site and clicked on one of the old campaign links, and is now either returning directly to our /login page or clicking the ? buttons within the CRM, which lead to our website's support articles. Since the CRM is on a secure server, those visits count as direct traffic rather than referral.
So, as you mentioned, I will set up a filter to have these campaigns show up as direct.
Hallelujah!
Sam
-
Hi Sam,
First, to be clear, campaigns will be overwritten if visitors come from any other source, it's just direct traffic that the campaign parameter holds on to. Here's Google's direct quote from its article on campaigns and traffic sources:
Existing campaign or traffic source data for a particular user will be overwritten by new campaign or traffic source data, regardless of the configured timeout period.
If you look at the flowchart a bit below that quote, you'll see that Google starts by looking for new campaign data, then looks for traffic source data, then, if it doesn't find either of those, uses existing campaign data.
That means your theory could still be correct, but only if all of the visits you're still seeing come in as direct visits. You can check this theory using the % New Sessions column: if it's 0%, you're right, these are just returning visitors, and the best I can recommend is setting up filters to make them show up as "direct." If it's not, though (and I suspect it's not, because I doubt this would add up to a large enough number for you to be concerned and reach out for help), you've still got some of those campaign URLs floating around in public.
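The check above is simple arithmetic, since % New Sessions is just new users divided by sessions. A small sketch with made-up numbers:

```python
def pct_new_sessions(new_users, sessions):
    """GA's % New Sessions metric: new users divided by sessions, as a percentage.
    0% means every session in the row came from a returning visitor."""
    return 100.0 * new_users / sessions if sessions else 0.0

# If the old campaign shows 120 sessions and 0 new users, they are all
# returning visitors and the filter-to-direct approach is the right fix:
print(pct_new_sessions(0, 120))   # 0.0
print(pct_new_sessions(30, 120))  # 25.0 -> some links are still live somewhere
```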
Here's how I'd go looking for them:
- Use a third-party tool like Screaming Frog or DeepCrawl to triple-check that there are no internal links on your site with those old campaign parameters. CMSs can easily miss things like this, so an outside tool that just tries to find everything helps.
- Search for the original URLs + parameters in Google to see if any affiliates or coupon sites are using those links.
- Check your old emails - did you ever send out these URLs? It's possible that people are still accessing old emails.
- Was this a campaign that could have been shared in any other way? I know that my company often shares shortened URLs, which redirect to URLs with parameters appended. Have you shared any bit.ly or other aliased URLs that are appending those parameters you've tried to get rid of?
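The first step above can also be sketched with the standard library alone. This is a hypothetical checker (the URL structure and campaign name are placeholders, not anything confirmed from Sam's site) that flags any `<a>` tags still carrying the old `utm_campaign` value:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse, parse_qs

class CampaignLinkFinder(HTMLParser):
    """Collect hrefs whose query string still names the old campaign."""

    def __init__(self, campaign_name):
        super().__init__()
        self.campaign_name = campaign_name
        self.tagged_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        params = parse_qs(urlparse(href).query)
        if self.campaign_name in params.get("utm_campaign", []):
            self.tagged_links.append(href)

# Feed it each page's HTML (fetched however you like) and inspect the results:
finder = CampaignLinkFinder("nameofmycampaignimremoving")
finder.feed('<a href="/page?utm_campaign=nameofmycampaignimremoving">old</a>'
            '<a href="/clean-page">fine</a>')
print(finder.tagged_links)  # ['/page?utm_campaign=nameofmycampaignimremoving']
```

A crawler like Screaming Frog does the same thing at site scale, but this kind of spot check is handy for templates or email snippets you suspect.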
I hope this helps! Let me know if you still have any questions, or if anything stumps you along the way.
Best,
Kristina
-
I know it's too obvious, but what about just creating a segment that filters out that campaign traffic?
-
Thanks. The bounce rate is in the low 90s (percent), so high but not 100%, and the exit pages also differ.
Interesting note on the tracking code.
Since any visitor who originally clicked one of the campaign links counts as a session against the old campaign when they revisit, I don't think it's as complicated as people visiting through bookmarks or browser history.
Is there really nothing I can do about these old campaigns coming back to haunt me?!
-
Hi there.
A new view wouldn't help anyway, because it's tied to the same tracking code. My guess is that either users are getting to those pages through bookmarks or browser history, or those links were indexed somewhere and now you're getting hit by bots and crawlers.
Go to the Campaigns report and see what source/medium those sessions are coming from; also check session duration and bounce rate. If it looks like it might be crawlers, look into ghost and referral spam filtering. Here is a link on how to implement it: https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
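The core of the approach in that article is an include filter that keeps only hits whose hostname matches domains you actually control, since ghost spam never sets your real hostname. A sketch of the matching logic (the hostnames below are placeholders, not Sam's actual domains):

```python
import re

# Include-filter pattern: only hostnames you control. Replace these example
# domains with your own before building the real GA filter.
VALID_HOSTNAMES = re.compile(r"^(www\.)?(example\.com|support\.example\.com)$")

def is_legitimate_hit(hostname):
    """Ghost spam hits carry fake or empty hostnames, so a valid-hostname
    include filter drops them without touching real traffic."""
    return bool(VALID_HOSTNAMES.match(hostname))

print(is_legitimate_hit("www.example.com"))  # True
print(is_legitimate_hit("spam-domain.xyz"))  # False
```

In GA itself this becomes a Custom > Include filter on the Hostname field using that same regex; test it against your Hostname report first so you don't exclude real traffic.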
Hope this helps