A campaign ghost keeps returning to my Google Analytics - Help!
-
A couple of campaign tracking links were created on my homepage (leading to internal pages); these were removed a few weeks ago (100% removed from the site).
I understand there is a 6-month window, and as long as a user returns (no matter from which source) they will be counted as a session against that campaign.
Since these campaign links were set up in error, I hoped creating a fresh view within Google Analytics would stop them appearing.
However, they are still showing as sessions even in the new view (created after removing the campaign links in question).
Is there any way to stop this from happening? I want to be able to report on sessions correctly.
Thanks,
Sam
-
Thanks, Kristina,
I set up the filter in the following way:
Filter type - Custom > Search and Replace
Filter field - Campaign Name
Search string - 'nameofmycampaignimremoving'
Replace string - (left blank)
I figured that by matching the campaign name but not replacing it with anything, Google Analytics should now pick this traffic up as direct.
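For what it's worth, the effect of a filter like this can be sketched in a few lines. This is purely illustrative - the session dictionary and the `(direct)` / `(none)` labels are stand-ins for how GA reports the traffic, not GA's internals:

```python
def apply_search_and_replace_filter(session, search, replace=""):
    """Simulate a GA Search and Replace filter on the Campaign Name field:
    every occurrence of `search` in the campaign name becomes `replace`."""
    session = dict(session)  # don't mutate the caller's record
    session["campaign"] = session.get("campaign", "").replace(search, replace)
    # With the campaign name emptied and no other traffic source info,
    # the session would be reported as direct / (none).
    if not session["campaign"] and not session.get("source"):
        session["source"] = "(direct)"
        session["medium"] = "(none)"
    return session

filtered = apply_search_and_replace_filter(
    {"campaign": "nameofmycampaignimremoving"}, "nameofmycampaignimremoving")
print(filtered["source"])  # (direct)
```

Note that a view filter like this only affects data collected from the moment it's applied; it won't rewrite historical sessions.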
The campaign is no longer showing in my traffic, so I assume it's worked.
Is there a better way of doing this?
Sam
-
Good luck!
-
Very interesting, Kristina.
I think I've figured out what's going on.
Our product is a browser-based CRM hosted on a secure server. I believe someone previously visited our site and clicked one of the old campaign links, and is now either returning directly to our /login page or clicking the ? buttons within the CRM, which lead to our website's support articles. Since the CRM is on a secure (HTTPS) server, no referrer is passed along, so these visits count as direct traffic rather than referral.
So, as you mentioned, I will set up a filter to have these campaigns show up as direct.
Hallelujah!
Sam
-
Hi Sam,
First, to be clear, campaigns will be overwritten if visitors come from any other source; it's only direct traffic that the campaign parameter holds on to. Here's Google's direct quote from its article on campaigns and traffic sources:
Existing campaign or traffic source data for a particular user will be overwritten by new campaign or traffic source data, regardless of the configured timeout period.
If you look at the flowchart a bit below that quote, you'll see that Google starts by looking for new campaign data, then looks for traffic source data, then, if it doesn't find either of those, uses existing campaign data.
That means your theory could still be correct, but only if all of the visits you're still seeing are direct visits. You can check this theory using the % New Sessions column: if it's 0%, you're right - these are just returning visitors, and the best I can recommend is that you set up filters to make them show up as "direct." If it's not 0% (and I suspect it's not, because I doubt returning visitors alone would account for a large enough number for you to be concerned and reach out for help), you've still got some of those campaign URLs floating around for the public.
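The flowchart logic described above can be sketched roughly like this. It's a simplification - the function arguments and labels are made up for illustration, not GA field names:

```python
def attribute_session(new_campaign, new_source, existing_campaign):
    """Rough sketch of GA's attribution precedence for a returning visitor:
    1. new campaign data wins;
    2. otherwise a new (non-direct) traffic source wins;
    3. otherwise fall back to the stored campaign - which is why direct
       return visits keep showing up under the old campaign."""
    if new_campaign:
        return new_campaign
    if new_source:  # e.g. a referral or an organic search
        return new_source
    return existing_campaign or "(direct)"

# A direct return visit from someone who once clicked the old campaign link:
print(attribute_session(None, None, "old-homepage-campaign"))  # old-homepage-campaign
# The same visitor arriving via organic search instead:
print(attribute_session(None, "google / organic", "old-homepage-campaign"))  # google / organic
```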
Here's how I'd go looking for them:
- Use a third-party tool like Screaming Frog or DeepCrawl to triple-check that there are no internal links on your site with those old campaign parameters. A CMS search can easily miss things like this, so using an outside tool that simply tries to find everything helps.
- Search for the original URLs + parameters in Google to see if any affiliates or coupon sites are using those links.
- Check your old emails - did you ever send out these URLs? It's possible that people are still accessing old emails.
- Was this a campaign that could have been shared in any other way? I know that my company often shares shortened URLs, which redirect to URLs with parameters appended. Have you shared any bit.ly or other aliased URLs that are appending those parameters you've tried to get rid of?
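The first check in the list above can also be approximated with a small script that scans a page's HTML for links still carrying the old campaign parameter. A minimal sketch - the campaign name and HTML snippet are placeholders:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse, parse_qs

class CampaignLinkFinder(HTMLParser):
    """Collect hrefs whose query string contains utm_campaign=<old name>."""
    def __init__(self, campaign):
        super().__init__()
        self.campaign = campaign
        self.matches = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        params = parse_qs(urlparse(href).query)
        if self.campaign in params.get("utm_campaign", []):
            self.matches.append(href)

html = '<a href="/pricing?utm_campaign=oldcampaign&utm_medium=banner">Pricing</a>'
finder = CampaignLinkFinder("oldcampaign")
finder.feed(html)
print(finder.matches)  # ['/pricing?utm_campaign=oldcampaign&utm_medium=banner']
```

You'd feed it the HTML of each page a crawler fetches; a dedicated crawler will still catch more (redirects, JavaScript-inserted links), so treat this as a quick spot check.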
I hope this helps! Let me know if you still have any questions, or if anything stumps you along the way.
Best,
Kristina
-
I know it may be too obvious, but what about just creating a segment that filters out that campaign traffic?
-
Thanks - the bounce rate is in the low 90s (%), so high but not 100%, and the exit pages also differ.
Interesting note on the tracking code.
Since any visitor who originally clicked one of the campaign links counts as a session against the old campaign when they revisit, I don't think it's as complicated as people visiting through bookmarks or browser history.
Is there really nothing I can do about these old campaigns coming back to haunt me?
-
Hi there.
A new view wouldn't help anyway, because it's tied to the same tracking code. My guess is that either users are getting to those pages through bookmarks or browser history, or those links were indexed somehow and now you're getting hit by bots and crawlers.
Go to your campaign reports and see what source/medium those sessions are coming from; also check the session durations and the bounce rate. If it looks like it might be crawlers, look into ghost and referral spam filtering. Here's a link on how to implement it - https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
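The core idea behind the ghost-spam filter in that article is a valid-hostname check: ghost hits never actually load your site, so they carry a fake or empty hostname. A minimal sketch of the logic, with placeholder hostnames standing in for your own properties:

```python
VALID_HOSTNAMES = {"www.example.com", "example.com"}  # your real domains

def is_ghost_hit(hit):
    """Flag hits whose hostname doesn't match any property you actually own.
    Ghost spam is injected straight into GA (e.g. via the Measurement
    Protocol) without visiting the site, so the hostname gives it away."""
    return hit.get("hostname", "").lower() not in VALID_HOSTNAMES

hits = [
    {"hostname": "www.example.com", "source": "google"},
    {"hostname": "free-traffic.xyz", "source": "spam-referrer"},
]
print([h for h in hits if not is_ghost_hit(h)])
```

In GA itself this is set up as an include filter on the Hostname field, matching only the domains you own; the snippet just shows why that works.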
Hope this helps