Unsolved Capturing Source Dynamically for UTM Parameters
-
Does anyone have a tutorial on how to dynamically capture the referring source to be populated in UTM parameters for Google Analytics?
We want to syndicate content and be able to see all of the websites that provided referral traffic for this specific objective. We want to set a specific utm_medium and utm_campaign but have the utm_source be dynamic and capture the referring website.
If we set a fixed utm_source, every visit would be reported under the same source regardless of which site actually referred it.
Thanks in advance!
-
@peteboyd said in Capturing Source Dynamically for UTM Parameters:
Thanks in advance!
UTM (Urchin Tracking Module) parameters are tags that you can add to the end of a URL in order to track the effectiveness of your marketing campaigns. These parameters are used by Google Analytics to help you understand how users are interacting with your website and where they are coming from.
There are five different UTM parameters that you can use:
utm_source: This parameter specifies the source of the traffic, such as "google" or "facebook".
utm_medium: This parameter specifies the medium of the traffic, such as "cpc" (cost-per-click) or "social".
utm_campaign: This parameter specifies the name of the campaign, such as "spring_sale" or "promotion".
utm_term: This parameter specifies the term or keywords used in the campaign, such as "shoes" or "dress".
utm_content: This parameter specifies the content of the ad, such as the headline or the call-to-action.
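Put together, a fully tagged URL might look like this (the domain and values here are purely illustrative, not taken from the original post): https://www.example.com/?utm_source=facebook&utm_medium=social&utm_campaign=spring_sale&utm_term=shoes&utm_content=headline_a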
To capture the source dynamically for UTM parameters, you can use JavaScript to read the document.referrer property. This property returns the URL of the page that linked to the current page. You can then use this value to set the utm_source parameter dynamically.
For example, you might use the following code to set the utm_source parameter based on the referring URL:
var utmSource = '';

if (document.referrer.indexOf('google') !== -1) {
  utmSource = 'google';
} else if (document.referrer.indexOf('facebook') !== -1) {
  utmSource = 'facebook';
}

// Add the utm_source parameter to the URL
var url = 'http://www.example.com?utm_source=' + utmSource;
This code sets the utm_source parameter to "google" if the visitor came to the page from a Google property, or to "facebook" if they came from Facebook. If the visitor came from any other source, the utm_source parameter is left empty. You can then use the modified URL in your marketing campaigns to track their effectiveness and understand where your traffic is coming from.
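Since the goal in this thread is to capture whichever site referred the visitor, rather than a fixed list of sources, you can instead take the hostname out of document.referrer and use it directly as the utm_source value. Below is a minimal sketch of that idea; the medium and campaign values and the example.com domain are placeholders of my own, not anything from the original post, and keep in mind that document.referrer can be empty on direct visits or trimmed by referrer policies.

// Minimal sketch: tag a destination URL with a utm_source taken from the referrer.
// 'referral', 'content_syndication', and www.example.com are placeholder values.
var utmSource = 'direct'; // fallback when there is no referrer

if (document.referrer) {
  try {
    // Keep only the hostname of the referring page, e.g. "news.example.org"
    utmSource = new URL(document.referrer).hostname;
  } catch (e) {
    // Keep the fallback if the referrer cannot be parsed as a URL
  }
}

// URLSearchParams encodes the values safely for use in a query string
var params = new URLSearchParams({
  utm_source: utmSource,
  utm_medium: 'referral',
  utm_campaign: 'content_syndication'
});
var taggedUrl = 'https://www.example.com/?' + params.toString();

Depending on how you prefer to hand the value off to Google Analytics, the same hostname could instead be written into a cookie or pushed into the dataLayer.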
-
@peteboyd you can refer to this tutorial: https://www.growwithom.com/2020/06/16/track-dynamic-traffic-google-tag-manager/
Should meet your requirements perfectly - using GTM to replace a static value with the url in your UTM Source.
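A GTM setup like the one that tutorial describes typically comes down to a Custom JavaScript variable that returns the referring hostname, which you then plug into your utm_source field. A rough sketch of such a variable (my own illustration, not code from the linked tutorial):

function() {
  // GTM Custom JavaScript variables must be an anonymous function that returns a value.
  try {
    // Return the referring hostname, or 'direct' when there is no referrer.
    return document.referrer ? new URL(document.referrer).hostname : 'direct';
  } catch (e) {
    return 'direct';
  }
}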
Related Questions
-
Best redirect destination for 18k highly-linked pages
Technical SEO question regarding redirects; I'd appreciate any insights on the best way to handle it. Situation: We're decommissioning several major content sections on a website, comprising ~18k webpages. This is a well-established site (10+ years) and many of the pages within these sections have high-quality inbound links from .orgs and .edus. Challenge: We're trying to determine the best place to redirect these 18k pages. For user experience, we believe the best option is the homepage, which has a statement about the changes to the site and links to the most important remaining sections. It's also the most important page on the site, so the boost from 301-redirected links doesn't seem like a bad thing. However, someone on our team is concerned that that many newly redirected pages and links pointing at our homepage will trigger a negative SEO flag for it, and recommends instead that they all go to our custom 404 page (which also includes links to important remaining sections). What's the right approach here to preserve the remaining SEO value of these soon-to-be-redirected pages without triggering Google penalties?
Technical SEO | Nov 18, 2024, 10:15 AM | davidvogel0 -
Unsolved Question about a Screaming Frog crawling issue
Hello, I have a very peculiar question about an issue I'm having when working on a website. It's a WordPress site and I'm using a generic plugin for title and meta updates. When I crawl the site through Screaming Frog, however, there seems to be a hard-coded title tag that I can't find anywhere, and the plugin updates don't get crawled. If anyone has any suggestions, that'd be great. Thanks!
Technical SEO | Jan 4, 2024, 10:39 AM | KyleSennikoff0 -
Solved How to solve orphan pages on a job board
Working on a website that has a job board with over 4,000 active job ads. All of these ads are listed on a single "job board" page, and obviously they don't all load at the same time. They are not linked to from anywhere else, so all tools are listing these job ad pages as orphans. How much of a red flag are these orphan pages? Do sites like Indeed have this same issue? Their job ads are completely dynamic; how are those pages then indexed? We use Google's Search API to handle any expired jobs, so they are not the issue. It's the active but orphaned pages we are looking to solve. The site is hosted on WordPress. What is the best way to solve this issue? Just create a job category page and link to each individual job ad from there? Any simpler and perhaps more obvious solutions? What does the website structure need to be like for the problem to be solved? Would appreciate any advice you can share!
Reporting & Analytics | Nov 18, 2021, 2:49 PM | Michael_M2 -
301 redirect from dynamic url to static page
Hi, I want to redirect from this old link http://www.g-store.gr/product_info.php?products_id=1735/ to this one: https://www.g-store.gr/golf-toualetas.html I have made several attempts but with no result. If anyone can help, I will appreciate it. My website runs on an Apache server with cPanel. Thank you
Technical SEO | Dec 17, 2018, 4:22 AM | alstam0 -
Does Google read dynamic canonical tags?
Does Google recognize rel=canonical tag if loaded dynamically via javascript? Here's what we're using to load:
<script>
// Inject canonical link into page head
if (window.location.href.indexOf("/subdirname1") != -1) {
  canonicalLink = window.location.href.replace("/kapiolani", "");
}
if (window.location.href.indexOf("/subdirname2") != -1) {
  canonicalLink = window.location.href.replace("/straub", "");
}
if (window.location.href.indexOf("/subdirname3") != -1) {
  canonicalLink = window.location.href.replace("/pali-momi", "");
}
if (window.location.href.indexOf("/subdirname4") != -1) {
  canonicalLink = window.location.href.replace("/wilcox", "");
}
if (canonicalLink != window.location.href) {
  var link = document.createElement('link');
  link.rel = 'canonical';
  link.href = canonicalLink;
  document.head.appendChild(link);
}
</script>
Technical SEO | Aug 15, 2017, 3:53 PM | SoulSurfer80 -
Robots.txt Syntax for Dynamic URLs
I want to Disallow certain dynamic pages in robots.txt and am unsure of the proper syntax. The pages I want to disallow all include the string ?Page= Which is the proper syntax?
Disallow: ?Page=
Disallow: ?Page=*
Disallow: ?Page=
Or something else?
Technical SEO | Mar 29, 2017, 11:30 AM | btreloar0 -
How do I deindex url parameters
Google indexed a bunch of our URL parameters. I'm worried about duplicate content. I used the URL parameter tool in Webmaster Tools to set it so future parameters don't get indexed. What can I do to remove the ones that have already been indexed? For example, site.com/products and site.com/products?campaign=email have both been indexed as separate pages even though they are the same page. If I use a noindex I'm worried about de-indexing the product page. What can I do to just de-index the URL parameter version? Thank you!
Technical SEO | May 8, 2015, 11:05 AM | BT20090 -
Does Title Tag location in a page's source code matter?
Currently our meta description is on line 8 for our page - http://www.paintball-online.com/Paintball-Guns-And-Markers-0Y.aspx
The title tag, however, sits below a bunch of code on line 237.
Does the location of the title tag, meta tags, and any structured data have any influence with respect to SEO and search engines? Put another way, could we benefit from moving the title tag up to the top? I "surfed 'n surfed" and could not find any articles about this. I would really appreciate any help on this as our site got decimated organically last May and we are looking for any help with SEO. Nick
Technical SEO | Sep 25, 2014, 2:37 AM | Istoresinc0