Unsolved Capturing Source Dynamically for UTM Parameters
-
Does anyone have a tutorial on how to dynamically capture the referring source to be populated in UTM parameters for Google Analytics?
We want to syndicate content and be able to see all of the websites that provided referral traffic for this specific objective. We want to set a specific utm_medium and utm_campaign but have the utm_source be dynamic and capture the referring website.
If we set a permanent utm_source, it would appear the same for all incoming traffic.
Thanks in advance!
-
@peteboyd said in Capturing Source Dynamically for UTM Parameters:
Thanks in advance!
UTM (Urchin Tracking Module) parameters are tags that you can add to the end of a URL in order to track the effectiveness of your marketing campaigns. These parameters are used by Google Analytics to help you understand how users are interacting with your website and where they are coming from.
There are five different UTM parameters that you can use:
utm_source: This parameter specifies the source of the traffic, such as "google" or "facebook".
utm_medium: This parameter specifies the medium of the traffic, such as "cpc" (cost-per-click) or "social".
utm_campaign: This parameter specifies the name of the campaign, such as "spring_sale" or "promotion".
utm_term: This parameter specifies the term or keywords used in the campaign, such as "shoes" or "dress".
utm_content: This parameter specifies the content of the ad, such as the headline or the call-to-action.
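To make that concrete, here is a minimal sketch of how the parameters end up on a URL; the landing page, campaign name, and other values are hypothetical placeholders, not anything specific to your site:

// Hypothetical landing page and campaign values, for illustration only
var baseUrl = 'https://www.example.com/landing-page';
var params = new URLSearchParams({
  utm_source: 'facebook',
  utm_medium: 'social',
  utm_campaign: 'spring_sale',
  utm_term: 'shoes',
  utm_content: 'headline_a'
});
var taggedUrl = baseUrl + '?' + params.toString();
// taggedUrl: https://www.example.com/landing-page?utm_source=facebook&utm_medium=social&utm_campaign=spring_sale&utm_term=shoes&utm_content=headline_a

Google Analytics reads those parameters from the landing-page URL, so whatever you append there is what shows up in your acquisition reports.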
To capture the source dynamically for UTM parameters, you can use JavaScript to get the value of the document.referrer property. This property returns the URL of the page that linked to the current page. You can then use this value to set the utm_source parameter dynamically.
For example, you might use the following code to set the utm_source parameter based on the referring URL:
var utmSource = '';

// Check the referring URL for a known source
if (document.referrer.indexOf('google') !== -1) {
  utmSource = 'google';
} else if (document.referrer.indexOf('facebook') !== -1) {
  utmSource = 'facebook';
}

// Add the utm_source parameter to the URL
var url = 'http://www.example.com?utm_source=' + utmSource;
This code will set the utm_source parameter to "google" if the user came to the page from Google, or to "facebook" if the user came from Facebook. If the user came from any other source, the utm_source parameter will be left empty. You can then use this modified URL in your marketing campaigns to track their effectiveness and see where your traffic is coming from.
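If you would rather capture any referring site instead of only the ones you hard-code, a more general sketch could pull the hostname out of document.referrer and fall back to "direct" when there is no referrer. This assumes browsers that support the URL API, and the utm_medium and utm_campaign values below are placeholders for whatever you decide on:

var utmSource = 'direct';
if (document.referrer) {
  try {
    // e.g. "https://news.example.org/some-article" -> "news.example.org"
    utmSource = new URL(document.referrer).hostname;
  } catch (e) {
    // Keep the "direct" fallback if the referrer isn't a parseable URL
  }
}
// Keep utm_medium and utm_campaign fixed; only utm_source is dynamic
var url = 'https://www.example.com/?utm_source=' + encodeURIComponent(utmSource) +
          '&utm_medium=syndication&utm_campaign=content_syndication';

That keeps utm_medium and utm_campaign constant across the campaign while utm_source reflects whichever site actually sent the visitor.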
-
@peteboyd you can refer to this tutorial: https://www.growwithom.com/2020/06/16/track-dynamic-traffic-google-tag-manager/
It should meet your requirements perfectly: using GTM to replace a static value with the referring URL in your utm_source.
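If you go the GTM route, the core piece is a Custom JavaScript Variable that returns the referring hostname. A minimal sketch follows; the variable name is a placeholder and this is not the tutorial's exact code:

// GTM Custom JavaScript Variable, e.g. named "Referrer Hostname"
function() {
  // document.referrer looks like "https://host/path"; index 2 of the
  // split is the hostname. Fall back to "direct" when there is no referrer.
  if (document.referrer) {
    return document.referrer.split('/')[2] || 'direct';
  }
  return 'direct';
}

You would then reference {{Referrer Hostname}} wherever your tag setup expects the utm_source value, leaving utm_medium and utm_campaign static.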
Related Questions
-
How to index e-commerce marketplace product pages
Hello! We are an online marketplace that submitted our sitemap through Google Search Console 2 weeks ago. Although the sitemap was submitted successfully, out of ~10,000 links (we have ~10,000 product pages), only 25 have been indexed. I've attached images (GSC dashboard screenshots) of the reasons given for not indexing the pages. How would we go about fixing this?
Technical SEO | | fbcosta0 -
Unsolved MOZ API to Google Data Studio
Does MOZ have anything like this? https://www.semrush.com/features/google-data-studio-connector/
Product Support | | WebMarkets0 -
Sudden Drop in Mobile Core Web Vitals
(Web Vitals screengrab attached) For some reason, after all URLs had previously been classified as Good, our Mobile Web Vitals report suddenly shifted to the state shown in the screengrab, and it doesn't correspond with any site changes on our end. Has anyone else experienced something similar, or have any idea what might have caused such a shift? Curiously, I'm not seeing a drop in session duration, conversion rate, etc. for mobile traffic despite the seemingly sudden change.
Technical SEO | | rwat0 -
Can't find source of redirect
Hey guys, I have a bizarre situation on my hands. I have a URL that is being wonky. The URL is redirecting to another URL, and the 301 redirect is not in my .htaccess. There is a 301 redirect in my .htaccess, but it is being overridden by something else, i.e. whatever is causing the redirect above. So basically URL A should redirect to URL B, but instead it goes to URL C. I know we were not hacked; it's not redirecting to a strange domain. I have also disabled all of our plugins that redirect (to my knowledge). Any thoughts would be great!
Technical SEO | | HashtagHustler0 -
Subdomain as News Section instead of Source in Google News?
Hi, trying to dig into Google News for a large site, mostly containing news. The structure of the site network is subdomain.domain.se, and each subdomain has its own brand with its own news: x.domain.se, y.domain.se, z.domain.se, etc. Each brand/subdomain more or less equates to its own subject field/section. In Google News, every subdomain is configured with its own site source URL, but each is also set up as a section with the same URL. It seems like they're getting conflicts in Google News; Google can't always figure out which news article belongs to which brand. Example: an article owned by brand A sometimes gets labeled as brand B in the news SERP, though the link correctly takes you to brand A. I am thinking that this configuration in News Publisher Center may be the problem? Would it be better if we deleted all source URLs except for the domain.se brand and then added all the other subdomains as sections (www.domain.se, x.domain.se, y.domain.se, z.domain.se)? Any smart thoughts on this one? Or anything else that could cause this mislabeling (all content, including images, is hosted on the same domain, for example)? Regards, Magnus
Technical SEO | | m.m0 -
Can dynamically translated pages hurt a site?
Hi all, looking for some insight please. I have a site we have worked very hard on to get ranked well, and it is doing well in search. The site has about 1,000 pages and climbing, and about 50 of those are translated pages: static pages with unique URLs. I have had no problems with duplicate content and that sort of thing, and all pages were manually translated, so no translation issues. We have been looking at software that can dynamically translate the complete site into a handful of languages, let's say about 5. My problem is that these pages get produced dynamically, and I have concerns that Google will take issue with this, as well as with the huge sudden influx of new URLs; we could be looking at an increase of 5,000 new URLs (which usually triggers an alarm). My feeling is that it could risk the stability of the site we have worked so hard for, and maybe we should just stick with the already translated static pages. I am sure the process could be fine, but I fear a manual inspection and a slap on the wrist for having dynamically created content, and also the risk of a review period being triggered. These days it is hard to know what could get you in "trouble", and my gut says keep it simple, leave it as is, and don't shake it up. Am I being overly concerned? Would love to hear from others who have tried similar changes, and also those who have not due to similar "fear". Thanks
Technical SEO | | nomad-2023230 -
Does Google pass the link juice a page receives if the URL parameter specifies content and has the Crawl setting in Webmaster Tools set to NO?
The page in question receives a lot of quality traffic but is only relevant to a small percent of my users. I want to keep the link juice received from this page but I do not want it to appear in the SERPs.
Technical SEO | | surveygizmo0 -
Blocking URLs with specific parameters from Googlebot
Hi, I've discovered that Googlebot crawlers are voting on products listed on our website and, as a result, are creating negative ratings by placing votes from 1 to 5 for every product. The voting function is handled using JavaScript, as shown below, and the script prevents multiple votes, so most products end up with a vote of 1, which translates to "poor". How do I go about using robots.txt to block a URL with specific parameters only? I'm worried that I might end up blocking the whole product listing, which would result in de-listing from Google and the loss of many highly ranked pages. DON'T want to block: http://www.mysite.com/product.php?productid=1234 WANT to block: http://www.mysite.com/product.php?mode=vote&productid=1234&vote=2 JavaScript button code: onclick="javascript: document.voteform.submit();" Thanks in advance for any advice given. Regards, Asim
Technical SEO | | aethereal0
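For what it's worth on that last robots.txt question, here is a hedged sketch, assuming the vote URLs always carry mode=vote in the query string (Googlebot honors the * wildcard in Disallow patterns, but test the pattern before relying on it):

User-agent: Googlebot
# Block only the vote-action URLs, e.g.
#   /product.php?mode=vote&productid=1234&vote=2
Disallow: /product.php?*mode=vote
# Plain product URLs such as /product.php?productid=1234 don't match
# the pattern, so they remain crawlable and indexable.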