Google Tag Manager
-
I recently discovered Google Tag Manager and I am in the process of updating many of my websites with this feature. I am using Tag Manager to manage Google Analytics, Google Remarketing, Alive Chat, Woopra, etc. I have one question about how Tag Manager actually works.
As best I can tell, the Tag Manager code snippet that I insert into my web pages is the same for all my websites and does not include a unique ID. If that is the case, then Tag Manager must search all the URLs in the TM database to find a match. What is to stop someone else from adding some rules for my URLs to their containers? I expect Google has a method to ensure proper matching, but I'm not clear on how that is enforced.
Best,
Christopher -
You're right, the codes are unique. They are so small (4 characters) that I did not recognize them as unique IDs.
Thanks for the info on the hostname filter as well.
Best,
Christopher -
There is a unique ID embedded in your Tag Manager code, Christopher. It's specific and different for each container you create for each site (i.e. each container has a different ID). So the Tag Manager code on your pages is calling rules from your specific container. Nobody else's rules could affect your account unless you let them add their snippet to your site.
Here's an example of the Tag Manager snippet showing the unique ID, in this case anonymized to GTM-XXXX:
<noscript><iframe src="//www.googletagmanager.com/ns.html?id=GTM-XXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
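If you ever want to confirm which container a given page is actually loading, you can look for that ID in the page source. Here's a minimal JavaScript sketch; the regex assumes container IDs are "GTM-" followed by uppercase letters and digits, which matches the IDs I've seen but isn't a format Google formally documents:

```javascript
// Sketch: pull the container ID out of a pasted GTM snippet.
// Assumption: IDs look like "GTM-" plus uppercase letters/digits.
function extractContainerId(snippetHtml) {
  var match = snippetHtml.match(/GTM-[A-Z0-9]+/);
  return match ? match[0] : null;
}

// Running it against the noscript fallback shown above:
var sample = '<noscript><iframe src="//www.googletagmanager.com/ns.html?id=GTM-XXXX" ' +
  'height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>';
console.log(extractContainerId(sample)); // "GTM-XXXX"
```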
This is assuming, of course, that you've followed the best practice of creating a new container for each website. If you're using cross-domain tracking or rollup reporting, I suppose it might be possible to implement everything within one container, even for multiple sites.
All that said - there's nothing to stop another website from maliciously adding your Analytics code to their own website (whether through Tag Manager or regular manual methods) to mess up your stats. The way to avoid this (and best practice for any site) is to ensure you have a hostname filter to remove all GA calls except those from your own approved hostnames. This also helps filter out "accidental" cases like where somebody has scraped your page contents including your GA code. You'd be surprised how often this happens.
Paul
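To make the hostname check above concrete: the real fix is a filter configured in the Google Analytics admin UI, not code, but the same idea can be sketched as a client-side guard. Everything below (the hostname list, the event name) is a hypothetical illustration, not part of the GA or GTM API:

```javascript
// Illustrative only: the proper fix is a hostname filter in the GA admin UI.
// This sketch expresses the same idea client-side; the hostnames and event
// name here are hypothetical.
var approvedHostnames = ['www.example.com', 'example.com'];

function isApprovedHostname(hostname) {
  return approvedHostnames.indexOf(hostname) !== -1;
}

// Only fire tracking when the page is served from an approved host, so
// scraped copies of the page can't pollute your stats.
if (typeof window !== 'undefined' && isApprovedHostname(window.location.hostname)) {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({ event: 'pageview' });
}
```

Keep in mind a client-side guard can simply be deleted by whoever scraped your page, which is why the filter on the Analytics side is the one to actually rely on.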