Page Tracking using Custom URLs - is this viable?
-
Hi Moz community!
I’ll try to make this question as easy to understand as possible, but please excuse me if it isn’t clear.
I just joined a new team a few months ago and found out that on some of our most popular pages we use "custom URLs" to track page metrics within Google Analytics. NOTE: I say "custom URLs" because that is the best way I can describe them.
As an example:
-
This page exists to our users: http://usnews.rankingsandreviews.com/cars-trucks/Ram_HD/2012/photos-interior/
-
But this is the URL we have coded on the page: cars-trucks/used-cars/reviews/2012-Ram-HD/photos-interior/ (within the custom variable script labeled "var l_tracker=")
-
It is this custom URL that we use within GA to look up metrics about this page.
-
This is just one of many examples across our site set up to do the same thing.
-
Here is a second example:
- Available page to user: http://usnews.rankingsandreviews.com/cars-trucks/Cadillac_ATS/2015/
- Custom “var l_tracker=” /cars-trucks/2015-Cadillac-ATS/overview/
- NOTE: There is some fear that the above method was implemented years ago as a work-around for a poorly structured URL architecture. Not validated, but that question has come up.
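For anyone unfamiliar with the pattern being described: this looks like a "virtual pageview" setup, where the pageview hit is sent to GA with a custom page path instead of the browser's real URL. I can't see the site's actual "var l_tracker" script, so the sketch below is purely illustrative — it simulates GA's command queue with a stand-in `ga()` function just to show how the reported path ends up differing from the real one:

```javascript
// Minimal simulation of a "virtual pageview": the hit is sent with a
// custom page path, so GA reports that path instead of the real URL.
// The ga() stand-in below just collects hits; the real analytics.js
// queue would send them to Google Analytics.
const hits = [];
function ga(command, hitType, fields) {
  if (command === 'send' && hitType === 'pageview') {
    hits.push(fields.page);
  }
}

// The URL the user actually visits:
const realPath = '/cars-trucks/Ram_HD/2012/photos-interior/';

// The custom tracking path from the question (variable name as described):
var l_tracker = '/cars-trucks/used-cars/reviews/2012-Ram-HD/photos-interior/';

// The browser is on realPath, but the pageview hit reports l_tracker,
// which is why GA shows the custom URL rather than the live one:
ga('send', 'pageview', { page: l_tracker });
```

If that is what the script does, the custom paths exist only inside GA's reports — they are not real pages on the server, which is relevant to the crawl-error question below.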
Main Questions:
- Is the above implementation a normal, commonly used method to track pages in GA? (Coming from an Omniture shop previously, this is not how we handled page-level tracking.)
- Team members at my current company are divided on this method. Some believe it is not a proper implementation and are concerned that trying to hide these URLs from Google will raise red flags (i.e., fake URLs in general = bad).
- I cannot find any reference to this method anywhere on the web.
- If this method is not normal: any recommendations on a solution to address it?
Potential Problems?
- These tracking URLs are currently being catalogued in the Crawl Errors report (Google Webmaster Tools). Any concerns about this?
- The team wants to block the URLs in the robots.txt file, but some team members are concerned this may raise a red flag with Google and hurt us more than help us.
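For context on what that block would look like: a robots.txt disallow only tells crawlers not to fetch the paths; it does not remove them from crawl-error reports retroactively. The paths below are taken from the examples above and are illustrative only — the real rules would need to match however the tracking URLs actually surface:

```text
# Illustrative robots.txt fragment blocking the virtual tracking paths
User-agent: *
Disallow: /cars-trucks/used-cars/reviews/
Disallow: /cars-trucks/2015-Cadillac-ATS/
```

Note that if these URLs only ever exist inside the GA script and are never output as links, crawlers should not be discovering them in the first place, which may be worth investigating before touching robots.txt.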
Thank you in advance for any insight and/or advice.
Chris
-
-
Any help?