Tracking links and duplicate content
-
Hi all,
I have a bit of a conundrum for you all pertaining to a tracking link issue I have run into on a client's site, which has a significant duplicate content problem. Screaming Frog currently shows over 15,000 pages being crawled, but only around 7,000 of those are legitimate pages, in the sense that they are not duplicates of other pages.
The client is using Omniture instead of Google Analytics, along with an advanced tracking system that appends internal and external link identifiers (ictids and ectids) to URLs as parameters. This is creating thousands of duplicate pages being crawled by Google (as seen in both Search Console and Screaming Frog).
They are also in the middle of moving from HTTP to HTTPS and currently have thousands of pages served on both protocols, which again creates a duplicate content issue.
For the tracking links, I have suggested setting up URL parameter handling in Search Console. I've also suggested they add canonical tags to all tracking-link URLs pointing to the clean page, so that the pages that have already been indexed point to the correct clean URL. Does this seem like the appropriate strategy?
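For anyone following along, the canonical-tag approach described above would look something like this on a tracked URL (the domain, path, and parameter value here are placeholders, not taken from the client's site):

```html
<!-- Served on https://www.example.com/products/widget?ictid=homepage-banner -->
<!-- The canonical points to the clean, parameter-free version of the page -->
<link rel="canonical" href="https://www.example.com/products/widget" />
```

The same tag should appear, unchanged, on every parameter variation of the page, so all of them consolidate to the one clean URL.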
Additionally, I've told them that before they submit a new sitemap to Google, they should complete the switch to HTTPS to avoid worsening their duplicate content issue. They have not submitted a sitemap to Google Search Console since March 2015.
Thank you for any help you can offer!
-
Personally, I would submit a clean sitemap ASAP. It's helpful whenever you upload it, and SEO fixes are best made as soon as possible, otherwise you're just leaving traffic on the table.
Plus, I'm skeptical that a move to https will be fast.
That said, there's no reason why you can't move your site to https without already having a clean XML sitemap for the http version. So it's really up to you.
Sorry, that's a little ambiguous! Such is SEO.
Good luck!
Kristina
-
Hey Kristina,
Thanks for the reply! Yes, I have already gone ahead and changed their URL parameters accordingly. I've asked their developer to go through and add canonical tags pointing all tracking-link URLs to the clean URL.
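As a quick sanity check for the developer, the "clean URL" logic can be sketched in a few lines of Python. This is just an illustration, assuming the tracking parameters are named `ictid` and `ectid` as described above; the actual parameter names on the site may differ:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Assumed tracking parameter names from this thread (verify against the site)
TRACKING_PARAMS = {"ictid", "ectid"}

def clean_url(url):
    """Strip tracking parameters, keeping any legitimate query parameters."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(clean_url("https://example.com/shop/page?ictid=nav-footer&color=blue"))
# -> https://example.com/shop/page?color=blue
```

The output of `clean_url` is what the canonical tag on each tracked URL should point to.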
The sitemap is a more complicated issue as they haven't had one created in a little more than a year, so it is very out of date. We are working with them to get a clean version of their sitemap in place after we restructure some of their navigation and content.
Do you think there is value in submitting a clean sitemap (without the tracking links) before switching over to HTTPS, or should we wait until after that change is made?
Thanks again for the reply. This is one of the most complicated sites I've ever tackled. Glad to hear I am on the right track!
-
Hey there,
I think you've given your client some good advice. Just to make sure we're all on the same page about how to handle duplicate content created by tracking parameters:
- Make sure to keep the XML sitemap up to date, and only include the canonical versions of URLs
- Add canonical tags pointing all parameterized URLs back to a single source URL, without parameters
- Mark those tracking parameters as "Doesn't affect page content (tracks usage)" in Google Search Console
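For the first point, a minimal sitemap fragment might look like the following; the URL is a placeholder, and the key detail is that only clean, canonical URLs are listed, never the parameterized variants:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Canonical, parameter-free URLs only -->
  <url>
    <loc>https://www.example.com/products/widget</loc>
  </url>
</urlset>
```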
The parameter issue is something I would fix ASAP, then tackle https when that comes around. At that point, you'll need to make sure ALL http pages 301 redirect to https versions of the page. I haven't worked with Omniture much, but make sure that doesn't break tracking.
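When the HTTPS migration does happen, the blanket HTTP-to-HTTPS 301 is commonly handled at the server level. A rough Apache `.htaccess` sketch is below; this assumes Apache with mod_rewrite, so adjust for the client's actual stack and test on staging first:

```apache
# Redirect all HTTP requests to the HTTPS equivalent with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

Doing this server-wide avoids having to maintain thousands of page-by-page redirects.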
Good luck!
Kristina