Tracking links and duplicate content
-
Hi all,
I have a bit of a conundrum for you all pertaining to a tracking link issue I've run into on a client's site. The site has a serious duplicate content problem: over 15,000 pages are being crawled (per Screaming Frog), but only 7,000 or so are legitimate pages, in the sense that they are not duplicates of one another.
The client is using Omniture instead of Google Analytics, with an advanced tracking system for internal and external links (ictids and ectids) in the URL parameters. This is creating thousands of duplicate pages that Google is crawling (as seen in their Search Console and in Screaming Frog).
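For illustration, the pattern looks something like this (the URLs are hypothetical, not the client's); all three addresses serve the same page:

    https://www.example.com/services/                      <- clean, canonical URL
    https://www.example.com/services/?ictid=nav-header     <- internal tracking variant
    https://www.example.com/services/?ectid=email-spring   <- external tracking variant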
They are also in the middle of moving from http to https, and thousands of pages currently resolve on both protocols, which again creates a duplicate content issue.
What I have suggested for the tracking links is configuring those URL parameters in Search Console. I've also suggested they add a canonical tag to every tracking-link URL pointing to the clean page, so that the pages that have already been indexed consolidate to the correct clean URL. Does this seem like the appropriate strategy?
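In case it helps anyone following along, here's a minimal sketch of what that canonical tag would look like in the head of a parameter-ed page (the URLs are placeholders, not the client's):

    <!-- Served at https://www.example.com/services/?ictid=nav-header -->
    <!-- Tells search engines to consolidate indexing signals to the clean URL -->
    <link rel="canonical" href="https://www.example.com/services/" />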
Additionally, I've told them that before they submit a new sitemap to Google, they need to switch the website over to https to avoid worsening their duplicate content issue. They have not submitted a sitemap to Google Search Console since March 2015.
Thank you for any help you can offer!
-
Personally, I would submit a clean sitemap ASAP. It's helpful whenever you upload it, and SEO fixes are best made as soon as possible; otherwise you're just leaving traffic on the table.
Plus, I'm skeptical that a move to https will be fast.
That said, there's no reason why you can't move your site to https without already having a clean XML sitemap for the http version. So it's really up to you.
Sorry, that's a little ambiguous! Such is SEO.
Good luck!
Kristina
-
Hey Kristina,
Thanks for the reply! Yes, I have already gone ahead and configured their URL parameters accordingly. I've asked their developer to go through and add canonical tags pointing all tracking-link URLs to the clean URL.
The sitemap is a more complicated issue: they haven't had one created in a little more than a year, so it is very out of date. We are working with them to get a clean version in place after we restructure some of their navigation and content.
Do you think there is value in submitting a clean sitemap (without the tracking links) before switching over to https, or is it better to wait until after that change is made?
Thanks again for the reply. This is one of the most complicated sites I've ever tackled. Glad to hear I am on the right track!
-
Hey there,
I think you've given your client some good advice. Just to make sure we're all on the same page about how to handle duplicate content created by tracking parameters:
- Make sure to keep the XML sitemap up to date, and only include the canonical versions of URLs (a minimal example follows this list)
- Canonicalize all parameter-ed URLs back to a single source URL, without parameters
- Mark those tracking parameters as "Doesn't affect page content (tracks usage)" in Google Search Console
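To make the sitemap point concrete, here's a minimal sketch of a clean sitemap (hypothetical URLs); note that only clean, canonical https URLs appear, never the ?ictid/?ectid variants or http duplicates:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
      </url>
    </urlset>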
The parameter issue is something I would fix ASAP, then tackle https when that comes around. At that point, you'll need to make sure ALL http pages 301 redirect to their https versions. I haven't worked with Omniture much, but make sure the redirects don't break tracking.
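For reference, one common way to force that site-wide redirect on an Apache server (a sketch assuming Apache with mod_rewrite enabled; nginx and IIS have their own equivalents):

    # .htaccess: 301 redirect every http request to its https equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]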
Good luck!
Kristina