Duplicate content due to csref
-
Hi,
When I go through my pages, I can see that a lot of my csref codes result in duplicate content when SEOmoz runs its analysis of my pages.
Of course I get important knowledge through my csref codes, but I'm quite uncertain how much it affects my SEO results.
Does anyone have any insights into this? Should I be more cautious about using csref codes, or doesn't it create problems big enough for me to worry about?
-
Yes, to set up rel-canonical properly, every page that could conceivably be tagged with a csref= parameter should have a self-referencing canonical. The tags are easy to set up, in theory, but once you get into a large site and/or CMS, setting them up on dozens or hundreds of pages can be tricky. Ultimately, it's a more effective approach that has some other benefits (like scooping up stray duplicates that may have been created by other URL parameters), but it really depends on your development resources and how complex your site is.
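For reference, the tag itself is just a single line in each page's <head>. A minimal sketch with a made-up path (not one of your real URLs):
<link rel="canonical" href="http://www.example.com/some-page/index.html" />
The clean page and every csref-tagged variant of it (e.g. http://www.example.com/some-page/index.html?csref=Some_Campaign) would all carry that same href, so the duplicates collapse back to the one clean URL.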
-
Hi,
Thanks for the quick and competent reply.
I guess the reason Google has only registered 8 pages is that many of the pages we use csref on are campaign pages, and they haven't "lived" for very long yet.
As I understand you, there are two ways to proceed with this: one being telling Google in Webmaster Tools to ignore the csref parameter, and the other being the canonical links.
The first is quite straightforward, I guess - it's just a matter of registering in Google Webmaster Tools that the csref parameter should be ignored on all URLs under www.tryg.dk.
The latter I'm less sure how to proceed with - is it a matter of giving every page that has a csref a canonical link? Or what is the best way to go about that?
-
The good news is that you only seem to have about 8 of these pages in the Google index. You can use this query on Google to see them:
site:www.tryg.dk inurl:csref
Ideally, I'd use the canonical tag on those pages to strip out the parameter and de-index any duplicates, but across the site that can be tricky. You could also tell Google Webmaster Tools to ignore the csref parameter via parameter handling - it's not quite as robust a solution, but it's a lot easier to implement.
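To make the canonical option concrete with the samarbejdspartnere page from your example (a sketch, assuming the parameter-free URL is the version you want indexed), all three versions of the page - the clean one and both csref variants - would carry the same tag in their <head>:
<link rel="canonical" href="http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/index.html" />
Google should then fold the csref URLs into that one clean URL.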
-
Hi,
Thanks for your reply.
It is exactly a URL generated based on our tracking codes. For example, when I look at the list of duplicate content for our site here in SEOmoz, I get the following URLs:
http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/index.html?csref=Disclaimer_Nordea
http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/index.html?csref=Bundmenu_Om_Tryg_Vores_Partnere
http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/index.html
The last being the "original" page, and the two above it being URLs generated by our Omniture tracking via the csref parameter.
So my question is how I make sure this does not have a negative effect on our SEO.
-
Apologies, but I'm not familiar with the csref parameter - could you tell me what information it passes or give me a sample URL (you can make it generic and mask your domain info)?
It sounds like some kind of tracking code, in which case it can definitely start to create duplicate content issues. You could probably use the rel=canonical tag to make Google "collapse" those pages, or you could tell Google to ignore the parameter in Google Webmaster Tools. Neither should impact your tracking.