Duplicate content due to csref
-
Hi,
When I go through my pages, I can see that a lot of my csref codes result in duplicate content when SEOmoz runs its analysis of my pages.
Of course I get important knowledge through my csref codes, but I'm quite uncertain how much they affect my SEO results.
Does anyone have any insight into this? Should I be more cautious about using csref codes, or does it not create problems big enough for me to worry about?
-
Yes, to set up rel-canonical properly, every page that could conceivably be tagged with a csref= parameter should have a self-referencing canonical. The tags are easy to set up, in theory, but once you get into a large site and/or CMS, setting them up on dozens or hundreds of pages can be tricky. Ultimately, it's a more effective approach that has some other benefits (like scooping up stray duplicates that may have been created by other URL parameters), but it really depends on your development resources and how complex your site is.
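As a concrete sketch of what "self-referencing canonical" means: it is just a link element in the head of each page, pointing at the page's own clean URL with no csref parameter (the samarbejdspartnere URL below is one mentioned elsewhere in this thread):

```html
<!-- Placed in the <head> of the page. Both the clean URL and every
     ?csref=... variation of it carry this same tag, so Google folds
     the tagged copies into the clean one. -->
<link rel="canonical" href="http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/index.html" />
```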
-
Hi,
Thanks for the quick and competent reply.
I guess the reason Google has only registered 8 pages is that many of the pages we have csref on are campaign pages, and they haven't "lived" for very long yet.
As I understand you, there are two ways to proceed with this: one being telling Google Webmaster Tools that it should ignore the csref parameter, and the other being the canonical links.
The first is quite straightforward, I guess; it's just a matter of registering in Google Webmaster Tools that the csref parameter should be ignored for all URLs with www.tryg.dk as the main domain.
The latter I'm not so sure how to proceed with. Is it a matter of marking every page with csref with a canonical link? Or what is the best way to go about that?
-
The good news is that you only seem to have about 8 of these pages in the Google index. You can use this query on Google to see them:
site:www.tryg.dk inurl:csref
Ideally, I'd use the canonical tag on those pages to strip out the parameter and de-index any duplicates, but across the site that can be tricky. You could also tell Google Webmaster Tools to ignore the csref parameter via parameter handling - it's not quite as robust a solution, but it's a lot easier to implement.
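If the canonical URL has to be generated by a template or CMS hook rather than hard-coded per page, the logic is simply "current URL minus the tracking parameter". A minimal sketch in Python (the parameter name csref comes from this thread; the function name and everything else here is illustrative, not any particular CMS's API):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonical_url(url, tracking_params=("csref",)):
    """Return the URL with the given tracking parameters stripped,
    suitable for use in a rel=canonical link element."""
    parts = urlparse(url)
    # Keep every query parameter except the tracking ones.
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in tracking_params]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url(
    "http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/"
    "index.html?csref=Disclaimer_Nordea"))
# -> http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/index.html
```

Emitting that cleaned URL in the canonical tag on every page (including the clean URL itself) gives you the self-referencing setup in one place, without touching the tracking links.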
-
Hi,
Thanks for your reply.
It is exactly a URL generated based on our tracking codes. For example, when I look in the list of duplicate content on our page here in SEOmoz, I get the following URLs:
http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/index.html?csref=Disclaimer_Nordea
http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/index.html?csref=Bundmenu_Om_Tryg_Vores_Partnere
http://www.tryg.dk/om-tryg/fakta-om-tryg/samarbejdspartnere/index.html
The latter being the "original" page, and the two above being URLs with csrefs appended, which are generated by our tracking via Omniture.
So my question is how I make sure this does not have a negative effect on our SEO.
-
Apologies, but I'm not familiar with the csref parameter - could you tell me what information it passes or give me a sample URL (you can make it generic and mask your domain info)?
It sounds like some kind of tracking code, in which case it can definitely start to create duplicate content issues. You could probably use the rel=canonical tag to make Google "collapse" those pages, or you could tell Google to ignore the parameter in Google Webmaster Tools. Neither should impact your tracking.