Is tracking code added to the end of a URL considered duplicate content?
-
I have two URLs, one with a tracking code and one without:
http://www.towermarketing.net/lets-talk-ux-baby and
http://www.towermarketing.net/lets-talk-ux-baby/#.U6ghgLEz64I
My question is: will these be treated as two separate URLs? Will Google consider them two pages with duplicate content? Any recommendations would be much appreciated.
-
Good advice. Thanks
-
Very much appreciated
-
Most people use these URLs for tracking only, and set them to noindex so Google doesn't pick them up. I'm guessing you have tracking set up to analyze the traffic and attachment rate for that article? A hash fragment is generally used to point to an anchor on that page. You currently have two URLs with the same content.
As Oleg said, I would be sure to add a canonical link for that page.
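A canonical link is a single line in the page's head. A minimal sketch, assuming the clean URL from the question is the version you want Google to index:

```html
<head>
  <!-- Tells search engines that any variant of this URL (for example,
       with an AddThis #.xxx fragment appended) is the same single page -->
  <link rel="canonical" href="http://www.towermarketing.net/lets-talk-ux-baby/" />
</head>
```

The tag goes on the page itself, so every variant that serves the same HTML declares the same preferred URL.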
-
I agree totally with what you said. One thing I would like to add, and this is just my assumption: the #. tracking pattern is unique to AddThis. I would be willing to bet that Google somehow keeps track of these unique tracking patterns and disregards them.
-
It shouldn't (it's a hash anchor, not a new query parameter), but add a canonical URL to the page anyway to avoid any duplicate content issues.
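The distinction this answer draws can be checked directly: the fragment (everything after #) is never sent to the server, while a query parameter is part of the request URL. A quick sketch using the WHATWG URL class built into Node.js, with the two URLs from the question (the utm_source example is hypothetical, for contrast):

```javascript
// Both URLs from the question, parsed with the built-in URL class.
const tagged = new URL("http://www.towermarketing.net/lets-talk-ux-baby/#.U6ghgLEz64I");
const clean  = new URL("http://www.towermarketing.net/lets-talk-ux-baby/");

// The path the server (and Google) sees is identical;
// the hash lives only in the browser.
console.log(tagged.pathname === clean.pathname); // true
console.log(tagged.hash);                        // "#.U6ghgLEz64I"
console.log(tagged.search);                      // "" (no query string)

// A real tracking *parameter* would change the URL that gets requested:
const utm = new URL("http://www.towermarketing.net/lets-talk-ux-baby/?utm_source=newsletter");
console.log(utm.search);                         // "?utm_source=newsletter"
```

That is why a `#.xxx` AddThis suffix is less risky than a `?utm_...` parameter, and why the canonical tag is still cheap insurance for the parameter case.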
Related Questions
-
URL Keywords
I am doing SEO on our eCommerce website and read that I should include keywords in the URL. The original URL is: http://thegiftlinks.com/personalized-wedding-glass.html
On-Page Optimization | abdulw
Page title: Wedding gift Dubai - Anniversary gift Dubai - Personalized Wedding Glass
Meta Data:
Wedding gift Dubai - Anniversary gift Dubai - Personalized Wedding Glass
It is great as a wedding gift or anniversary gift for friends and family members. If I include the keyword in the URL, it will look like this:
http://thegiftlinks.com/personalized-wedding-glass.html/Wedding-gift-Dubai
Is this the correct way to include keywords in the URL? Thanks
-
Duplicate Content, Same Company?
Hello Moz Community, I am doing work for a company and they have multiple locations. For example, examplenewyork.com, examplesanfrancisco.com, etc. They also have the same content on certain pages within each website. For example, examplenewyork.com/page-a has the same content as examplesanfrancisco.com/page-a. Does this duplicate content negatively impact us? Or could we rank for each page within each location parameter (for example, people in New York searching page-a would see our web page, and people in San Francisco searching page-a would see our web page)? I hope this is clear. Thanks, Cole
On-Page Optimization | ColeLusby
-
"Issue: Duplicate Page Content " in Crawl Diagnostics - but these pages are noindex
Saw an issue back in 2011 about this, and I'm experiencing the same thing. http://moz.com/community/q/issue-duplicate-page-content-in-crawl-diagnostics-but-these-pages-are-noindex We have pages that are meta-tagged as no-everything for bots but are being reported as duplicates. Any suggestions on how to exclude them from the Moz bot?
On-Page Optimization | Deb_VHB
-
Ecommerce website canonical and duplicate content issue
I have an ecommerce site, and I am wondering if anyone could help me answer this. The more-info page can be accessed at all of the following URLs; will Google consider them duplicates, and if so, how do I best use the canonical tag? http://domain.com/product-page http://domain.com/product-page/ http://domain.com/product-Page http://domain.com/product-Page/ Also, in Zen Cart, linking products creates duplicate page content; how do I tackle that? Many thanks
On-Page Optimization | conversiontactics
-
Checking Duplicate Content
Hi there, We are migrating to a new website and writing lots of new content for it. The new website is hosted on a development site that is password protected, and so on, so that it cannot be indexed. What I would like to know is: how do I check for duplicate content issues out there on the web when the dev site is password protected? Hope this makes sense. Kind Regards,
On-Page Optimization | Paul78
-
Duplicate content on homepage?
Hi, I have just created a new campaign and it states that I have duplicate page content, which would affect search rankings. Basically, it is counting my site www.mydomain.com and www.mydomain.com/index.php as two separate pages. How can I make it so that only www.mydomain.com is visible, reducing the duplicate content issue? Many Thanks
On-Page Optimization | idv
-
Quick and easy Joomla 1.5 Duplicate content fix?
www.massduitrialalwyers.com has a TON of duplicate content based on the way Joomla 1.5 uses articles. Do you have a tried-and-true method to eliminate the issues (automated would be preferred)? If not, might you suggest a plugin that takes care of the rel canonical?
On-Page Optimization | Gaveltek-173238
Cheers
-
Duplicate content issue with dynamically generated url
Hi, For those who have followed my previous question, I have a similar one regarding dynamically generated URLs. From this page http://www.selectcaribbean.com/listing.html the user can make a selection according to various criteria. Six results are presented, and then the user can go to the next page. I know I should probably rewrite URLs such as these: http://www.selectcaribbean.com/listing.html?pageNo=1&selType=&selCity=&selPrice=&selBeds=&selTrad=&selMod=&selOcean= but since all the results presented are basically generated on the fly for the convenience of the user, I am afraid Google may consider this an attempt to generate more pages, as there are already pages for each individual listing. What is my solution for this? Nofollow these pages? Block them through robots.txt?
On-Page Optimization | multilang