New AddThis URL Sharing
-
So, AddThis just added a cool feature that attempts to track when people share URLs by copying and pasting the address from the browser.
It appears to do so by adding a URL fragment on the end of the URL, hoping that the person sharing will cut and paste the entire thing. That seems like a reasonable assumption to me.
Unless I misunderstand, it seems like it will add a fragment to every URL (since it's trying to track all of 'em). Probably not a huge issue for the search engines when they crawl, as they'll hopefully discard the fragment, or ignore the JS that appends it.
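As a rough sketch of the general technique (this is not AddThis's actual code; the token format and function names below are made up for illustration), the appended fragment might be built like this:

```javascript
// Illustrative sketch only, not AddThis's real implementation.
// The idea: generate a short random token and append it as a URL
// fragment, so a copied-and-pasted link can be traced back.

// Build a short random token to identify this share.
function makeTrackingToken(length = 6) {
  const chars =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
  let token = "";
  for (let i = 0; i < length; i++) {
    token += chars[Math.floor(Math.random() * chars.length)];
  }
  return token;
}

// Replace any existing fragment with the tracking one.
function addTrackingFragment(pageUrl, token) {
  const url = new URL(pageUrl); // global URL exists in browsers and modern Node
  url.hash = "#sthash." + token; // hypothetical fragment format
  return url.toString();
}
```

In a browser, something like this would run on a copy event, so every URL the visitor pastes elsewhere carries the fragment.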
But what about backlinks? Natural backlinks that someone might post to, say, their blog, by doing exactly what AddThis is attempting to track: copying and pasting the link.
What are people's thoughts on what will happen when this occurs, and the search engines crawl that link, fragment included?
-
Thanks, Ryan.
-
I am not sure why you received the malware alert. Here is a direct link to the video on viddler: http://www.viddler.com/explore/jpozadzides/videos/2/
I can share that I used TYNT. Every page of my content had a hashtag on it, and I never saw a search result with a hashtag. I never saw any indication in GWMT that my site used hashtags.
Matt clearly says, "Google takes a URL and truncates at the hash mark. If you have bla-bla-bla #3 and bla-bla-bla #4, those both get treated, or canonicalized, as the same URL."
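A minimal sketch of the truncation Matt describes: everything from the hash mark onward is dropped, so URLs that differ only by their fragment collapse to one canonical URL.

```javascript
// Truncate a URL at the hash mark, as Matt describes Google doing.
function canonicalize(rawUrl) {
  const hashIndex = rawUrl.indexOf("#");
  return hashIndex === -1 ? rawUrl : rawUrl.slice(0, hashIndex);
}

canonicalize("http://example.com/page#3"); // → "http://example.com/page"
canonicalize("http://example.com/page#4"); // → "http://example.com/page"
// Both fragments canonicalize to the same URL.
```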
-
Seems like Rand concurred back in 2009:
http://www.seomoz.org/blog/whiteboard-friday-using-the-hash
Useful stuff. About halfway down the comments on the above link, Rand mentions needing specific analytics code to track things accurately. Anyone have experience with Google Analytics and # symbols?
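For what it's worth, classic Google Analytics records only the path and query string by default, so fragments don't show up in reports unless you send a virtual pageview path yourself. A hedged sketch (the helper name is my own, not a GA API):

```javascript
// Build a pageview path that includes the URL fragment. The loc
// argument mimics window.location ({ pathname, search, hash }).
function pathWithFragment(loc) {
  return loc.pathname + loc.search + loc.hash;
}

// In the page, with the classic async GA snippet, you could then send:
//   _gaq.push(['_trackPageview', pathWithFragment(window.location)]);
```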
By the way, Ryan, that link you posted is being flagged by Avast as containing malware. No idea if it's real or not.
-
I was just watching a Matt Cutts video from 2007. Yes, I know that would be considered the dark ages of SEO, but I believe the video is still relevant to this topic.
About 22 minutes in, Matt says that when Google encounters a hash mark in a URL, they truncate it.
http://onemansblog.com/2007/08/04/matt-cutts-lecture-whitehat-seo-tips-for-bloggers/
-
The hashtags do not appear in the SERPs.
-
Hi Ryan,
Thanks for the response!
My interest isn't so much about visitors being able to follow the backlink or not, but how the SEs will index them. When an SE crawls a site with URL fragments, my experience has been that they do a good job of discarding them.
What I'm seeing is two possibilities:
-
The SE's will discard the fragment when they crawl, and simply index the page as if it didn't have a fragment on the end, meaning a backlink with a fragment is identical to one without. Or,
-
They won't discard the fragment, and we'll end up with duplicates in the SERPs, which would, in part, be dealt with via a canonical tag.
It's great that you've used a similar service with TYNT.com. Do you have any experience with how the SEs behave when crawling a link from TYNT and indexing that page?
Cheers.
-
-
This is nothing new to the web, just new to AddThis. TYNT.com offers an identical service. I have used them for some time, but since I use AddThis for social sharing, it is more convenient for me to move this service to AddThis and eliminate one vendor.
The hashtag that is added to the end of URLs is there for tracking purposes. You can remove it or alter it, and you will still wind up on the exact same page. The hashtag has no effect on backlinks other than to track them.