Google URL Shortener: Should I use one or multiple?
-
I have a client with a number of YouTube videos. I'm using Google URL Shortener so the link will show in the YouTube description text (as it's a long URL).
Many of these links go to the same page, e.g. .com/services-page.
Should all the videos share a single short URL pointing to .com/services-page, or should each video get its own unique short URL? If they're unique, might Google think I'm trying to manipulate results?
Thanks in advance. I'm just not sure on this one and hope someone knows the best practice here.
-
I agree with Eric, and I also think this may be a good use case for UTM tracking URLs. You could easily set them up using, say, the video titles as your utm_content value, then shorten the URLs with the UTM parameters attached. Google provides a URL builder tool for this.
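As a minimal sketch of the idea above: each video gets the same landing page URL but a distinct utm_content derived from its title. The parameter values (utm_source, utm_medium, utm_campaign) and the example.com domain are illustrative assumptions, not anything from the thread.

```python
# Build a UTM-tagged URL per video; the tagged URL is what you would
# then pass to the shortener. Parameter values here are assumptions.
from urllib.parse import urlencode

def utm_url(base: str, video_title: str) -> str:
    params = {
        "utm_source": "youtube",
        "utm_medium": "video",
        "utm_campaign": "services",
        # Slugified video title so each video is distinguishable in analytics
        "utm_content": video_title.lower().replace(" ", "-"),
    }
    return f"{base}?{urlencode(params)}"

print(utm_url("https://example.com/services-page", "How We Work"))
# -> https://example.com/services-page?utm_source=youtube&utm_medium=video&utm_campaign=services&utm_content=how-we-work
```

Because only the query string differs per video, every short link still resolves to the same page while your analytics can attribute clicks to individual videos.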
-
Keep in mind that a Google short URL is a 301 redirect to the target URL. It would actually be better to use the full URL if possible, since that avoids the 301 redirect; you typically lose some "link juice" when a link passes through a 301.
If you can't use the full URL and you want to use a shortener, consider one that gives you statistics (such as bit.ly). That way you can tell which video is sending traffic to your site and getting clicks. In that case, I would go with a unique short URL for each video.
-
I don't know the SEO implications, but if you use a unique URL on each YouTube video it'll be easier to track which ones get more clicks (assuming Google makes that info available from the URL shortener; I haven't used it).