Google URL Shortener: should I use one short URL or multiple?
-
I have a client with a number of YouTube videos. I'm using Google URL Shortener so the link displays in the YouTube description text (as it's a long URL).
Many of these links go to the same page, e.g. .com/services-page
Should I use a single short URL across all the videos linking to .com/services-page, or should each video get its own unique short URL? If unique, would Google possibly think I'm trying to manipulate results?
Thanks in advance. I'm just not sure on this one and hope someone knows the best practice here.
-
I agree with Eric, and I also think this may be a good use for UTM tracking URLs. You could easily set them up using, say, the video titles as your utm_content value. You could then shorten the URLs with the UTM parameters included. Google provides a URL builder tool for exactly this.
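If you go the UTM route, the tagged URLs are easy to generate in bulk before shortening them. A minimal Python sketch (the domain and parameter values are hypothetical examples, not from the original question):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def build_utm_url(base_url, source, medium, campaign, content):
    """Append UTM tracking parameters to a landing-page URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
        "utm_content": content,  # e.g. the video title, per video
    })
    scheme, netloc, path, query, frag = urlsplit(base_url)
    # Preserve any query string already on the URL
    query = f"{query}&{params}" if query else params
    return urlunsplit((scheme, netloc, path, query, frag))

# Hypothetical example values:
url = build_utm_url("https://example.com/services-page",
                    "youtube", "video", "services", "intro-video")
# → https://example.com/services-page?utm_source=youtube&utm_medium=video&utm_campaign=services&utm_content=intro-video
```

Each video gets the same landing page but a distinct utm_content, so analytics can separate them even after the URL is shortened.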
-
Keep in mind that a Google short URL is a 301 redirect to the target URL. It would actually be better to use the full URL if possible, since you typically lose some "link juice" when it passes through a 301 redirect.
If you can't use the full URL and you want a shortener, consider one that gives you statistics (such as bit.ly). That way you can actually tell which video is sending traffic to your site and getting clicks. In that case, I would go with a unique short URL for each video.
-
I don't know the SEO implications, but if you use a unique URL on each YouTube video it'll be easier to track which ones get more clicks (assuming Google makes that info available from the URL shortener; I haven't used it).
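To illustrate the tracking point: once each video's link carries its own tag, per-video click counts fall straight out of the landing-page request log. A rough sketch in Python, assuming the URLs carry a utm_content parameter as suggested in the earlier answer (the sample URLs are hypothetical):

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qs

def clicks_per_video(landing_urls):
    """Count landing-page hits per video, keyed by utm_content."""
    counts = Counter()
    for url in landing_urls:
        qs = parse_qs(urlsplit(url).query)
        # parse_qs returns a list of values per key; take the first
        content = qs.get("utm_content", ["(untagged)"])[0]
        counts[content] += 1
    return counts

# Hypothetical log entries:
hits = [
    "https://example.com/services-page?utm_content=video-a",
    "https://example.com/services-page?utm_content=video-b",
    "https://example.com/services-page?utm_content=video-a",
    "https://example.com/services-page",
]
# clicks_per_video(hits) → Counter({'video-a': 2, 'video-b': 1, '(untagged)': 1})
```

With a single shared short URL, all of these hits would be indistinguishable; unique tagged URLs are what make the per-video breakdown possible.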