Google URL Shortener: should I use one or multiple?
-
I have a client with a number of YouTube videos. I'm using Google URL Shortener so the link displays cleanly in the YouTube description (as it's a long URL).
Many of these links go to the same page, e.g. .com/services-page.
Should I use the same short URL in every video that links to .com/services-page, or should each video get a unique short URL? If unique, might Google think I'm trying to manipulate results?
Thanks in advance. I'm just not sure on this one and hope someone knows the best practice here.
-
I agree with Eric, and I also think this may be a good use case for UTM tracking URLs. You could set them up using, say, the video titles as your utm_content values, then shorten the URLs with the UTM parameters attached. Google has a URL builder tool for exactly this.
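If it helps, here's a minimal sketch of assembling a UTM-tagged URL before shortening it. The utm_source/utm_medium/utm_campaign/utm_content names are the standard Google Analytics parameters, but every value below (and the example.com landing page) is made up for illustration:

```python
from urllib.parse import urlencode, urlparse, urlunparse

def build_utm_url(base_url, source, medium, campaign, content):
    """Append standard UTM parameters to a landing-page URL."""
    parts = urlparse(base_url)
    query = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
        "utm_content": content,  # e.g. the video title
    })
    return urlunparse(parts._replace(query=query))

url = build_utm_url(
    "https://example.com/services-page",
    source="youtube",
    medium="video",
    campaign="client-videos",
    content="intro-video",
)
print(url)
```

You'd then feed the resulting URL into the shortener, and each video's clicks show up separately in Analytics under its utm_content.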
-
Keep in mind that a Google short URL is a 301 redirect to the destination URL. It would actually be better to use the full URL if possible, since a link typically loses some "link juice" when it passes through a 301 redirect.
If you can't use the full URL and you want a shortener, consider one that gives you statistics (such as bit.ly). That way you can tell which videos are actually sending traffic to your site. In that case, I would go with a unique short URL for each video.
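To illustrate the "unique short URL per video" idea: all that matters is that each video gets a distinct, stable code pointing at the same landing page, so clicks can be attributed per video. A toy sketch (this is not how bit.ly or Google assign their codes, purely illustrative):

```python
import hashlib

def short_code(video_title, length=7):
    """Derive a stable, distinct code from a video title.

    Illustrative only; real shorteners assign their own codes.
    """
    digest = hashlib.sha1(video_title.encode("utf-8")).hexdigest()
    return digest[:length]

# Every video gets its own short link, all resolving to the same
# landing page, so click counts can be broken out per video.
landing = "https://example.com/services-page"
videos = ["Intro to Our Services", "Customer Testimonials"]
links = {title: (short_code(title), landing) for title in videos}
```

In practice the shortener does this for you; the point is just that one short URL per video buys you per-video click data for free.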
-
I don't know the SEO implications, but if you use a unique URL on each YouTube video it'll be easier to track which ones get more clicks (assuming Google makes that info available from the URL shortener; I haven't used it).