Google URL Shortener: should I use one short URL or multiple?
-
I have a client with a number of YouTube videos. I'm using Google URL Shortener so the link fits in the YouTube description text (as the full URL is long).
Many of these links go to the same page, e.g. .com/services-page
Should I use the same short URL in every video that links to .com/services-page, or should each video get a unique one? If unique, might Google think I'm trying to manipulate results?
Thanks in advance. I'm just not sure on this one and hope someone knows the best practice here.
-
I agree with Eric, and I also think this may be a good use for UTM tracking URLs. You could easily set them up using, say, the video titles as your utm_content, and then shorten the UTM-tagged URLs. Google has a great URL builder tool for this.
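To illustrate, tagging each video's destination URL before shortening might look like this. A minimal Python sketch; the domain and the utm values are placeholders, not the client's actual setup:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def add_utm(url, source, medium, content):
    """Append UTM parameters to a URL, preserving any existing query string."""
    parts = urlsplit(url)
    utm = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_content": content,
    })
    query = parts.query + ("&" if parts.query else "") + utm
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

# One tagged URL per video, using the video title as utm_content
tagged = add_utm("https://example.com/services-page",
                 source="youtube", medium="video", content="intro-video")
print(tagged)
# https://example.com/services-page?utm_source=youtube&utm_medium=video&utm_content=intro-video
```

You'd then paste each tagged URL into the shortener, so every video gets its own trackable short link even though they all land on the same page.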
-
Keep in mind that a Google short URL is a 301 redirect to the destination URL. It would actually be better to use the full URL if possible, since that avoids the redirect; you typically lose some "link juice" when it's passed through a 301.
If you can't use the full URL and you want to use a shortener, consider one that gives you statistics (such as bit.ly). That way you can actually tell which video is sending traffic to your site and getting clicks. In that case, I would go with a unique short URL for each video.
-
I don't know the SEO implications, but if you use a unique URL on each YouTube video it'll be easier to track which ones get more clicks (assuming Google makes that info available from the URL shortener; I haven't used it).
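If each video does get its own short URL, totalling up the clicks per link from a stats export is straightforward. A sketch assuming a hypothetical export of (short_url, clicks) rows; the bit.ly links here are made-up examples:

```python
from collections import Counter

# Hypothetical export rows: (short_url, clicks) per reporting period
rows = [
    ("https://bit.ly/vid-one", 120),
    ("https://bit.ly/vid-two", 45),
    ("https://bit.ly/vid-one", 30),
]

# Sum clicks per short URL across all periods
totals = Counter()
for url, clicks in rows:
    totals[url] += clicks

# Rank the videos by total clicks, best first
for url, clicks in totals.most_common():
    print(url, clicks)
```

Keeping a simple mapping of short URL to video title alongside this makes the report readable at a glance.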