Best Practices for Getting Shares, Likes, Etc. on Social Media?
-
I wanted to know what the best practices are for getting people to retweet, like, share, and +1 your website content on Google Plus.
What has worked best for all of you, and what has worked not so well?
Thanks!
-
Keith,
Thanks for your quick response.
Do you recommend share or like buttons that display a count next to them, rather than ones that do not?
-
Agreed on timing. Here's a trendy little infographic we did that might be helpful: http://www.fannit.com/social-media-infographic-when-are-the-best-times-to-post/
-
Yes, we use Sharebar for our WordPress sites (you can find HTML code as well)! It's a must-have. I'm sorry I forgot to mention that in my original answer, but you really should have some sort of share function that includes some or all of the following:
Google Plus
Like
Tweet
Pin
Share On Facebook
Stumble
Reddit

Look at the 2013 ranking factors that Moz just put out. Social shares have a big impact on rankings now, especially Google Plus: http://moz.com/blog/future-of-search-ranking-factors
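If you'd rather not rely on a plugin, most of the networks above also expose simple share-intent URLs you can link to directly. A minimal sketch of building those links (the endpoint paths below reflect each network's commonly documented share URLs, but verify them before shipping; they are assumptions, not taken from this thread):

```python
from urllib.parse import urlencode

# Share-intent endpoints for the networks listed above:
# (base URL, name of the query parameter that carries the page URL)
SHARE_ENDPOINTS = {
    "google_plus": ("https://plus.google.com/share", "url"),
    "facebook": ("https://www.facebook.com/sharer/sharer.php", "u"),
    "twitter": ("https://twitter.com/intent/tweet", "url"),
    "pinterest": ("https://www.pinterest.com/pin/create/button/", "url"),
    "reddit": ("https://www.reddit.com/submit", "url"),
}

def share_links(page_url):
    """Build a share-intent link for each network, URL-encoding the page URL."""
    return {
        network: f"{endpoint}?{urlencode({param: page_url})}"
        for network, (endpoint, param) in SHARE_ENDPOINTS.items()
    }

for network, link in share_links("http://example.com/my-post").items():
    print(network, link)
```

Each generated link opens the network's share dialog pre-filled with your page URL, which is all most "share bar" plugins do under the hood.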
-
So here's part two of my question.
I work mainly with auto dealerships, and our sites do have buttons with the Facebook logo, the Twitter logo, etc.
I've noticed that, by contrast, some sites have buttons with a share or retweet count on them. Do those serve as better calls to action to encourage people to share or like?
Thanks!
-
- Create original content with a unique perspective. There is no substitute. Original matters more than "amazing," because what counts as "amazing" differs from audience to audience. If more people are sharing your content, who cares whether it seemed amazing to you or your company?
- Timing: one of the most overlooked factors. If you try creating and seeding content at different times on different days, you will see which days and times work for you. To run a test, we shared the same content on the same network but at different times. The difference can be huge: we have seen swings of 1200% based on timing alone.
- It gets better as you gain traction. Social proof and network effects clearly help as you accumulate shares.
- Don't try to create content for the whole planet. Identify a niche audience initially and target them.
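The timing test described above boils down to comparing share counts for the same content posted at different times and measuring the lift. A minimal sketch of that calculation (the share counts below are hypothetical, chosen only to illustrate the kind of swing mentioned):

```python
def percent_swing(baseline_shares, test_shares):
    """Percent change in shares relative to the baseline posting time."""
    if baseline_shares == 0:
        raise ValueError("baseline share count must be nonzero")
    return (test_shares - baseline_shares) / baseline_shares * 100

# Hypothetical: 15 shares posting at a weak time vs. 195 at a strong time
# is the kind of 1200% swing described above.
print(percent_swing(15, 195))  # 1200.0
```

Run the same comparison per day-of-week and time slot, and the best-performing windows for your audience fall out of the data.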
-
Promoting the heck out of your posts! :) Here's a little insight into our social promotion. We took our website from about 300 hits per month to over 4,000 in a 30-day period by doing social promotion with the following method:
1. Create an amazing article (curated content or your own) - Before you pick a topic offhand, go check the social spaces to see what your target market is already sharing. Now you know what people like!
2. Develop an infographic for the article as well - You'll be able to promote the infographic on sites like Visual.ly, where it may get featured to their user base. Not only that, Visual.ly ranks incredibly well for content in the SERPs. Visuals really help people engage with your content: you'll reach people who would rather look at a graphic as well as those who want to dig deeper into the article.
3. Social promotion - This is a big list of promotion tactics that we use, and it works really well (if you have good content). Here's a link to my guide: http://www.fannit.com/smm/social-media-marketing/. Basically, it's a checklist of things you can do to automate the process a bit. I'd post it here, but it's really long.

I hope that helps!