Is there any SEO value to infographics?
-
I purchased Piktochart to make what they advertised as SEO-friendly infographics. I'm hearing conflicting responses about their SEO value, so I figured I should ask SEOs. The program is easy to use, and you can download your infographic as an XML file. Any responses are welcome. Thank you.
-
They can absolutely have a positive effect on your SEO efforts. One thing you should know, though: simply using a service like Piktochart or one of the other infographic creators and slapping something together quickly isn't going to give you the results you're looking for.
I would recommend designing a custom infographic from scratch, or hiring a designer to do it for you after you provide them with a creative brief.
You also need to plan how you will distribute the infographic. Here is a great YouMoz post on how to promote one: http://www.seomoz.org/ugc/how-to-push-an-infographic
-
Infographics can be excellent vehicles for driving inbound links, thought leadership, and social media activity. You do have to promote them, though. That means proactive outreach to industry sites, a consistent push through your existing social media channels, and so on.
The landing page you host it on should also have a substantial amount of written content. You also want to make it as easy as possible for someone to share the infographic, so providing a copy of the embed code needed to republish it right on the page is a good idea.
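As a rough sketch of that embed-code idea: a copy-and-paste box on the landing page might look something like this (the URLs, dimensions, and alt text are placeholders, not real addresses):

```html
<!-- The infographic itself, with descriptive alt text for search engines -->
<img src="https://example.com/images/infographic.png"
     alt="Infographic: how infographics support SEO" />

<!-- A read-only box visitors can copy to republish the infographic.
     The snippet includes a link back to the original landing page. -->
<textarea readonly rows="4" cols="70">
<a href="https://example.com/infographic-landing-page">
  <img src="https://example.com/images/infographic.png"
       alt="Infographic: how infographics support SEO" width="600" />
</a>
</textarea>
```

When someone pastes that snippet into their own site, the link back to your landing page travels with it, which is where the inbound-link value comes from.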
Good luck!
-
Hi,
Thanks for referring me to that post; I will dig into it further. I know the good infographics use HTML and CSS, so you get the benefit of having your information indexed. I'll look into this product and report back with feedback.
Thanks,
Andrew
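To illustrate the HTML/CSS point above: text that lives in the markup is crawlable, while text baked into a flat image is not (a hypothetical fragment; the class names and the statistic are made up for illustration):

```html
<!-- Crawlable: the statistic is real text, positioned and styled with CSS -->
<div class="stat">
  <span class="stat-number">74%</span>
  <span class="stat-label">of readers share infographics they find useful</span>
</div>

<!-- Not crawlable: the same statistic rendered inside an image file.
     Only the alt attribute is visible to search engines. -->
<img src="stat.png" alt="74% of readers share infographics they find useful" />
```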
-
Simply creating the infographic won't create value on its own. You have to go out and promote it to build social signals and links: tweets, shares, comments, and inbound links when people republish it.
Does this help?