When to Use Schema vs. Facebook Open Graph?
-
I have a client who for regulatory reasons cannot engage in any social media: no Twitter, Facebook, or Google+ accounts. No social sharing buttons allowed on the site. The industry is medical devices.
We are in the process of redesigning their site, and would like to include structured markup wherever possible. For example, there are lots of schema types under MedicalEntity: http://schema.org/MedicalEntity
Given their lack of social media (and no plans to ever use it), does it make sense to incorporate OG tags at all? Or should we stick exclusively to the schemas documented on schema.org?
-
Serendipitous timing: this article was posted yesterday about using markup, how Open Graph and Schema.org are each used, and why to use both:
Facebook Open Graph serves its purpose well, but it doesn’t provide the detailed information search engines need to improve the user experience. A single web page may have many components, and it may talk about more than one thing. Even if you mark up your content for Facebook Open Graph, schema.org provides an additional way to provide more detail about particular entities on the page.
http://searchengineland.com/schema-org-7-things-for-seos-to-consider-post-hummingbird-172163
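To illustrate the article's point, here's a minimal sketch of a page head carrying both vocabularies at once (all names and values are placeholders, not from any real site): the OG tags describe the page as a single shareable object, while a schema.org JSON-LD block describes the medical device entity in more detail.

```html
<head>
  <!-- Open Graph: one coarse description of the page as a shareable object -->
  <meta property="og:type" content="website" />
  <meta property="og:title" content="Example Infusion Pump X100" />
  <meta property="og:description" content="Overview of the X100 infusion pump." />
  <meta property="og:url" content="https://www.example.com/products/x100" />
  <meta property="og:image" content="https://www.example.com/img/x100.jpg" />

  <!-- schema.org: finer-grained detail about the entity on the page.
       MedicalDevice is one of the types under MedicalEntity. -->
  <script type="application/ld+json">
  {
    "@context": "http://schema.org",
    "@type": "MedicalDevice",
    "name": "Example Infusion Pump X100",
    "description": "Overview of the X100 infusion pump.",
    "url": "https://www.example.com/products/x100"
  }
  </script>
</head>
```

Note the two blocks don't conflict: OG lives in `meta` tags with `property` attributes, schema.org lives in its own script block (or in microdata attributes on body elements), so parsers for one simply ignore the other.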
-
I personally would use both. The way I look at OG tags is that you are controlling the consistency of your brand across platforms you do not officially support. In my mind, it's much the same as making a page display correctly in older versions of IE.
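For reference, the set of tags doing that brand-control work is small. A sketch with placeholder values:

```html
<!-- The basic OG tags that control how a shared link renders elsewhere -->
<meta property="og:title" content="Page title as it should appear when shared" />
<meta property="og:description" content="One- or two-sentence summary of the page." />
<meta property="og:image" content="https://www.example.com/share-image.jpg" />
<meta property="og:url" content="https://www.example.com/canonical-page" />
```

Even if you never link to Facebook, these tags determine what the preview card looks like when someone else pastes your URL into a platform that reads them.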
-
OG and Schema can live together in the wild. Both are ways to surface information about the entities they describe.
IMDB is using both OG and Schema to mark up their data:
http://www.imdb.com/title/tt1392170/
-
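As a concrete illustration of the IMDB example above (heavily simplified and paraphrased, not their actual source), the dual markup on a title page looks roughly like this: OG meta tags in the head, schema.org microdata woven into the body.

```html
<head>
  <!-- Open Graph: how the page appears when shared -->
  <meta property="og:type" content="video.movie" />
  <meta property="og:title" content="The Dark Knight Rises" />
</head>
<body>
  <!-- schema.org microdata: machine-readable detail about the movie itself -->
  <div itemscope itemtype="http://schema.org/Movie">
    <h1 itemprop="name">The Dark Knight Rises</h1>
    <span itemprop="datePublished">2012</span>
  </div>
</body>
```
-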
Thanks, Craig. Do you know if any of the OG and schema tags would duplicate or conflict? I see a lot of documentation about using one or the other, but not how to use both harmoniously.
-
Thanks Keri, interesting example. While the GE Healthcare site is more commercial in intent, I like how they've treated the share functionality using the node icon. Subtle, yet shareable.
-
I haven't checked in depth; the regulations are with the FDA, and they aren't the most up-to-date on social media practices! No competitors are using OG yet, but their sites are also very under-optimized.
-
This may be way over the top, but have you checked whether OG tags would violate the regulations at all, or could become a violation down the road? Granted, I haven't read the regulations and I don't think they would...but it's just something I'd double-check. I could see a potential problem if the wording is ambiguous and a competitor wants to stir up trouble for you.
-
Given that other people may share those pages, I would incorporate both OG and Schema on the site.
-
Just because the site can't offer share buttons doesn't mean people aren't going to share it on FB. Just yesterday, I shared http://www3.gehealthcare.com/en/Products/Categories/Accessories_and_Supplies/Adventure_Series_for_CT/Pirate_Island on FB with my friends. I don't have formal experience in this area, but I did want to point that out. There was an article on slate.com about the design of these, and I went looking for more information, and found that page.