When to Use Schema vs. Facebook Open Graph?
-
I have a client who for regulatory reasons cannot engage in any social media: no Twitter, Facebook, or Google+ accounts. No social sharing buttons allowed on the site. The industry is medical devices.
We are in the process of redesigning their site, and would like to include structured markup wherever possible. For example, there are lots of schema types under MedicalEntity: http://schema.org/MedicalEntity
Given their lack of social media (and no plans to ever use it), does it make sense to incorporate OG tags at all? Or should we stick exclusively to the schemas documented on schema.org?
-
Serendipitous timing - this article was posted just yesterday about markup, how Open Graph and Schema.org are each used, and why to use both:
Facebook Open Graph serves its purpose well, but it doesn’t provide the detailed information search engines need to improve the user experience. A single web page may have many components, and it may talk about more than one thing. Even if you mark up your content for Facebook Open Graph, schema.org provides an additional way to provide more detail about particular entities on the page.
http://searchengineland.com/schema-org-7-things-for-seos-to-consider-post-hummingbird-172163
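To make that concrete, here is a minimal sketch of a page `<head>` carrying both vocabularies side by side - OG `meta` tags for sharing platforms, plus a schema.org JSON-LD block for search engines. The site, product name, and URLs are all hypothetical:

```html
<!-- Hypothetical medical-device product page; all names and URLs are made up. -->
<head>
  <title>Example Infusion Pump | Example Devices</title>

  <!-- Open Graph: controls how the page appears when shared on other platforms -->
  <meta property="og:type" content="website">
  <meta property="og:title" content="Example Infusion Pump">
  <meta property="og:description" content="A volumetric infusion pump for clinical use.">
  <meta property="og:url" content="https://www.example.com/products/infusion-pump">
  <meta property="og:image" content="https://www.example.com/img/infusion-pump.jpg">

  <!-- schema.org (JSON-LD): richer entity detail for search engines.
       MedicalDevice is one of the types under MedicalEntity. -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "MedicalDevice",
    "name": "Example Infusion Pump",
    "description": "A volumetric infusion pump for clinical use.",
    "url": "https://www.example.com/products/infusion-pump"
  }
  </script>
</head>
```

The two blocks describe the same entity but serve different consumers, which is the article's point about using both.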
-
I personally would use both. The way I look at the OG tags is that you are controlling the consistency of the brand across platforms that you do not officially support. In my mind, it's very much the same thing as making a page display correctly in older versions of IE.
-
OG and Schema can live in the wild together. They are both ways to expose information about the entities they describe.
IMDB is using both OG and Schema to mark up their data:
http://www.imdb.com/title/tt1392170/
-
Thanks, Craig. Do you know if any of the OG and schema tags would duplicate or conflict? I see a lot of documentation about using one or the other, but not how to use both harmoniously.
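To illustrate the kind of overlap I mean - a hypothetical page might state the same value twice, once per vocabulary, since OG lives in `meta property` attributes and schema.org microdata lives in `itemscope`/`itemprop` attributes:

```html
<!-- Hypothetical snippet: the same "title" expressed once per vocabulary -->
<head>
  <meta property="og:title" content="Acme Widget">
</head>
<body>
  <div itemscope itemtype="https://schema.org/Product">
    <h1 itemprop="name">Acme Widget</h1>
  </div>
</body>
```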
-
Thanks Keri, interesting example. While the GE Healthcare site is more commercial in intent, I like how they've treated the share functionality using the node icon. Subtle, yet shareable.
-
I haven't checked in depth; the regulations are with the FDA, and they aren't the most up to date with social media practices! No competitors are using OG yet, but their sites are also very under-optimized.
-
This may be way over the top, but have you checked whether OG tags would violate the regulations at all, or whether they could become a violation down the road? Granted, I haven't read the regulations and I don't think they should be a problem...but it's something I'd double-check. I could see a potential issue if the wording is ambiguous and a competitor wants to stir up trouble for you.
-
Given that other people may share those pages, I would incorporate both OG and Schema on the site.
-
Just because the site can't offer share buttons doesn't mean people aren't going to share it on FB. Just yesterday, I shared http://www3.gehealthcare.com/en/Products/Categories/Accessories_and_Supplies/Adventure_Series_for_CT/Pirate_Island on FB with my friends. I don't have formal experience in this area, but I did want to point that out. There was an article on slate.com about the design of these, and I went looking for more information and found that page.