Can you mark up a page using Schema.org and Facebook Open Graph?
-
Is it possible to use both Schema.org and Facebook Open Graph for structured data markup?
On the Google Webmaster Central blog, they say, "you should avoid mixing the formats together on the same web page, as this can confuse our parsers."
Source - http://googlewebmastercentral.blogspot.com/2011/06/introducing-schemaorg-search-engines.html
-
Here's a good place to start: https://developers.facebook.com/docs/concepts/opengraph/.
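For anyone landing here, the markup from that doc boils down to a handful of meta tags in the page head. A minimal sketch (the URLs and text are placeholders, not a real page):

    <head>
      <title>Dark Shadows review</title>
      <!-- Open Graph tags describe the page to Facebook when it is shared -->
      <meta property="og:type" content="article"/>
      <meta property="og:title" content="Dark Shadows review"/>
      <meta property="og:url" content="http://www.example.com/dark-shadows-review"/>
      <meta property="og:image" content="http://www.example.com/images/dark-shadows.jpg"/>
      <meta property="og:description" content="A short summary shown in the Facebook share box."/>
    </head>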
-
Can anyone direct me to a source on how to apply Facebook Open Graph markup?
Thanks!
-
That's helpful. Thank you, Dan.
-
I was under the impression that Open Graph data is completely separate from structured data, at least in the sense Google is talking about it.
There are quite a few examples of websites using both without any issue, for example:
http://www.telegraph.co.uk/culture/film/filmreviews/9257722/Dark-Shadows-review.html
This review features extensive Open Graph data alongside plenty of other markup, and it does not affect the rich snippet that appears when you search for the keywords "dark shadows review".
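To illustrate why the two can coexist: Open Graph sits in the head as meta tags while Schema.org rich snippet markup wraps the visible content, so the two parsers read different parts of the page. A simplified sketch of the pattern (not the Telegraph's actual markup):

    <html>
      <head>
        <!-- Open Graph: consumed by Facebook's scraper -->
        <meta property="og:type" content="article"/>
        <meta property="og:title" content="Dark Shadows review"/>
      </head>
      <body>
        <!-- Schema.org microdata: consumed by Google for rich snippets -->
        <div itemscope itemtype="http://schema.org/Review">
          <h1 itemprop="name">Dark Shadows review</h1>
          <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
            Rating: <span itemprop="ratingValue">3</span> out of
            <span itemprop="bestRating">5</span>
          </div>
        </div>
      </body>
    </html>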
-
I also just found this comment on a GWT support page (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1093493):
"You can use microformats, microdata, or RDFa to mark up your content. However, you should pick one markup standard and use it consistently across the page."
I'm concerned using both will affect our ability to get--and keep--rich snippets in Google.
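For what it's worth, that warning reads as being about mixing the three HTML-embedded syntaxes it names for the same content, rather than about Open Graph. As a sketch, here is the same review name expressed in two of those syntaxes; the advice is to pick one of these per page, not both:

    <!-- Microdata syntax -->
    <div itemscope itemtype="http://schema.org/Review">
      <span itemprop="name">Dark Shadows review</span>
    </div>

    <!-- RDFa syntax for the identical content; mixing the two risks confusing parsers -->
    <div vocab="http://schema.org/" typeof="Review">
      <span property="name">Dark Shadows review</span>
    </div>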
-
I too would like to know this, because I've been doing it for months now. I haven't noticed any ranking issues, mind you, so perhaps it isn't really a big deal?
Provided the data is the same, I can't see it hurting. I will be following this topic with much interest.
Related Questions
-
Schema.org product offer with a price range, or multiple offers with single prices?
I'm implementing Schema.org (JSON-LD) on an eCommerce site. Each product has a few different variations, and these variations can change the price (think T-shirts, but blue and white cost $5, red is $5.50, and yellow is $6). In each Product's Offer, I could either have a single Offer with a price range (minPrice: $5, maxPrice: $6), or I could add a separate Offer for each variation, each with its own correct price. Is one of these better than the other? Why? I've been looking at the WooCommerce code, and they seem to do the single offer with a price range, but that could be because it's more flexible for a system that's used by millions of people.
Technical SEO | 4RS_John
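A hedged sketch of the two shapes the question describes, in JSON-LD (product name and prices are made up). Schema.org's AggregateOffer type is the price-range form, while an array of Offer objects is the per-variation form:

    <!-- Option 1: a single offer covering the whole price range -->
    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example T-Shirt",
      "offers": {
        "@type": "AggregateOffer",
        "lowPrice": "5.00",
        "highPrice": "6.00",
        "priceCurrency": "USD",
        "offerCount": "3"
      }
    }
    </script>

    <!-- Option 2: one offer per variation, each with its exact price -->
    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example T-Shirt",
      "offers": [
        { "@type": "Offer", "name": "Blue", "price": "5.00", "priceCurrency": "USD" },
        { "@type": "Offer", "name": "Red", "price": "5.50", "priceCurrency": "USD" },
        { "@type": "Offer", "name": "Yellow", "price": "6.00", "priceCurrency": "USD" }
      ]
    }
    </script>
-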
Can you force Google to use meta description?
Is it possible to force Google to use only the meta description put in place for a page and not gather additional text from the page?
Technical SEO | A_Q
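There is no directive I know of that strictly forces Google to use your meta description, but as a hedged, era-appropriate sketch (the description text is a placeholder), writing one and adding noodp at least stops Google substituting the Open Directory description:

    <head>
      <!-- The description you want Google to show -->
      <meta name="description" content="Hand-written summary of this page."/>
      <!-- noodp: do not replace the snippet with the Open Directory Project description -->
      <meta name="robots" content="noodp"/>
    </head>
-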
Using the Google Remove URL Tool to remove https pages
I have found a way to get a list of 'some' of my 180,000+ garbage URLs now, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that, my robots.txt file, and the URL Parameters setting, I'm hoping to see some change each week.

I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL at the front. For example, I add to the removal tool: https://www.mydomain.com/blah.html?search_garbage_url_addition. On the confirmation page, the URL actually shows as: http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition. I don't want to accidentally remove my main URL or cause problems. Is this the right way this should look?

Part two of my question: if the search description in Google for a page you want removed already shows the following in the SERP results, should I still go to the trouble of putting in the removal request? www.domain.com/url.html?xsearch_... "A description for this result is not available because of this site's robots.txt – learn more."
Technical SEO | sparrowdog
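On the robots.txt side the question mentions, a hedged sketch (the parameter name is taken from the example above; adjust it to your real pattern). Googlebot honours the * wildcard, so one rule can cover all of the garbage URLs at once:

    User-agent: *
    # Block every URL carrying the garbage search parameter
    Disallow: /*?search_garbage_url_addition
-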
Page titles in browser not matching WP page title
I have an issue with a few page titles not matching the title I have in WordPress. I have two pages, blog & creative gallery, that show the homepage title, which is causing duplicate title errors. This has been going on for five weeks, so it's not a crawl issue. Any ideas what could cause this? To clarify, I have the page title set in WP, and I checked "Disable PSP title format on this page/post:"... but this page is still showing the homepage title. Is there an additional title setting for a page in WP?
Technical SEO | Branden_S
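As a hedged sketch (an assumption worth checking, not a diagnosis): the title shown in the browser comes from the theme's header.php, so it is worth confirming that the theme calls wp_title() rather than hard-coding a string, since a hard-coded title would show the same text on every page regardless of what is set in WP:

    <!-- header.php: let WordPress (and any SEO plugin) build the title -->
    <title><?php wp_title( '|', true, 'right' ); ?></title>
-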
Can too many pages hurt crawling and ranking?
Hi, I work for a local yellow pages site in Belgium. Over the last months we have introduced a successful technique to boost SEO traffic: we have created over 150k new pages, all targeting specific keywords and all containing unique content, with a site architecture that enables Google to find these pages through crawling, XML sitemaps, ... All signs (traffic, indexation of XML sitemaps, rankings, ...) are positive. So far so good. We are able to quickly build more unique pages, and I wonder how Google will react to this type of "large scale operation": can it hurt crawling and ranking if Google notices big volumes of (unique) content? Please advise.
Technical SEO | TruvoDirectories
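One mechanical note on the sitemaps mentioned above: the XML sitemap format caps each file at 50,000 URLs, so at 150k+ pages a sitemap index is the usual structure. A sketch with hypothetical file names:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>http://www.example.com/sitemap-pages-1.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-pages-2.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-pages-3.xml</loc></sitemap>
    </sitemapindex>
-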
Can you 301 redirect a page to an already existing/old page ?
If you delete a page (say a sub-department/category page on an ecommerce store), should you 301 redirect its URL to the nearest equivalent page still on the site, or just delete it and forget about it? Generally, should you try to 301 redirect any old pages you're deleting if you can find a suitable page with similar content to redirect to? Won't Google consider it weird if you say a page has moved permanently to such-and-such an address when that page/address existed before? I presume it's fine, since in the scenario of consolidating departments on your store, you want to redirect the department page you're deleting to the existing page/department you are consolidating the old products into?
Technical SEO | Dan-Lawrence
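For illustration, the redirect itself is a one-liner in an Apache .htaccess file (the paths here are hypothetical):

    # Send the retired department page to the department that absorbed it
    Redirect 301 /old-department/ http://www.example.com/existing-department/
-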
I can buy a domain from a competitor. What's the best way to make good use of its links for my existing website?
Technical SEO | Archers
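A hedged sketch, assuming the acquired domain can be pointed at your server (the domain names are placeholders): a domain-wide 301 in .htaccess passes the old site's links through to the matching paths on yours:

    RewriteEngine On
    # Redirect every URL on the acquired domain to the same path on the main site
    RewriteCond %{HTTP_HOST} ^(www\.)?competitor-domain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
-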
How Can I Block Archive Pages in Blogger when I am not using classic/default template
Hi, I am trying to block all the archive pages of my blog, as Google is indexing them. This could lead to a duplicate content issue. I am not using the default Blogger theme or the classic theme, and therefore I cannot use the code suggested for those templates. Please suggest how I can instruct Google not to index the archive pages of my blog. Looking for a quick response.
Technical SEO | SoftzSolutions
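As a hedged sketch (the standard Blogger Layouts-template conditional; verify it against your template version before relying on it), a conditional robots meta tag in the template's head marks only archive pages noindex:

    <b:if cond='data:blog.pageType == &quot;archive&quot;'>
      <!-- Only archive pages get noindex; posts and the homepage stay indexable -->
      <meta content='noindex,noarchive' name='robots'/>
    </b:if>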