Marking up reference data using Schema.org
-
Can anyone point me to a page showing how to mark up reference data according to schema.org? I.e. a glossary or dictionary page.
-
You're right, my mistake.
I'm not too sure that Schema.org has something for that type of data. Here's the full list of their supported item types: http://schema.org/docs/full.html
I'm not sure where reference data would fit in - maybe under Thing > Intangible?
-
Near as I can tell, reference material has not yet been addressed.
This means I have to extend the schema myself as outlined here: http://www.schema.org/docs/extension.html
Not 100% sure how to go about that in a way that will actually register properly. Good chance I will get it wrong.
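If I'm reading that extension page right, the old mechanism is to append your own more-specific name to an existing type URL after a slash. A rough microdata sketch for a single glossary entry - note that "GlossaryEntry" is my own invented extension name, not a registered type, so treat this as illustrative only:

```html
<!-- Hypothetical slash-extension of the core Intangible type.
     "GlossaryEntry" is made up for illustration per schema.org/docs/extension.html. -->
<dl itemscope itemtype="http://schema.org/Intangible/GlossaryEntry">
  <dt itemprop="name">Canonical URL</dt>
  <dd itemprop="description">The preferred URL for a page when several
      URLs return the same content.</dd>
</dl>
```

Tools that don't know the extension should, per the docs, fall back to treating it as plain Intangible.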
-
Thanks, but that's not really related to schema.org.
-
Take a look at this tutorial: http://www.w3schools.com/schema/
-
Here is a good place to start
-
Click on one of the schemas and they have examples.
-
Sorry, but that is literally no help. I've read the site a fair bit - a link to its sitemap doesn't get me any closer.
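Update for anyone landing on this thread later: schema.org has since added the DefinedTerm and DefinedTermSet types, which fit glossary and dictionary pages directly. A minimal JSON-LD sketch (the set name, terms, and definitions are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "DefinedTermSet",
  "name": "SEO Glossary",
  "hasDefinedTerm": [
    {
      "@type": "DefinedTerm",
      "name": "Canonical URL",
      "description": "The preferred URL for a page when several URLs return the same content."
    },
    {
      "@type": "DefinedTerm",
      "name": "Crawl budget",
      "description": "The number of URLs a search engine will crawl on a site in a given period."
    }
  ]
}
</script>
```

Each term can also link back to its set from its own page via the inDefinedTermSet property.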
-
Related Questions
-
Use Internal Search pages as Landing Pages?
Hi all! Just a general discussion question about internal search pages and using them for SEO. I've been looking at "noindex, follow" for them, but a lot of the search pages are actually driving significant traffic and revenue. I have over 9,000 search pages indexed that I was going to remove, but after reading this article (https://www.oncrawl.com/technical-seo/seo-internal-search-results/) I was wondering if any of you have had success using these pages for SEO, for example with auto-generated content. Or any success stories about using "noindex, follow" too. Thanks!
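For reference, the "noindex, follow" approach mentioned here is normally implemented as a robots meta tag in the page head - keep the page out of the index while still letting crawlers follow its links:

```html
<!-- Exclude this page from the index, but still follow the links on it. -->
<meta name="robots" content="noindex, follow">
```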
Technical SEO | Frankie-BTDublin
-
Issues with structured data on angular pages.
I am having an issue with structured data. I have added structured data to the Angular pages of my site, but when I run the test from the testing tool it doesn't detect it. When I copy and paste the rendered code (from inspect element), the tool does detect the structured data. In Webmaster Tools, though, those pages don't show up under structured data. I am unsure if my structured data is being picked up by Google. What should be done here? Should I serve pre-rendered pages to Google?
Technical SEO | Lybrate0606
-
Should Sitemaps be placed in the sub folder they reference?
I have a sitemap-index.xml file in the root. I then have several sitemaps linked to from the index in example.com/sitemaps/sitemap1.xml, example.com/sitemaps/sitemap2.xml, etc. I have seen on other sites that for example a sitemap containing blogs where the blogs are located at example.com/blog/blog1/ would be located at example.com/blog/sitemap.xml. Is it necessary to have the sitemap located in the same folder like this? I would like to have all sitemaps in a single sitemap folder for convenience but not if it will confuse search engines. My index count for URLs in some sitemaps has dropped dramatically in Google Webmaster Tools over the past month or so and I'm not sure if this is having an effect. If it matters, I have all sitemap files, including the index, listed in the robots.txt file.
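For concreteness, the layout described - an index in the root pointing at files in a /sitemaps/ folder - would look roughly like this (paths taken from the question, domain is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-index.xml, served from the site root -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemaps/sitemap1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemaps/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```

One thing worth verifying against the sitemaps.org protocol: a sitemap's own location can restrict which URL paths it may list, though Google documents that referencing the sitemap from robots.txt (or submitting it in Search Console) relaxes this.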
Technical SEO | Giovatto
-
Author & Video Markup on the Same Page
I just have a quick question about using schema.org markup. Is there any situation where you'd want to include both author & video markup on the same page?
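There's no conflict in principle - multiple item types can coexist on one page. A hedged JSON-LD sketch (all names, dates, and URLs are placeholders) of an article carrying both an author and an embedded video:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article with a video",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "video": {
    "@type": "VideoObject",
    "name": "Example clip",
    "description": "Placeholder description for the embedded video.",
    "thumbnailUrl": "http://example.com/thumb.jpg",
    "uploadDate": "2014-01-15"
  }
}
</script>
```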
Technical SEO | justinnerd
-
Opengraph or Schema.org - What is your preference?
Just wondering whether Mozzers have a preference when marking up rich snippets. I have just added Open Graph as per a Whiteboard Friday presentation suggesting that Google will use the metadata as well as schema, mainly for the purposes of socially sharing the website details, and I have also used Schema for marking up an address, telephone number, and email. Is this the right way to do things? What are your thoughts?
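The two serve different consumers - Open Graph is read by social platforms, schema.org by search engines - so using both on the same page is common. A rough sketch with placeholder business details:

```html
<!-- Open Graph tags for social sharing (in <head>) -->
<meta property="og:title" content="Example Co" />
<meta property="og:type" content="website" />
<meta property="og:url" content="http://example.com/" />

<!-- Schema.org microdata for contact details (in <body>) -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Co</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Example Street</span>,
    <span itemprop="addressLocality">Exampletown</span>
  </div>
  <span itemprop="telephone">+44 1234 567890</span>
  <a itemprop="email" href="mailto:info@example.com">info@example.com</a>
</div>
```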
Technical SEO | AspectExhibitions
-
Do i need to use proxy when i ping my backlinks?
I just created 50+ backlinks. I would like to know: when I ping them, do I need to use a proxy? Thank you so much.
Technical SEO | locoto0007
-
Domain taken. Which is better? Using hypens or longer domain.
I am wanting to set up an e commerce site and the domain name that I want is taken. I am considering using a domain that has the main keyword I want to rank for as the domain. I have heard chatter of google penalizing these types of sites and it seems that it hasn't come about. This is something that I would like to test out. So if "electricscooters.com" is taken, should I use "electric-scooters.com" or "electricscooters4less.com" Just wondering if the hyphenated or the longer domain will rank higher. The site won't be spammy at all, I will carry a few different companies that offer similar products. So for this case, I would only sell scooters from a few different manufacturers. Feedback would be appreciated!
Technical SEO | Dave_Whitty
-
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU), but what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number ?v=1.1... 1.2... 1.3... etc., and the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether? Isn't that what robots.txt was made for? Just to be clear - we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt, or run the JavaScript gauntlet?
Technical SEO | AndreVanKets
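For concreteness, blocking the folder is a two-line robots.txt rule:

```txt
# Block all crawlers from the /js/ folder
User-agent: *
Disallow: /js/
```

Worth noting before doing this, though: Google's later webmaster guidelines explicitly advise against blocking CSS and JS, because Googlebot renders pages and needs those resources to understand them.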