Event Schema markup for multiple events (same location/address)?
-
I was wondering if it's possible to mark up multiple events on the same page for one location/address using the schema.org Event markup? I tried doing it on a sample page below:
http://www.rama.id.au/event-schema-test/
Google's schema testing tool shows that it's all good (except for a warning on offers). I just wanted to know whether I'm doing it correctly or whether there's a better solution. Any help would be much appreciated.
Thank you
-
Webmaster Tools / Search Console: https://www.google.com/webmasters/tools/home?hl=en
Structured Data Testing Tool: https://search.google.com/structured-data/testing-tool/u/0/
You can use this as a template, or use this great tool: https://jsonld.com/json-ld-generator/
(Remember that with JSON-LD, the content you mark up must also be present in the page's HTML.)
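For illustration, a minimal JSON-LD sketch (every name, date, and address below is a placeholder) in which each Event is its own object and all of them reference a single shared Place via @id:

<!-- placeholder values throughout; adapt to the real venue and events -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Place",
      "@id": "#venue",
      "name": "Example Hall",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Sydney",
        "addressRegion": "NSW",
        "postalCode": "2000",
        "addressCountry": "AU"
      }
    },
    {
      "@type": "Event",
      "name": "Morning Workshop",
      "startDate": "2015-03-01T09:00",
      "location": { "@id": "#venue" }
    },
    {
      "@type": "Event",
      "name": "Evening Concert",
      "startDate": "2015-03-01T19:00",
      "location": { "@id": "#venue" }
    }
  ]
}
</script>

Whether the testing tool resolves the @id reference is worth verifying; repeating the location object inline inside each Event is the more conservative option.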
-
Hi Vincent
Again, it may be a matter of creating individual pages for each event with proper Schema markup, or you can use meta tags in Schema as well. Each event would be wrapped in its own Event tag, with the address information included in a meta tag. Both approaches are rather tedious. You can read more here:
https://schema.org/docs/gs.html#schemaorg_testing
Ideally though, you'd have an individual page for each event.
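A rough sketch of that meta-tag approach, with placeholder venue and event details; note the HTML itemref attribute, mentioned here as one possible way to let several items share one address block by id rather than repeating it:

<!-- one visible address block, shared by both events via itemref; all details are placeholders -->
<div id="venue" itemprop="location" itemscope itemtype="http://schema.org/Place">
  <span itemprop="name">Example Hall</span>,
  <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Example Street</span>,
    <span itemprop="addressLocality">Sydney</span>
  </span>
</div>

<div itemscope itemtype="http://schema.org/Event" itemref="venue">
  <span itemprop="name">Morning Workshop</span>
  <meta itemprop="startDate" content="2015-03-01T09:00">
</div>

<div itemscope itemtype="http://schema.org/Event" itemref="venue">
  <span itemprop="name">Evening Concert</span>
  <meta itemprop="startDate" content="2015-03-01T19:00">
</div>

Worth running through Google's testing tool before relying on it, since parser support for itemref varies.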
You could follow Ticketmaster's path and use data-vocabulary.org markup; however, Schema is the standard. If you're wondering what I mean, run the following URL through the Google Structured Data Testing Tool I linked to in my previous comment:
http://www.ticketmaster.com/Chicago-Bulls-tickets/artist/805914
Sorry for not posting links - I am on my phone and I cannot. Will update in the AM.
Hope these help! Good luck!
-
Hello Oleg and Patrick
Thank you so much, gentlemen, for helping me out. Unfortunately, I cannot wrap each event in its own itemscope itemtype="http://schema.org/Event", as each event would then require me to specify the same address multiple times, the "location" attribute being a required field. Since the address occurs only once on the page, I am bound to use it only once, tying that same address to multiple events. On a side note, how come the Google schema testing tool is able to pass my implementation on the sample URL?
Hope to hear from you soon.
Thanks once again.
-
I agree with Oleg here - each event should have its own page.
That being said, it is possible to mark up individual events on the same page, because each event has its own unique attributes. Each event will be wrapped in its own itemscope itemtype="http://schema.org/Event" - so be mindful of that.
You can read more here.
Keep in mind Google and Yandex have structured data markup testing tools.
Hope this helps! Good luck!
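For illustration, a minimal sketch of that structure (names, dates, and the address are placeholders), with each event wrapped in its own itemscope and the required location nested inside it:

<div itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">Morning Workshop</span>
  <meta itemprop="startDate" content="2015-03-01T09:00">
  <div itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Example Hall</span>
    <span itemprop="address">1 Example Street, Sydney</span>
  </div>
</div>
<!-- each additional event repeats the same pattern in its own itemscope -->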
-
You should create a separate Event for each event you have (even if the location is the same).
From https://schema.org/Event: "Repeated events may be structured as separate Event objects."
Related Questions
-
Size of image for article Schema
Hi, I implemented schema markup for an article and it all tested fine; I can see it being fired in preview mode of Google Tag Manager. But when I run the URL it's applied to through the Google Structured Data Testing Tool, it is not appearing. I have now read that the image needs to be a certain size. For AMP articles this appears to be 1200 pixels wide: http://www.thesempost.com/google-changes-image-size-requirements-amp-articles/ But what about non-AMP articles? Does the image need to be that big too?
Technical SEO | AL123al
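For reference, a hedged sketch of Article markup relevant to the question above (headline, URL, date, and dimensions are placeholders); the 1200-pixel width is the figure cited in the linked article:

<!-- placeholder values; the 1200px width follows the AMP guideline cited above -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "datePublished": "2017-01-05",
  "image": {
    "@type": "ImageObject",
    "url": "https://example.com/images/article-lead.jpg",
    "width": 1200,
    "height": 800
  }
}
</script>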
Can Google Read schema.org markup within Ajax?
Hi All, as a local business directory, we also display opening hours on business listing pages, e.g. http://www.goudengids.be/napoli-kontich-2550/ At the same time, I also have schema.org openingHours markup implemented. But, for technical reasons (performance), the opening hours (and the markup alongside them) are displayed using AJAX. I'm wondering if Google is able to read the markup. The rich snippet tool and markup plugins like Semantic Inspector can't "see" the markup for the opening hours. Any advice here?
Technical SEO | TruvoDirectories
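A commonly suggested workaround for the question above, sketched here with placeholder business details rather than as a statement of what Google will or won't execute: emit the markup server-side as JSON-LD in the initial HTML, so it does not depend on the AJAX call at all:

<!-- placeholder business and hours; rendered in the initial HTML, not injected via AJAX -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "openingHoursSpecification": [
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
      "opens": "09:00",
      "closes": "17:00"
    }
  ]
}
</script>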
Vanity / Short URLs 301?
Hi everyone, I'm working on a website that uses a lot of short URLs, e.g. http://www.forest.com/oaktrees. A quick check reveals these currently return a 302 status. My question is: should these be made 301s? A lot of them are used in off-page content and, looking at GA, they attract a lot of clicks. I've not managed to find a definitive answer after several Google searches. All help and advice greatly appreciated. Bw Jon
Technical SEO | CoL-PR
Location Based Content / Googlebot
Our website has local content specialized to specific cities and states. The URL structure of this content is as follows: www.root.com/seattle, www.root.com/washington. When a user comes to a page, we auto-detect their IP and send them directly to the relevant location-based page, much the way Yelp does. Unfortunately, what appears to be occurring is that Google comes in to our site from one of its data centers, such as San Jose, and is being routed to the San Jose page. When users search for relevant keywords, the SERPs send them to whichever location pages the bots appear to have come in from. If we turn off the auto-geo, we think Google might crawl our site better, but users would then be shown less relevant content on landing. What's the win/win situation here? Also, we appear to have some odd location/destination pages ranking high in the SERPs - in other words, locations that don't appear to correspond to one of Google's data centers. No idea why this might be happening. Suggestions?
Technical SEO | Allstar
How does Google find /feed/ at the end of all pages on my site?
Hi! In Google Webmaster Tools I find *.../feed/ as a 404 page in crawl errors. The problem is that none of these pages exist and they have no inbound links (except the start page). FYI, it's a WordPress site. Example: www.mysite.com/subpage1/feed/ www.mysite.com/subpage2/feed/ www.mysite.com/subpage3/feed/ etc. Does Google search for /feed/ by default, or why do I keep getting these 404s every day?
Technical SEO | Vivamedia
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU). But what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number: ?v=1.1... 1.2... 1.3... etc. And the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether? Isn't that what robots.txt was made for? Just to be clear: we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt, or run the JavaScript gauntlet?
Technical SEO | AndreVanKets
How should I structure a site with multiple addresses to optimize for local search??
Here's the setup: we have a website, www.laptopmd.com, and we're ranking quite well in our geographic target area. The site is chock-full of local keywords, has the address properly marked up, is HTML5 and schema.org compliant, near the top of the page, etc. It's all working quite well, but we're looking to expand to two more locations, and we're terrified that adding more addresses and playing with our current setup will wreak havoc with our local search results, which we quite frankly currently rock. My questions: 1) When it comes time to do sub-pages for the new locations, should we strip the location information from the main site and put up local pages for each location in subfolders? 1a) Should we use subdomains instead of subfolders to keep Google from becoming confused? 2) Should we consider simply starting identically branded pages for the individual locations and hope that exact-match location-based URLs will make up for the duplicate-content hit and overcome the difficulty of building a brand from multiple pages? I've tried to look for examples of businesses that have tried to do what we're doing, but all the advice has been about organic search, which I already have the answer to. I haven't been able to find a good example of a small business with multiple locations AND good rankings for each location. Should this serve as a warning to me?
Technical SEO | LMDNYC
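Whichever structure is chosen for the question above, each location page would typically carry its own LocalBusiness markup with that location's address. A minimal sketch, with all names and details below as placeholders:

<!-- all details are placeholders; one block per location page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Repair Shop (Second Location)",
  "telephone": "+1-212-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Ave",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10001"
  }
}
</script>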
The effect of same IP addresses on SERPs
Hi All, just wondering if anyone could shed some light on the following. If I was ranking number 1 for a term, what would the effects be of creating another site, hosted on the same server/IP, with the same WHOIS info and the same URL but a different TLD, and trying to get this to rank for the term also? Does Google restrict search results to one IP per page, or is this perfectly possible? (The term is fairly uncompetitive.) Thanks, Ben
Technical SEO | Audiohype