Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
A few questions on Google's Structured Data Markup Helper...
-
I'm trying to go through my site and add microdata with the help of Google's Structured Data Markup Helper. I have a few questions that I have not been able to find an answer for. Here is the URL I am referring to: http://www.howlatthemoon.com/locations/location-chicago
- My company is a bar/club, with only 4 out of 13 locations serving food. Would you mark this up as a local business or a restaurant?
- It asks for "URL" above the ratings. Is this supposed to be the URL that ratings are on like Yelp or something? Or is it the URL for the page? Either way, neither of those URLs are on the page so I can't select them. If it is for Yelp should I link to it?
- How do I add reviews? Do they have to be on the page?
- If I make a group of days for Day of the Week for Opening hours, such as Mon-Thu, will that work out?
- I have events on this page. However, when I tried to do the markup for just the event it told me to use itemscope itemtype="http://schema.org/Event" on the body tag of the page. That is just a small part of the page, I'm not sure why I would put the event tag on the whole body?
Any other tips would be much appreciated. Thanks!
-
Howl,
1. I would mark them all up as a FoodEstablishment. For the locations that serve no food, I would simply put an alcohol menu (or a link to one) under the menu property, or under cuisine note that you serve beer, wine, tequila, etc. The reason is this: you are using schema because it is good for you and for the search engines, and within the broad types are more specific subtypes. Under FoodEstablishment there is BarOrPub, and a bar or pub is a food establishment, just like an ice cream shop, winery, or brewery.
2. This is the page URL itself. For me in Houston it would be http://www.howlatthemoon.com/locations/location-houston. A quick sketch covering both of these points follows below.
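As a rough illustration only (the link text and cuisine description here are placeholders, not your actual content), the markup for a no-kitchen location might look like this:

```html
<div itemscope itemtype="http://schema.org/BarOrPub">
  <!-- BarOrPub is a subtype of FoodEstablishment, so it inherits its properties -->
  <span itemprop="name">Howl at the Moon Chicago</span>
  <a itemprop="url" href="http://www.howlatthemoon.com/locations/location-chicago">Chicago location</a>
  <!-- For locations without a kitchen, describe the drink offering here -->
  <span itemprop="servesCuisine">Beer, wine, cocktails</span>
</div>
```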
3. I suggest you put in your own review schema and make your site a place where customers can review you. Why would you want to serve Yelp? What do they do for you? Your site is running Joomla, and with a quick check I found several review plugins you could use to make life simpler. We do not use Joomla as much, but we often use a review plugin with our WP sites, and it passes Google's markup test. A sketch of what on-page review markup looks like is below.
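As a minimal sketch (the reviewer, rating, and review text are all made up for illustration):

```html
<div itemscope itemtype="http://schema.org/BarOrPub">
  <span itemprop="name">Howl at the Moon Chicago</span>
  <!-- Each customer review nests inside the business item -->
  <div itemprop="review" itemscope itemtype="http://schema.org/Review">
    <span itemprop="author">Jane D.</span> rated it
    <span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
      <span itemprop="ratingValue">5</span> out of <span itemprop="bestRating">5</span>
    </span>
    <p itemprop="reviewBody">Great dueling piano show and friendly staff.</p>
  </div>
</div>
```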
4. Opening hours: the schema for opening hours is pretty easy. If you look at the openingHours property near the bottom of the LocalBusiness type on Schema.org, you will see what will and will not work; a sketch follows below.
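Day ranges use two-letter codes, so Mon-Thu becomes Mo-Th. A minimal sketch (the hours themselves are invented):

```html
<div itemscope itemtype="http://schema.org/BarOrPub">
  <span itemprop="name">Howl at the Moon Chicago</span>
  <!-- The machine-readable hours live in the datetime attribute -->
  <time itemprop="openingHours" datetime="Mo-Th 19:00-02:00">Mon-Thu: 7pm to 2am</time>
  <time itemprop="openingHours" datetime="Fr-Sa 17:00-03:00">Fri-Sat: 5pm to 3am</time>
</div>
```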
5. Here is the actual event markup example from Schema.org. Note that itemscope and itemtype sit on the element that wraps just the event (a div in this example), not on the page's body tag.
-
Upcoming shows:

```html
<div itemprop="event" itemscope itemtype="http://schema.org/Event">
  <a href="foo-fighters-may20-fedexforum" itemprop="url">
    <span itemprop="name">FedExForum</span>
  </a>
  <span itemprop="location">Memphis, TN, US</span>
  <meta itemprop="startDate" content="2011-05-20">May 20
  <a href="ticketmaster.com/foofighters/may20-2011" itemprop="offers">Buy tickets</a>
</div>

<div itemprop="event" itemscope itemtype="http://schema.org/Event">
  <a href="foo-fighters-may23-midamericacenter" itemprop="url">
    <span itemprop="name">Mid America Center</span>
  </a>
  <span itemprop="location">Council Bluffs, IA, US</span>
  <meta itemprop="startDate" content="2011-05-23">May 23
  <a href="ticketmaster.com/foofighters/may23-2011" itemprop="offers">Buy tickets</a>
</div>
```
So, you are going down the right path. I have never been to Howl at the Moon in Houston, but I will check it out.
Best,
Robert
-
You can do either, but the net effect is different. If you set things up for reviews on your own page, those reviews get indexed as content on your page, and if marked up properly they can surface in search much like the major review sites do (excluding Yelp, I believe). You also have the advantage of control over what is or is not published. The downside is that you will need someone with the skills to program this and do the markup properly, plus someone managing this part of the website.
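If you go the on-page route, you can also roll the individual reviews up into an aggregate rating, which is what typically drives star snippets. A minimal sketch (the numbers below are made up):

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Howl at the Moon Chicago</span>
  <!-- Summarizes all on-page reviews in one machine-readable rating -->
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> stars from
    <span itemprop="reviewCount">87</span> reviews
  </div>
</div>
```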
The other way is to use an embedded link to the review sites, which keeps the reviews off of your site. Those off-site reviews can create inbound links. For Yelp this is pretty much the only way to do it.
I would also factor in which review sites people actually use for bars in Chicago (RedEye, Yelp, or Citysearch, for example) and set up the site to favor wherever people look when deciding whether or not to visit your business.
Hope this is clearer than mud.
Ron
-
Thanks for the response. So to clarify: to use the reviews, I must have the actual reviews on the page? They cannot simply be linked to without putting them on the actual page?
-
OK, let's go through these questions one by one:
My company is a bar/club, with only 4 out of 13 locations serving food. Would you mark this up as a local business or a restaurant?
I would mark up the locations that serve food as Restaurant and the ones that do not as LocalBusiness, and I would write a unique description for each location.
It asks for "URL" above the ratings. Is this supposed to be the URL that ratings are on like Yelp or something? Or is it the URL for the page? Either way, neither of those URLs are on the page so I can't select them. If it is for Yelp should I link to it?
You can either link to your off-site review sites by getting an embed code, or you can write some code with structured markup to embed reviews into your website so that they get indexed by the search engines (see the markup sketches in the answers above). There are references online about how to do this.
How do I add reviews? Do they have to be on the page?
Look at the answer above. As a side note, you should make it a mission to get 10 reviews each on Yelp, Google, and Bing for every location, as this helps your local results.
If I make a group of days for Day of the Week for Opening hours, such as Mon-Thu, will that work out?
Yes. The openingHours property accepts day ranges written with two-letter codes, such as Mo-Th (see the opening hours sketch earlier in this thread).
I have events on this page. However, when I tried to do the markup for just the event it told me to use itemscope itemtype="http://schema.org/Event" on the body tag of the page. That is just a small part of the page, I'm not sure why I would put the event tag on the whole body?
Can't help you on this one.