Why does Google recommend schema for local businesses/organizations?
-
The reason I ask is that I was running some businesses and organizations through the Structured Data Testing Tool, and every time it said that the "information will not appear as a rich snippet in search results, because it seems to describe an organization. Google does not currently display organization information in rich snippets."
Additionally, many times when you search for the restaurant or a related query, it will still show the telephone number, reviews, and location. Would it be better to list it as a place, since I want its reviews and location to show up?
I would be interested to hear what everyone else's opinions are on this. Thanks!
-
Okay thank you so much Miriam!
-
Hi Peter,
Thanks so much for the live example; I totally get what you mean now. Okay, so the example you are showing is from TripAdvisor, and yes, Google consistently displays stars, review counts, etc. for TripAdvisor-based results, and for other large sites like Yelp. I presume (but am not certain) that these are rich snippets. Here are two articles from Google on this subject:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=146645
http://maps.google.com/help/maps/richsnippetslocal/
In mid-2012, Google stopped showing stars on their own results after switching to Zagat as their provider. See:
http://blumenthals.com/blog/2012/06/12/google-we-can-show-stars-if-we-want-to/
The most public test case of rich snippets appearing for a small local business was Mike Blumenthal's writeup of getting stars and other data to appear for a jeweler client of his. However, this data then disappeared, only to reappear sporadically some months later. Read this:
http://blumenthals.com/blog/2012/08/09/are-rich-snippet-reviews-making-a-limited-comeback-in-local/
So, sometimes Google will still display this type of data for small businesses alongside their organic results, but it is sporadic. Looking at that same client of Mike's today, I don't see any stars, but who knows, they could come back again tomorrow. Mike's opinion is that it still makes sense to mark up pages, and his advice is trustworthy.
And, of course, you do have the option of listing your local business on sites that consistently do show stars (like Yelp).
Hope this helps, and thanks again for the screenshot!
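For anyone who wants to experiment with this, eligibility starts with valid markup on the page. Here is a minimal schema.org microdata sketch of a local business with an aggregate rating; the business name and all numbers are made up for illustration:

```html
<!-- Minimal sketch; the name, rating, and counts below are hypothetical -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Jewelers</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/<span itemprop="bestRating">5</span>
    based on <span itemprop="reviewCount">27</span> reviews
  </div>
</div>
```

Keep in mind that, as discussed above, valid markup only makes the page eligible; Google decides whether to actually display the stars.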
-
http://cdnext.seomoz.org/1347566301_84f5cacc41479945c65eea948eb9a2d8.jpg
Here is the link; it's the first result.
-
I want something like this: basically, if someone searches for "car dealer," this would appear after the local search results. Is this possible for a local business, after the local search results? Or is it only possible in the local search results, not in the purely organic results? I hope that clears things up. Thanks!
Sorry for the text formatting, it won't let me change it.
-
Hi Peter,
Can you find a live example of a business that is achieving what you're hoping to achieve and share it with me? I want to be sure I understand, and without seeing an example, I'm not clear on how best to advise you. Thanks!
-
Hi Miriam, sorry for the confusion. So, do you not think it would be beneficial to have schema for the properties of a place, since it would allow me to show the business's review rating/aggregate rating in the search results? Basically, I want it so that when someone searches for something relating to a car dealer, if Google chooses to show my URL, it will be accompanied by the review rating/aggregate rating. I feel it would help increase the CTR. I hope that makes sense.
-
Hi Peter, I'm not sure I understand your question. You write: "Many of times when you do search the restaurant or a related query it will still show telephone number and reviews and location." Do you mean that when you search for the restaurant, you are seeing the local pack of results (meaning accompanied by the grey pin and link to the Google+ Local page)? If so, that has nothing to do with schema. The local listings stem from Google Places/Google+ Local, not from whatever schema you've embedded in a website. I use schema for my local business clients and understand its point to be to strengthen the geographic signals the website is sending to Google regarding the client's NAP (name, address, phone number). The goal is not to have that info show up in the SERPs (unlike rel=author or something like that). Does this answer your question? If not, please provide further detail.
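To illustrate that NAP-strengthening use of schema, here is a hedged microdata sketch; the business details below are invented placeholders, not any client's actual markup:

```html
<!-- Hypothetical NAP markup; all details are placeholders -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Car Dealer</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span> <span itemprop="postalCode">62701</span>
  </div>
  <span itemprop="telephone">(555) 555-0100</span>
</div>
```

This reinforces the NAP signals on the page; it does not, by itself, put anything extra in the SERPs.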
Related Questions
-
How to stop google bot from crawling spammy injected pages by hacker?
Hello, please help me. One of our websites is under attack by a hacker once again. They have injected spammy URLs and Google is indexing them, but we cannot find these pages on our website; they are all 404 pages. Our website is not secured (no HTTPS) and uses the WordPress CMS. Thanks
White Hat / Black Hat SEO | ShahzadAhmed
-
Mobile Redirect - Cloaking/Sneaky?
A question, since Google is somewhat vague about what they consider "equivalent" mobile content. This is the hand we're dealt due to budget: no m.dot, and responsive/dynamic is on the roadmap but still a couple of quarters away, so for now, here's the situation. We have two sets of content and experiences, one for desktop and one for mobile. The problem is that the desktop content does not equal the mobile content. The layout, user experience, images, and copy aren't the same across both versions; they are not dramatically different, but they're not identical either. In many cases, no mobile equivalent exists. Dev wants to redirect visitors who find the desktop version in mobile search to the equivalent mobile experience when it exists; when it doesn't, they want to redirect to the mobile homepage, which really isn't a homepage but an unfiltered view of the content. Yes, we have pushState in place for the mobile version, etc. My concern is that Google will look at this as cloaking, maybe not in the cases where there's a near-equivalent piece of content, but definitely when we're redirecting to the "homepage". Not to mention this isn't a great user experience and will impact conversion/engagement metrics, which are likely factors Google's algorithm considers. What does the Moz community say about this? Cloaking or not, and why? Thanks!
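For what it's worth, when a mobile equivalent does exist, Google's documented pattern for separate mobile URLs is to annotate the pair bidirectionally rather than rely on user-agent redirects alone. A sketch, with placeholder URLs:

```html
<!-- On the desktop page (URL is a placeholder): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the corresponding mobile page: -->
<link rel="canonical" href="http://www.example.com/page">
```

With these annotations in place, a redirect from a desktop URL to its declared mobile equivalent is far less likely to be read as sneaky; as the question suggests, it's the blanket redirect to the "homepage" that carries the real risk.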
White Hat / Black Hat SEO | Jose_R
-
Trying to escape from Google algorithm ranking drop
In 2010 our website was ranking number 1 for many keywords, but a few years ago we suddenly saw a crash. We have since identified that we were hit by many shades of the Panda and Penguin updates, mainly due to low-quality backlinks and poor content (some duplicates). Since then we have done a major overhaul of our backlink profile and recovered rankings that had fallen from number 1 for many keywords down to 60-70; we are now placed at around 11 to 18. We have also addressed our duplicate content issues: we removed all duplicate content, introduced a blog for fresh bi-daily updates in an attempt to gain traffic, and amalgamated many small low-quality pages into larger, higher-quality content pages. We are now mobile friendly with a dynamic site, our site speed is good (around 80), we have switched to HTTPS, and we have upgraded our website for better conversions. We have looked at the technical side and don't have many major issues, although 404s do come up in Google Webmaster Tools for old pages we removed due to duplicate content. We are link building at a pace of around 40 mentions a month; some are nofollow, some dofollow, and some are unlinked mentions. We are diversifying anchors to include branding in addition to target keywords. We have pretty much exhausted every avenue we can think of, but we cannot jump over to page 1 for any significant keyword we are targeting. Our competitors' websites are not that powerful, and their metrics are similar to ours, if not lower.
1. Please can you advise anything else we should look at?
2. We are even considering moving to a new domain and 301'ing all pages to it in an attempt to shake off the algorithmic filter (penalties). Has anyone done this? How long can we expect it to take to reach at least the same rankings on the new domain if we 301 all URLs to it? Do you think it's worth it? We know the risk of doing this, which is why we wanted to seek advice.
3. On the other hand, we have considered that having disavowed so many links (70%) could itself be the cause of the page-two problem; however, we are link building to Moz and Majestic metric standards with no benefit. Do you think we should increase link building?
Advice is appreciated!
White Hat / Black Hat SEO | Direct_Ram
-
How the heck is this guy ranking on top of Google for everything?
Hey everyone, how is the website below ranking so high for everything? His link profile is spam junk; he uses forums and hides backlinks in smilies and quotes. Plus, the guy even seems to be hitting all the competitors' websites with bad backlinks, etc. It seems he is just using automated tools to build tons of backlinks. Why isn't Google picking this site up and doing something about it? Search Google for "advanced warfare hacks" and he shows up on top. Same for "titanfall hacks" and "ghosts hacks". Check his link profile and sneaky ways; his main site is hackerbot [dot] net.
White Hat / Black Hat SEO | Draden67
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that (1) I want a centrally managed solution for all sites (per-site administration takes too much time), which (2) takes into account total server load instead of only one site's traffic and (3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS has a way to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve (2) and (3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
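As a sketch of the kind of runtime rule described above (all names and thresholds here are invented; this is not a drop-in solution): decide per request, from the requesting user agent and the current server load, whether to serve a 503 with a Retry-After header or answer normally.

```python
# Hedged sketch of load-based bot throttling; bot list, threshold,
# and Retry-After value are all invented examples.

BOT_SIGNATURES = ("googlebot", "bingbot", "ahrefsbot")

def is_bot(user_agent):
    """Crude user-agent check; real detection should also verify reverse DNS."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def throttle_response(user_agent, server_load, max_bot_load=0.7):
    """Return (status, headers): 503 plus Retry-After for bots while the
    server is busy, 200 otherwise. Human traffic is never throttled here."""
    if is_bot(user_agent) and server_load > max_bot_load:
        return 503, {"Retry-After": "120"}  # ask the bot to come back later
    return 200, {}
```

A real implementation would measure `server_load` from something like the 1-minute load average and could vary the portion throttled per bot. Note that Googlebot treats a 503 as temporary, but serving 503s for extended periods can slow or suppress indexing, which speaks directly to the SEO question above.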
White Hat / Black Hat SEO | internetwerkNU
-
Google Local Listing Verification - Is there a way to skip this?
Hi, we are running two types of service in our company: (1) dry cleaning and (2) laundry services. The problem is we have two websites but only one office address. It is not recommended to put the same address on both websites, with both doing laundry and dry cleaning services. Is there any tip on how we can get listed on Google Places without using the same address for both websites?
White Hat / Black Hat SEO | chanel27
-
Google Penguin for non-English queries?
Does anybody know if non-English queries were also 'hit' by the Google Penguin update? All the Penguin horror stories out there are about sites focusing on English queries, and in some (Dutch) industries I'm monitoring, sites with spammy backlink profiles are still ranking.
White Hat / Black Hat SEO | RBenedict
-
Publishing Press Releases after Google Panda 2.5
For the past few years I have been publishing press releases on my site for a number of businesses. I have high traffic on my site. I noticed that with the Google Panda 2.5 update, PRNewswire.com dropped in visibility by 83%. Should I stay away from publishing press releases now? Does Google consider press releases to be "content scraping," since multiple sources publish the same release?
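One commonly suggested mitigation for syndicated duplicates is a cross-domain rel=canonical on each republished copy pointing at the original release. A hedged sketch (the URL is a placeholder, and it only helps if the syndicating site will actually add it):

```html
<!-- On the syndicated copy of the press release: -->
<link rel="canonical" href="http://www.example.com/original-press-release">
```

Google supports cross-domain canonicals, so this at least signals which copy should be treated as the original rather than as scraped content.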
White Hat / Black Hat SEO | BeTheBoss