Why does Google recommend schema for local businesses/organizations?
-
Why does Google recommend schema for local businesses/organizations? The reason I ask is that I was in the Structured Data Testing Tool running some businesses and organizations through it, and every time it says, "information will not appear as a rich snippet in search results, because it seems to describe an organization. Google does not currently display organization information in rich snippets."
Additionally, many times when you search for the restaurant or a related query, the results will still show a telephone number, reviews, and location. Would it be better to mark the business up as a Place, since I want its reviews and location to show up?
I would be interested to hear everyone else's opinions on this. Thanks.
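For reference, here is roughly the kind of markup I've been comparing (simplified sketches with placeholder business details, not my actual pages): the generic Organization type versus a more specific Place/LocalBusiness type like Restaurant.
<!-- Generic Organization markup: the kind of thing that gets the "will not appear as a rich snippet" notice above -->
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Example Restaurant</span>
  Phone: <span itemprop="telephone">(555) 555-0123</span>
</div>
<!-- The more specific Place/LocalBusiness type I'm wondering whether to switch to -->
<div itemscope itemtype="http://schema.org/Restaurant">
  <span itemprop="name">Example Restaurant</span>
  Phone: <span itemprop="telephone">(555) 555-0123</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
  </div>
</div>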
-
Okay, thank you so much, Miriam!
-
Hi Peter,
Thanks so much for the live example. I totally get what you mean now. Okay, so the example you're showing is from TripAdvisor, and yes, Google consistently displays stars, review counts, etc., for TripAdvisor-based results, and for other large sites like Yelp. I presume (but am not certain) that these are rich snippets. Here are two articles from Google on this subject:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=146645
http://maps.google.com/help/maps/richsnippetslocal/
In mid-2012, Google stopped showing stars on their own results after switching to Zagat as their provider. See:
http://blumenthals.com/blog/2012/06/12/google-we-can-show-stars-if-we-want-to/
The most public test case of rich snippets appearing for a small local business was Mike Blumenthal's writeup of getting stars and other data to appear for a jeweler client of his. However, this data then disappeared, only to reappear sporadically some months later. Read this:
http://blumenthals.com/blog/2012/08/09/are-rich-snippet-reviews-making-a-limited-comeback-in-local/
So, sometimes Google will still display this type of data for small businesses alongside their organic results, but it is sporadic. Looking at that same client of Mike's today, I don't see any stars, but who knows, they could come back again tomorrow. Mike's opinion is that it still makes sense to mark up pages, and his advice is trustworthy.
And, of course, you do have the option of listing your local business on sites that consistently do show stars (like Yelp).
Hope this helps, and thanks again for the screenshot!
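If you do go ahead and mark up reviews on your own site, here's a rough sketch of the sort of schema.org microdata involved (the business name and the numbers are placeholders; check schema.org for the full list of properties):
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Jeweler</span>
  <!-- Aggregate rating drawn from your own on-site reviews -->
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> out of
    <span itemprop="bestRating">5</span> based on
    <span itemprop="reviewCount">89</span> reviews
  </div>
</div>
Just remember that, as discussed above, valid markup is no guarantee that Google will actually show the stars for a small business.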
-
http://cdnext.seomoz.org/1347566301_84f5cacc41479945c65eea948eb9a2d8.jpg
Here is the link; it's the first result.
-
I want something like this: basically, if someone searches "car dealer", this would appear after the local search results. Is this possible for a local business in the organic results that appear after the local pack, or is it only possible within the local search results and not the purely organic results? I hope that clears it up. Thanks.
Sorry for the text formatting; it won't let me change it.
-
Hi Peter,
Can you find a live example of a business that is achieving what you're hoping to achieve and share it with me? I want to be sure I understand, and without seeing an example, I'm not clear on how best to advise you. Thanks!
-
Hi Miriam, sorry for the confusion. So, don't you think it would be beneficial to have schema for the properties of a Place, since it would allow me to show the business's review rating/aggregate rating in the search results? Basically, I want it so that when someone searches for something relating to a car dealer, if Google chooses to show my URL, it will be accompanied by the business's review rating/aggregate rating. I feel it would help increase the CTR. I hope that makes sense.
-
Hi Peter, I'm not sure I understand your question. You write: "Many times when you search for the restaurant or a related query, the results will still show a telephone number, reviews, and location." Do you mean that when you search for the restaurant, you are seeing the local pack of results (meaning results accompanied by the grey pin and a link to the Google+ Local page)? If so, that has nothing to do with schema. The local listings stem from Google Places/Google+ Local, not from whatever schema you've embedded in the website. I use schema for my local business clients, and I understand the point of it to be to strengthen the geographic signals the website sends to Google regarding the client's NAP (name, address, phone number). The goal is not to have that info show up in the SERPs (unlike rel=author or something like that). Does this answer your question? If not, please provide further detail.
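To make that concrete, here's a rough sketch of the kind of NAP markup I'm describing, using schema.org microdata (the AutoDealer type and all of the business details below are placeholders for illustration):
<!-- Wrap the NAP info already visible on the page in a LocalBusiness subtype -->
<div itemscope itemtype="http://schema.org/AutoDealer">
  <span itemprop="name">Example Car Dealer</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main Street</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
    <span itemprop="postalCode">62701</span>
  </div>
  Phone: <span itemprop="telephone">(555) 555-0123</span>
</div>
The idea is simply to reinforce the NAP data Google already has from your Places/Google+ Local listing, not to force anything to display in the SERPs.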