Applying NAP Local Schema Markup to a Virtual Location: spamming or not?
-
I have a client that has multiple virtual locations to show website visitors where they provide delivery services. These are individual pages that include unique phone numbers, zip codes, city & state. However, there is no street address (these are just service areas).
We wanted to apply schema markup to these landing pages. Our development team successfully applied schema to the phone, state, city, etc. However, for the address property they used the value VIRTUAL LOCATION. This checked out fine in the Google structured data testing tool.
Our question is this: can having just VIRTUAL LOCATION as the address property be construed as spamming? The landing page is providing pertinent information for the end user. However, since there is no brick-and-mortar address, I'm trying to determine whether having VIRTUAL LOCATION as the value could be frowned upon by Google.
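For context, a minimal JSON-LD sketch of the kind of markup described above - all names and values here are placeholders, not the client's actual data:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Delivery Co - Springfield",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "VIRTUAL LOCATION",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
```

The testing tool accepts this because "VIRTUAL LOCATION" is syntactically just a text string; passing validation says nothing about whether Google considers the value legitimate.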
Any insight would be very helpful.
Thanks
-
Excellent response Marcus. Thanks for your feedback.
-
Hey Rosemary
This is against Google's guidelines for local businesses, so it could be problematic.
Ref: https://support.google.com/business/answer/3038177?hl=en-GB
"If your business rents a temporary, "virtual" office at an address that is different from your primary business, do not create a page for that location unless it is staffed during your normal business hours."
If we take a look at the schema.org page for LocalBusiness, we can see that this markup is intended for a physical business:
ref: https://schema.org/LocalBusiness
"A particular physical business or branch of an organization. Examples of LocalBusiness include a restaurant, a particular branch of a restaurant chain, a branch of a bank, a medical practice, a club, a bowling alley, etc."
So, to go back to your question: I am not sure it is spam as such, but it is an incorrect use of the markup. This is not a 'particular physical business or branch of an organisation', and as such the schema markup is being used incorrectly. I would remove the schema markup in this instance.
However, there is no reason why you would not have location pages for a service area - none at all. You can do these pages well, and they can provide real value to a customer in the targeted location. They won't rank in the local pack, but they may well rank in the organic results below (or be used for paid traffic).
I wrote about location landing pages in some depth here:
http://searchengineland.com/local-seo-landing-pages-2-0-222583
Hope that helps clear things up for you.
Cheers
Marcus
-
Hi there
I would take advantage of the schema that allows you to mark up serviceArea, and also review Google's "Service-area businesses on Google" resource. It allows you to indicate "I deliver goods and services to my customers at their locations". Please follow those rules. You shouldn't need to create a listing for each city you service if you tell Google via Google My Business and schema.
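To illustrate, here's a rough JSON-LD sketch along those lines using `areaServed` (which schema.org lists as superseding `serviceArea`), marked up on an Organization rather than asserting a street address; the business name, phone, and locations are made-up placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Delivery Co",
  "telephone": "+1-555-0100",
  "areaServed": [
    { "@type": "City", "name": "Springfield" },
    { "@type": "City", "name": "Shelbyville" },
    { "@type": "GeoShape", "postalCode": "62701" }
  ]
}
```

This way you describe where the business serves without claiming an address it doesn't have.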
Let me know if this makes sense or if you need any more help! Good luck!
Patrick
Related Questions
-
JSON Schema Script Closing Tags
Hello, I can't get the following script to work. There seems to be something wrong with the closing tags. I've tried various combinations; however, no luck.
Local Website Optimization | Marge_Blizzard
-
Are local business directories worth the effort? Eg. White pages, Yell.com, Local.com?
Hi Guys, I'm new to Moz and very keen to do SEO right without upsetting Mr. Google too much. Are local business directories worth the effort? It's a laborious job, but I'm happy to do it if it's effective and won't be considered spammy by Google. Thanks
Local Website Optimization | Fetseun
-
Local Search Location Keyword Use
Hello. What's the best way to approach the use of location phrases within the page content itself? Say you're based in a large city but also work in smaller surrounding areas: would you target the main location, i.e. "London", on the home page and the main product/service pages directly? Or would you leave this all to deeper pages where you can more easily add value? I can imagine that the inclusion of the location, i.e. "London", might compromise the quality of the writing and put off users from other locations. For example, on the home page, if you're targeting:
Keyword: Widgets
Location: London
"Widgets in London and Beyond" - "For the best Widgets in London come to..."
And for a key product or service page, if you're targeting:
Keyword: Car Widgets
Location: London
"Car Widgets London and Beyond" - "For the best Car Widgets in London come to..."
On deeper pages it's going to be easier to make this work, but how would you approach it on the main pages and homepage? Hope that all makes sense?
Local Website Optimization | GrouchyKids
-
How does duplicate content work when creating location specific pages?
In a bid to improve the visibility of my site on the Google SERPs, I am creating landing pages that were initially going to be used in some online advertising. I then thought it might be a good idea to improve the content on the pages so that they would perform better in localised searches. So I have a landing page designed specifically to promote what my business can do, and funnel the user into requesting a quote from us. The main keyword phrase I am using is "website design london", and I will be creating a few more, such as "website design birmingham" and "website design leeds". The only thing that I've changed at the moment across all these pages is the location name; I haven't touched any of the USPs or the testimonial that I use. In both cases, "website design XXX" doesn't show up in any of the USPs or the testimonial. So my question is: when I have these pages built and they're indexed, will I be penalised for this tactic?
Local Website Optimization | mickburkesnr
-
Call Tracking, DNI Script & Local SEO
Hi Moz! I've been reading about this a lot more lately, and it doesn't seem like there's exactly a method that Google (or other search engines) would consider to be "best practices". The closest I've come to getting some clarity are these Blumenthals articles - http://blumenthals.com/blog/2013/05/14/a-guide-to-call-tracking-and-local/ - and the follow-up piece from CallRail - http://blumenthals.com/blog/2014/11/25/guide-to-using-call-tracking-for-local-search/.
Assume a similar goal: using an existing phone number with a solid foundation in the local search ecosystem, while creating the ability to track how many calls come organically (not PPC or another paid platform) to the business directly from the website, for an average SMB. For now, let's also assume we're not interested in screening the calls or evaluating customer interaction with the staff. I would love to hear from anyone who has implemented DNI call tracking on a website. Were there negative effects on local SEO? Did the value of the information (# of calls/month) outweigh any local search conflicts?
If I was deploying this today, it seems like the blueprint for including the DNI script, while mitigating the risk of losing local search visibility, might go something like this:
- Hire a reputable call-tracking service; ensure DNI numbers will match the geographic area code & be "clean" numbers
- Insert the DNI script on key pages on the site
- Maintain the original phone number (non-DNI) in the footer, within schema & on the Contact page of the site
- ??
- Profit
OK, those last 2 bullet points aren't as important, but I would be curious where other marketers land on this issue, as I think there's no general consensus at this point. Thanks everyone!
Local Website Optimization | Etna
-
Australian local business website on a .com - how do I ensure it's indexed/ranked by Google.com.au as a priority?
I look forward to your advice. My client is a local business in Australia but has a .com site which is hosted in the US. We are just moving it to WordPress and new hosting. I want to ensure that Google.com.au will be able to index and rank the content. How can I tell Google it's a site for people in Australia? I thought it best to set up a subfolder like hissite.com/au and redirect anyone from Australia to that URL? Thanks for your recommendations
Local Website Optimization | bisibee1
-
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol & strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.
Say we have a restoration service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has their own local website, e.g. restorationcompanylosangeles.com. Every franchise purchases territories in which they want to rank. Some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence & working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We are using major aggregators to distribute our local citations for our branch offices. We make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on new branches that are developing to create their Google listings (My Business & G+). We use local business schema markup for all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, placement of keywords, semantic Q&A & internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, callers' locations, etc. We are testing CallRail to start monitoring the landing pages and keywords that generate our leads.
Parts that I want to change: Some of the local sites have over 100 pages targeted for 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words all talking about services we provide. Although our writers (4 of them) manipulate them in a way so that they aren't duplicate pages, they add about 100 words about the city location - this is the only unique variable. We pump out about 10 new local pages a month per site - so yes, over 300 local pages a month. Traffic to the local sites is very scarce. Content protocol/strategy is only tested based on ranking! We have a tool that monitors ranking on all domains. This does not account for mobile, local, nor user-based preference searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic behind this is that if there is no local city page for a targeted location, there is less likelihood of ranking for that location. If you are not seen, then you will not get traffic nor leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time). They will check ranking every day for about a week to determine whether that experiment was a success or not.
What I need:
- An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content.
- Unique content for 'power pages'. I know, based on dozens of chats here in the community and in Moz blogs, that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up.
- Blog content for non-'power' locations.
- A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc.
- A deeper dig into call metrics and their sources.
Now I am at a roadblock because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires testing two pages that are the same except for one variable. We'd either noindex these or canonicalize; neither is in favor of testing ranking for the same term.
Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
Local Website Optimization | MilestoneSEO_LA
-
Local Ranking Power of a Multi-Keyword URL?
Here is a site that is sitting at number 1 on Google UK (local results) for a number of its keywords: http://www.scottishdentistry.com/ If you look at the links in the navigation, many of them have URLs such as this: http://www.scottishdentistry.com/glasgow-scotland-dentistry/glasgow-scotland-hygienists.html These have clearly been created to be keyword-rich. For example, there is no publicly available page at: http://www.scottishdentistry.com/glasgow-scotland-dentistry Do you think this tactic has helped with the site's rankings? Is it worth imitating? Or will it ultimately attract a penalty of some kind? Remember, this is in the UK, where Google seems to be slower at penalising dodgy tactics than in the US. Thanks everyone.
Local Website Optimization | neilmac
-