Applying NAP Local Schema Markup to a Virtual Location: spamming or not?
-
I have a client that has multiple virtual locations to show website visitors where they provide delivery services. These are individual pages that include unique phone numbers, zip codes, city & state. However, there is no street address (these are just service areas).
We wanted to apply schema markup to these landing pages. Our development team successfully applied schema to the phone, state, city, etc. However, for the address property alone they used the value VIRTUAL LOCATION. This checked out fine in the Google Structured Data Testing Tool.
Our question is this: can having just VIRTUAL LOCATION as the address property value be construed as spamming? The landing page provides pertinent information for the end user. However, since there is no brick-and-mortar address, I'm trying to determine whether having VIRTUAL LOCATION as the value could be frowned upon by Google.
Any insight would be very helpful.
Thanks
-
Excellent response Marcus. Thanks for your feedback.
-
Hey Rosemary
This is against Google's guidelines for local businesses, so it could be problematic.
Ref: https://support.google.com/business/answer/3038177?hl=en-GB
"If your business rents a temporary, "virtual" office at an address that is different from your primary business, do not create a page for that location unless it is staffed during your normal business hours."
If we take a look at the schema.org page for a local business, we can see that this markup is intended for a physical business.
ref: https://schema.org/LocalBusiness
"A particular physical business or branch of an organization. Examples of LocalBusiness include a restaurant, a particular branch of a restaurant chain, a branch of a bank, a medical practice, a club, a bowling alley, etc."
So, to go back to your question: I am not sure it is spam as such, but it is an incorrect use of the markup. This is not a 'physical business or branch of an organisation', so the schema markup is being applied to something it doesn't describe. I would remove the schema markup in this instance.
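For contrast, here is a minimal sketch of what the LocalBusiness type actually expects, namely a real postal address (all business details below are hypothetical, and the JSON-LD is built with Python's standard `json` module just for illustration):

```python
import json

# Minimal sketch of LocalBusiness markup with the physical address the
# type expects (all details hypothetical). Schema.org defines no
# "VIRTUAL LOCATION" value; address should be a PostalAddress for a
# real, staffed location.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
    },
}

print(json.dumps(markup, indent=2))
```

If a page has no value that legitimately fits a property, the safer choice is to omit that property (or the whole type) rather than fill it with a placeholder.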
However - there is no reason why you would not have location pages for a service area, none at all. You can do these pages well, and they can provide real value to a customer in the targeted location. They won't rank in the local pack, but they may well rank in the organic results below (or be used for paid traffic).
I wrote about location landing pages in some depth here:
http://searchengineland.com/local-seo-landing-pages-2-0-222583
Hope that helps clear things up for you.
Cheers
Marcus
-
Hi there
I would take advantage of the schema.org property that lets you mark up a serviceArea, and also review Google's "Service-area businesses on Google" resource. It allows you to indicate "I deliver goods and services to my customers at their locations". Please follow the rules in that resource. You shouldn't need to create a listing for each city you service if you tell Google about your service area via Google My Business and schema markup.
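Patrick's suggestion can be sketched in JSON-LD. A minimal example (business details are hypothetical; note that schema.org now lists serviceArea as superseded by areaServed, so the newer property is used here):

```python
import json

# Hypothetical delivery business with no storefront: instead of a
# street address, the markup declares the cities it serves via
# areaServed (the property that supersedes serviceArea on schema.org).
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Delivery Co.",      # hypothetical name
    "telephone": "+1-555-0123",          # unique number for this page
    "areaServed": [
        {"@type": "City", "name": "Springfield"},
        {"@type": "City", "name": "Shelbyville"},
    ],
}

print(json.dumps(markup, indent=2))
```

This describes the service area honestly without inventing a placeholder address value.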
Let me know if this makes sense or if you need any more help! Good luck!
Patrick