Does Schema Replace Conventional NAP in Local SEO?
-
Hello Everyone,
My question is in regard to schema and whether it replaces the need for the conventional structured NAP configuration. Because schema gives you the ability to specifically call out properties (name, URL, address, phone number, etc.), is it still necessary to keep the NAP format that has historically been required for local SEO?
Logically it seems that schema would let someone reverse this order and still achieve the same result; however, I have yet to find any conclusive evidence that this is the case.
Thanks, and I look forward to what the community has to say on this matter.
-
Marcus...that should'a been "...the ever DEPENDABLE Phil Rozek..."
-
No.
Schema and NAP are two distinct things. While schema can mark up NAP elements, NAP as it is used in local SEO is more about ensuring you use the same name, address, and phone number everywhere. The order is irrelevant; people are just accustomed to seeing it as name, address, phone.
If you only use schema on a web page to tell people where your business is, I am not sure they will find it easily. Will search engines find it? Yes.
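To make that concrete, here is a minimal sketch of NAP marked up with schema.org's LocalBusiness type in JSON-LD (the business details are placeholders, not a real listing):

```html
<!-- Hypothetical example: JSON-LD LocalBusiness markup with placeholder NAP data -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "url": "https://www.example.com",
  "telephone": "+1-555-555-0199",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  }
}
</script>
```

Note that a JSON-LD block like this is invisible to visitors, which is exactly the point above: search engines can read it, but a human scanning the page still won't see the address.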
I hope this clarifies a bit for you.
-
I'm completely with Lesley here. That's exactly how I would do it. Give him a beer!
-
Hey Todd
You don't really get to decide this, as it is third-party sites that will make up the lion's share of your NAP data out there, and whether they decide to use schema or not is up to them. When it comes to your own site, it is nice and it has benefits, but I would not bust a gut over it.
I would worry about NAP consistency and having well-optimised citations (for users, that is) in all the important general-purpose, local, and vertical directories long before I started to worry about schema.
That said, we use schema for the NAP on our clients' sites, as it is pretty easy after all.
This is a good article by the ever dependant Phil Rozek:
Cheers
Marcus
-
When I can, I keep both: I maintain the normal visible NAP structure and mark it up with schema. I am almost positive that you can technically include organization data without it being visible on the site, but I generally think it is good practice to keep it visible even if it is not for SEO. Think about meta descriptions: they hold basically no SEO value any more, but their value is in getting people to click through from the SERPs to your site. In the same way, having the normal NAP structure holds the value of people trusting your business. Also, things can change; I do not want to be caught with my pants down if Google decides that organization data needs to be displayed on the page and hidden data is no longer allowed.
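As a sketch of that "keep it visible, then mark it up" approach, the same placeholder details can be annotated inline with schema.org microdata, so the NAP stays on the page for people while crawlers read the annotations:

```html
<!-- Hypothetical example: a visible NAP block annotated with microdata -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Example Plumbing Co.</span><br>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
    <span itemprop="postalCode">62701</span>
  </div>
  Phone: <span itemprop="telephone">+1-555-555-0199</span>
</div>
```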
Related Questions
-
Novice SEO Question - UK & COM Results
Would someone please explain to me why, when doing this search, https://www.google.co.uk/search?pws=0&q=online+texas+hold+em, there are .co.uk and .com pages ranking in the results for PokerStars, and how do I fix it? Thank you!
Local Website Optimization | charliegirlcontent
-
More pages on website better for SEO?
Hi all, Is creating more pages better for SEO? Assuming the pages are valuable content, of course. Is this because you want the user to spend as much time as possible on your site? A lot of my competitors' websites seem to have more pages than mine, and their domain authorities are higher. For example, the services we provide are all on one page, whereas for my competitors each service has its own page. Kind Regards, Aqib
Local Website Optimization | SMCCoachHire
-
Multi-location silo SEO technique
A physical therapy company has 8 locations in one city and 4 locations in another, with plans to expand. I've seen two methods to approach this. The first, which I feel is sloppy, gives each location an individual URL that the location pages on the main domain point to. The second is to use the silo technique incorporated with metro-scale addition: you have the main domain with a number of silos (the individual stores), and each silo has its own content (what they do at each store is pretty much the same). My question: besides making sure there is no duplicate copy, should the focus of each silo be to increase its own hyperlocal outreach? Focus on social, reviews, and content curated for the specific location? How would you attack this problem?
Local Website Optimization | Ohmichael
-
Using IP Detection to Filter Directory Listings without Killing Your SEO?
I have a client who maintains a directory of surgeons across the United States (approx. 2,000 members at present) and wishes to use IP detection to dynamically filter the surgeon directory to a subset that is relevant to the visitor's geography. At the same time, however, we want the pages in the surgeon directory to rank nationally for terms like "[insert specialty] surgeons". Any tips/best practices for implementing an IP detection solution without shooting yourself in the foot from an SEO perspective? Is it even possible? Thanks! Jeremy
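One hedged sketch of a related approach (it swaps the browser's Geolocation API in for IP detection, which is not exactly what the question asks): serve the full national directory in the HTML and narrow it client-side. Crawlers never answer the location prompt, so they index the unfiltered national listing. The element names and threshold below are invented for illustration:

```html
<!-- Hypothetical sketch: the full directory is rendered server-side so crawlers
     index everything; visitors who grant location access get the list narrowed
     in the browser. data-lat/data-lng attributes are assumed on each listing. -->
<script>
  navigator.geolocation.getCurrentPosition(function (pos) {
    document.querySelectorAll('[data-lat]').forEach(function (card) {
      var dLat = Math.abs(parseFloat(card.dataset.lat) - pos.coords.latitude);
      var dLng = Math.abs(parseFloat(card.dataset.lng) - pos.coords.longitude);
      // Crude bounding box: hide listings more than ~1 degree (~69 miles) away
      if (dLat > 1 || dLng > 1) card.style.display = 'none';
    });
  });
</script>
```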
Local Website Optimization | Jeremy_Lopatin
-
International SEO Difficulty With Hreflang
Hi, It seems that multilingual sites can be very tricky sometimes; this is the second problem we are facing with a client this month... A company which already has a presence in Spain now wants to expand into Portugal, Brazil, and Argentina. There are some linguistic differences between Spain Spanish and Argentina Spanish, so we will have slightly different content but the same URL (see below). We will also cover the linguistic differences between Portuguese and Brazilian Portuguese, but with different URLs. So we will have four pages serving the same content in three (technically four) different languages:

company.com/idioma -> (original Spain Spanish page - URL stays the same)
company.com/es-ar/idioma (Argentina URL)
company.com/pt-pt/idioma (Portugal URL)
company.com/pt-br/lingua (Brazil URL)

Normally we know we should use alternate hreflang on all four pages, but now that the URL changes, e.g. between Argentina and Brazil, is the case the same, or can we omit it for these two countries? Thank you!
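For reference, a sketch of what the hreflang annotations for the four URLs above could look like; the full set would go in the <head> of every one of the four pages (the x-default choice and the https scheme are assumptions):

```html
<link rel="alternate" hreflang="es-es" href="https://company.com/idioma" />
<link rel="alternate" hreflang="es-ar" href="https://company.com/es-ar/idioma" />
<link rel="alternate" hreflang="pt-pt" href="https://company.com/pt-pt/idioma" />
<link rel="alternate" hreflang="pt-br" href="https://company.com/pt-br/lingua" />
<!-- Optional fallback for searchers who match none of the above -->
<link rel="alternate" hreflang="x-default" href="https://company.com/idioma" />
```

Because hreflang is declared per URL and needs to be reciprocal across all alternates, the Argentina and Brazil pages would not normally be omitted just because only part of the set differs by URL.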
Local Website Optimization | Tz_Seo
-
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol & strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.

Say we have a restoration service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has its own local website, for example restorationcompanylosangeles.com. Every franchise purchases territories in which it wants to rank; some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations.

We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We use major aggregators to distribute our local citations for our branch offices, and we make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on developing branches to create their Google listings (MyBusiness & G+). We use local business schema markup on all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, placement of keywords, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, callers' locations, etc., and we are testing CallRail to start monitoring the landing pages and keywords that generate our leads.

Parts that I want to change: Some of the local sites have over 100 pages targeted for 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words all talking about the services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages, but they only add about 100 words about the city location; this is the only unique variable. We pump out about 10 new local pages a month per site - so yes, over 300 local pages a month. Traffic to the local sites is very scarce. The content protocol/strategy is only tested based on ranking! We have a tool that monitors ranking on all domains, but this does not account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location, and if you are not seen, you will not get traffic or leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We update the content protocol by tweaking small things (multiple variants at a time), then check ranking every day for about a week to determine whether that experiment was a success or not.

What I need: An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content. Unique content for 'power pages': I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up. Blog content for non-'power' locations. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. A deeper dig into call metrics and their sources.

Now I am at a roadblock because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires two pages that are the same except for one variable; we'd either noindex these or canonicalize, and both work against testing ranking for the same term.

Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?

Local Website Optimization | MilestoneSEO_LA
-
SEO: .com vs .org vs .travel Domain
Hi there, I am new to Moz Q&A, and first of all I appreciate all the folks here who share their expertise and make everyone understand 'the WWW' a bit better. My question: I have been developing a 'travel guide' site for a city in the U.S., and now it's time to choose the right domain name. I put a strong focus on SEO in terms of coding, site performance, and content, and to round things up I'd like to register the best domain name in terms of SEO. Let's suppose the city is Atlanta. I have found the following domain names that are available, and I was wondering whether you guys could give me some insight into which domain name would perform best:

discoveratlanta.org
atlantaguide.org
atlanta.travel
atlantamag.com

Looking at the Google AdWords Keyword tool, the term that gets the highest number of search queries is obviously "Atlanta" itself. Sites that are already ranking high are atlanta.com and atlanta.gov. So basically I am wondering whether I should aim for a new TLD like atlanta.travel or rather go with a .org domain. I had a look around, and it seems that .org domains generally work well for city guides (at least a lot of such sites use .org domains). However, I have also seen a major US city that uses .travel and ranks first. On the other hand, in New York, nycgo.com ranks well. Is it safe to assume that among the domain names I mentioned it really doesn't matter which one I use, since it wouldn't significantly affect my ranking (good or bad)? Or would you still choose one above the other? What do you generally think about .travel domain names (especially since they are far more expensive than the rest)? I really appreciate your response to my question! Best,
kinimod

Local Website Optimization | kinimod
-
Which is Better for Local & National Coupons: 1000s of Indexed Pages per City, or Only a Few?
Not sure where this belongs... I am developing a coupons site for listing local coupons and national coupons (think Valpak + RetailMeNot), eventually in all major cities, and am VERY concerned about how many internal pages to let Google 'follow' for indexing, as it can exceed 10,000 per city. Is there a way to determine the optimal approach for internal paging/indexing BEFORE I actually launch the site? It is about ready except for this darned URL question, which seems critical. I.e., can I put in search words for Google to determine which ones are most worthy of their own indexed page? I'm a newbie, sort of, so please put the answer in simple terms. I'm one person with limited funds, and I need to find the cheapest way to get the best organic results for each city that I cover. Is there a generic answer? One SEO firm told me the more variety the better. Another told me that simple is better, and to use content on the simple pages to get variety. So, confused, I decided to consult the experts here!

Here's the site concept. FOR EACH CITY: The user inputs a location: the main city only (i.e. Houston), or 1 of 40 city regions (suburb, etc.), or a zip code, or a zip-street combo, or a GPS lookup. A mile range is defaulted or chosen by the user. After the search area is determined, the user chooses 1 of 6 types of coupon searches:

1. Online shopping with national coupon codes: a choice of 16 categories (electronics, health, clothes, etc.) and 100 subcategories (computers, skin care products, men's shirts). These are national offers for chains like Kohls, which do not use the user's location at all.
2. Local in-store shopping coupons: the same 16 categories and 100 subcategories used for online shopping in #1 (a mom & pop shoe store or local chain offer). The results are within the user's chosen location and range.
3. Local restaurant coupons: about 60 subcategories (pizza, fast food, sandwiches). The results are again within the user's chosen location and range.
4. Local services coupons: 8 categories (auto repair, activities, etc.) and around 200 subcategories (brakes, miniature golf, etc.). Results within the user's chosen location and range.
5. Local groceries: one page for the main city with coupons.com grocery coupons, listing the main grocery stores in the city. This page does not break down by sub-region, zip, etc.
6. Local weekly ad circulars: one page for the main city that displays about 50 main national stores located in that main city.

So, what is the best way to handle the URLs indexed for the dynamic searches by location, type of coupon, category/subcategory, and business pages? The combinations of potential URLs to index are nearly unlimited. Does the user's location matter when he searches for one thing (restaurants) but not for another (Kohls)? If so, how do I know this? Should I tailor indexed URLs to that knowledge? Is there an advantage to having a URL for NATIONAL companies that ties to each main city: shopping/Kohls vs shopping/Kohls/Houston, or even shopping/Kohls/Houston-suburb? Again, I'm talking about 'follow' links for indexing. I realize I can have Google index just a few main categories and subcats and not the others, or a few city regions but not all of them, etc., while actually having internal pages for all of them. Is it better to have 10,000 URLs, say for coupon-type/city-region/subcategory, or just one for the main city (main-city/all-coupons), or something in between? You get the gist.

I don't know how to begin to figure out the answers to these kinds of questions, and yet they seem critical to the design of the site. The competition: sites like Valpak, MoneyMailer, and LocalSaver seem to favor the 'more is better' approach, with coupons/zipcode/category or coupons/bizname/zipcode. But a site like 8coupons.com appears to have no indexing for categories or subcategories at all! They have city-subregion/coupons and individual businesses (bizname/city-subregion), but as far as I can see, no city/category or city-subregion/category. And a very popular coupons site in my city only has maincity/coupons, maincity/a-few-categories, and maincity/bizname/coupons. Sorry this is so long, but it seems very complicated to me and I wanted to make the issue as clear as possible. Thanks, couponguy
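As a hedged aside on the "follow for indexing" mechanics the question keeps returning to, these are the standard building blocks for keeping a deep filter combination out of the index while still letting crawlers follow its links (the URLs are invented for illustration):

```html
<!-- Hypothetical: on a deep filter page such as /houston/suburb/77001/pizza,
     keep the page out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- On near-duplicate variants, point to the preferred city-level page instead -->
<link rel="canonical" href="https://example-coupons.com/houston/restaurant-coupons">
```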
Local Website Optimization | couponguy