NAP - is lack of consistency in address elements an issue?
-
I've been looking at a local business in London with multiple sites, each of which has several versions of the same address in its NAP. They're using the correct addresses, but with variations in the order of the address elements (some leave out London, some include it).
For example, one listing puts the postcode after the city district, another puts it before. Sometimes London is included in the address, though often not (the postal service doesn't include London in its "official" version of the addresses).
So the addresses are never wrong - it's just that the elements are mixed up a little, and some include London while others don't.
Should I be concerned about this lack of address consistency, or should I try to make the various versions exact matches?
-
Sounds like a good plan, Luke! Good luck with the work, and be sure the calendar is crawlable.
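By crawlable, I mean the class links should be plain HTML anchors that Googlebot can follow, rather than a calendar rendered entirely in JavaScript. A minimal sketch (the URLs and class names here are hypothetical):

```html
<!-- Crawlable calendar: each class is a plain anchor link Googlebot can
     follow to its description page (URLs and class names are placeholders) -->
<section class="class-calendar">
  <h2>This Week's Classes</h2>
  <ul>
    <li>Mon 7pm - <a href="/locations/camden/classes/spin">Spin</a></li>
    <li>Tue 6pm - <a href="/locations/camden/classes/yoga">Yoga</a></li>
    <li>Wed 7pm - <a href="/locations/camden/classes/pilates">Pilates</a></li>
  </ul>
</section>
```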
-
Hi Luke,
It's a complex topic. I think you'll find this Matt McGee article from SmallBusinessSEM and this one from Marcus Miller at Search Engine Land extremely helpful. Both talk about how to optimize multi-location businesses and, very specifically, about data consistency and whether Google pays attention to slight variations like the ones you described in your question, where the addresses are never wrong, just "mixed up a little".
"... for the most part, the algo handles those minor discrepancies well. That being said, you don’t want to tempt fate."
-
Yes, sorry, it needed clarification - I was struggling to describe the issue. What you suggest sounds like a good idea, indeed. I will put a complete NAP only at the top of each of the 8 main landing pages, in Schema, along with a calendar on each landing page linking to the class descriptions. Many thanks for your help with this - much appreciated.
-
Ah, got it, Luke! Thanks for clarifying. It seems to me, then, that what you might need is some kind of a calendar on the main city landing page for each location that links to the different class descriptions. Would this be a way to format 38 different links so that customers can understand them easily and see what's available? Just a thought!
-
Hi Miriam - yes, the 38 pages have been created about the services from each specific location (in this case, health and fitness classes). The classes are unique to that location, so each run of 38 pages is dedicated to a single location, and there would be a strong contextual relationship. Basically, the 38 pages cover classes unique to that location (in terms of times, tutors and often type).
So I guess what was humming around in my brain was whether to do a specific footer for each locational section - with the specific address relevant to the content above it in the footer, rather than all 8 business locations consistently in the footer.
I was originally thinking of adding all 8 business addresses consistently in the footer, though I thought location-specific addresses might be more user friendly, and might even help Google understand the locational context.
-
Hi Luke,
Hmm ... that doesn't sound right to me. I may be missing something, but unless these 38 pages for each location have genuinely been created about the location and things relating specifically to it, I would not stick the NAP on there, just for the sake of putting it on a bunch of pages. What you're describing to me sounds like some kind of afterthought.
I also wouldn't change the footer around like that. It could create usability difficulties if it's changing throughout the site. Rather, my preference would be a complete NAP only at the top of a single landing page per physical location, and the NAP of all 8 businesses consistently in the sitewide footer. And, again, the NAP of all 8 on the Contact page. This is what I consider to be the normal structure.
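To illustrate, here's a rough sketch of a consistent sitewide footer (the business names, addresses and phone numbers are placeholders):

```html
<!-- Sitewide footer listing every location's NAP; the identical block
     appears on every page (all details below are placeholders) -->
<footer>
  <div class="location">
    <span class="name">Acme Fitness Camden</span>
    <span class="address">1 Example St, Camden, London NW1 0AA</span>
    <span class="phone">020 7946 0001</span>
  </div>
  <div class="location">
    <span class="name">Acme Fitness Brixton</span>
    <span class="address">2 Example Rd, Brixton, London SW2 0BB</span>
    <span class="phone">020 7946 0002</span>
  </div>
  <!-- ...the remaining six locations follow the same pattern... -->
</footer>
```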
As for what to do with those several hundred pages, are they of really high quality? Are they city-specific or just generic to the business's topic? An example of city-specific might be something like a website for an arborist. He has a page for City A talking about how Dutch Elm Disease has hit that city. For City B, he has a page about birch tree borers that have affected that city's trees. So, from the main City A landing page, he could link to the Dutch Elm piece, and from the main City B landing page, he could link to the birch borer page, as additional resources.
But if the content is just generic and you're trying to divvy it up between the cities without a strong contextual relationship, then there isn't really a good reason for doing so.
-
Hi Miriam,
What I meant is that there are 8 business locations and the site's 300-odd pages are divided among them (so each geographical location has around 38 pages dedicated to that specific location and its services).
So what I was planning to do was simply put the correct location-specific NAP in the footer of each of the location-specific pages (so each run of location-specific "38 pages" will have the relevant [single] NAP in the footer of every page).
But my co-worker said only put the correct [single] NAP in the footer of the 8 location home(/landing) pages within the site, rather than on every page.
Hope that makes sense [it's been a long week ;-I]
-
(Miriam responding here, but signed into Mozzer Alliance right now)
Hi Luke,
If you mean in the footer and it's 10 or fewer locations, I'd say it's okay to put the NAP for the 8 businesses there, but not in the main body of the page.
My preferred method would be to put the complete NAP, in Schema, for Location A at the top of City Landing Page A, the complete NAP for Location B at the top of City Landing Page B, etc. I would not suggest putting all of this NAP anywhere else on the site but the Contact Page.
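For instance, a minimal sketch of what that Schema markup might look like in JSON-LD on one city landing page - the type, name, address and URL below are all assumptions, so substitute the real details and whichever schema.org type best fits the business:

```html
<!-- Hypothetical LocalBusiness JSON-LD for one city landing page;
     every value here is a placeholder -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "HealthClub",
  "name": "Acme Fitness Camden",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example St",
    "addressLocality": "London",
    "postalCode": "NW1 0AA",
    "addressCountry": "GB"
  },
  "telephone": "+44 20 7946 0001",
  "url": "https://www.example.com/locations/camden"
}
</script>
```

Each of the 8 landing pages would carry only its own location's details.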
-
Thanks Miriam - it sure does. Their website is divided up by location, so I'm planning to put the relevant NAP at the bottom of every page throughout the website (8 locations and NAPs in total - 300 pages). A colleague suggested just putting the NAP on each of the 8 location homepages, though I suspect it would help more if the NAP were at the foot of every page (so long as it's the correct NAP on the correct page, ha!) - is that the right thing to do?
-
Hey Luke!
NAP consistency was judged to be the second most influential pack ranking factor on this year's Local Search Ranking Factors (https://moz.com/local-search-ranking-factors) so, yes, it's of major importance! Hope this helps.
-
When it comes to NAP, it should be as close to an exact match as you're able to achieve. Inconsistency in this area - while not the biggest detriment you can have - should be avoided.