NAP - is lack of consistency in address elements an issue?
-
I've been looking at a local business in London that has multiple sites, each with several versions of the same address in its NAP. The addresses are all correct; they just vary in the order of the address elements, and some include London while others leave it out.
For example, one listing puts the postcode after the city district, another before it. Sometimes London is included in the address, though often not (the postal service doesn't include London in its "official version" of the addresses).
So the addresses are never wrong - the elements are just mixed up a little, and London appears in some versions but not others.
Should I be concerned about this lack of address consistency, and should I try to bring the various versions into exact alignment?
-
Sounds like a good plan, Luke! Good luck with the work, and be sure the calendar is crawlable.
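By "crawlable" I just mean the class links should exist as plain HTML anchors in the page source, rather than only being injected by a JavaScript widget. A hypothetical sketch - all the class names, times and URLs here are made up:

<!-- Hypothetical sketch of a crawlable class calendar: plain anchor links
     that a search engine can follow without executing any scripts. -->
<ul class="class-calendar">
  <li>Mon 19:00 - <a href="/locations/camden/classes/spin/">Spin with Alex</a></li>
  <li>Tue 18:00 - <a href="/locations/camden/classes/yoga/">Yoga with Sam</a></li>
  <li>Wed 20:00 - <a href="/locations/camden/classes/pilates/">Pilates with Jo</a></li>
</ul>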
-
Hi Luke,
It's a complex topic. I think you'll find this Matt McGee article from SmallBusinessSEM and this one from Marcus Miller at Search Engine Land extremely helpful. Both talk about how to optimize multi-location businesses and, very specifically, about data consistency and whether Google pays attention to slight variations like the ones you described in your question, where the addresses are never wrong, just "mixed up a little".
"... for the most part, the algo handles those minor discrepancies well. That being said, you don’t want to tempt fate."
-
Yes, sorry - it needed clarification; I was struggling to describe the issue. What you suggest sounds like a good idea indeed. I will put a complete NAP only at the top of each of the 8 main landing pages, in Schema, along with a calendar on each landing page linking to the class descriptions. Many thanks for your help with this - much appreciated!
-
Ah, got it, Luke! Thanks for clarifying. It seems to me, then, that what you might need is some kind of a calendar on the main city landing page for each location that links to the different class descriptions. Would this be a way to format 38 different links so that customers can understand them easily and see what's available? Just a thought!
-
Hi Miriam - yes, the 38 pages per location have been created about the services offered from each specific location (in this case, health and fitness classes). The classes are unique to each location in terms of times, tutors and often type, so there would be a strong contextual relationship between each run of 38 pages and its location.
So the idea humming around in my brain was whether to use a different footer for each location's section of the site - with only the address relevant to the content above it - rather than showing all 8 business locations consistently in the footer.
I was originally thinking of adding all 8 business addresses consistently in the footer, though I thought location-specific addresses might be more user-friendly and might even help Google understand the locational context.
-
Hi Luke,
Hmm ... that doesn't sound right to me. I may be missing something, but unless these 38 pages for each location have genuinely been created about the location and things relating specifically to it, I would not stick the NAP on there, just for the sake of putting it on a bunch of pages. What you're describing to me sounds like some kind of afterthought.
I also wouldn't change the footer around like that. It could create usability difficulties if it changes throughout the site. Rather, my preference would be a complete NAP only at the top of a single landing page per physical location, plus the NAP of all 8 businesses consistently in the sitewide footer. And, again, the NAP of all 8 on the Contact page. This is what I consider to be the normal structure.
As for what to do with those several hundred pages, are they of really high quality? Are they city-specific, or just generic to the business's topic? An example of city-specific might be a website for an arborist. He has a page for City A talking about how Dutch Elm Disease has hit that city. For City B, he has a page about birch tree borers that have affected that city's trees. So, from the main City A landing page he could link to the Dutch Elm piece, and from the main City B landing page to the birch borer page, as additional resources.
But if the content is just generic and you're trying to divvy it up between the cities without a strong contextual relationship, then there isn't really a good reason for doing so.
-
Hi Miriam,
What I meant is that there are 8 business locations and the site's 300-odd pages are divided between them (so each geographical location has around "38 pages" dedicated to that specific location and its services).
So what I was planning to do was simply put the correct location-specific NAP in the footer of each location-specific page (so each location's run of "38 pages" would have the relevant [single] NAP in the footer of every page).
But my co-worker said to put the correct [single] NAP only in the footer of the 8 location home/landing pages within the site, rather than on every page.
Hope that makes sense [it's been a long week ;-I]
-
(Miriam responding here, but signed into Mozzer Alliance right now)
Hi Luke,
If you mean in the footer and it's 10 or fewer locations, I'd say it's okay to put the NAP for the 8 businesses there, but not in the main body of the page.
My preferred method would be to put the complete NAP, in Schema, for Location A at the top of City Landing Page A, the complete NAP for Location B at the top of City Landing Page B, etc. I would not suggest putting all of this NAP anywhere else on the site except the Contact page.
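If it helps to see it concretely, here's a rough sketch of what that per-location markup could look like in JSON-LD. Everything in it is a placeholder - the business name, URL, phone and address are made up, and ExerciseGym is just one plausible schema.org LocalBusiness subtype for a fitness business:

<!-- Hypothetical sketch only: the complete NAP for "Location A", placed at the
     top of City Landing Page A and nowhere else except the Contact page.
     Name, type, address, phone and URL are all placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ExerciseGym",
  "name": "Example Fitness - Camden",
  "url": "https://www.example.com/locations/camden/",
  "telephone": "+44 20 0000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "NW1 0AA",
    "addressCountry": "GB"
  }
}
</script>

Running the page through a structured data testing tool afterwards will confirm the markup parses as intended.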
-
Thanks Miriam - it sure does. Their website is divided up by location, so I'm planning to put the relevant NAP at the bottom of every page throughout the website (8 locations and NAPs in total - 300 pages). A colleague suggested just putting the NAP on each of the 8 location homepages, though I suspect it would help more if the NAP were at the foot of every page (so long as the correct NAP is on the correct page, ha!) - is that the right thing to do?
-
Hey Luke!
NAP consistency was judged to be the second most influential pack ranking factor in this year's Local Search Ranking Factors survey (https://moz.com/local-search-ranking-factors), so, yes, it's of major importance! Hope this helps.
-
When it comes to NAP, it should be as close to an exact match as you're able to achieve. Inconsistency in this area - while not the biggest detriment you can have - should be avoided.