NAP - is lack of consistency in address elements an issue?
-
I've been looking at a local business in London with multiple sites, and each location has several versions of the same address in its NAP. The addresses themselves are correct - the variations are in the order of the address elements (and some listings include London while others leave it out).
For example, one listing puts the postcode after the city district, another before it. Sometimes London is included in the address, though often not (the postal service doesn't include London in its "official version" of the addresses).
So the addresses are never wrong - it's just that the elements are mixed up a little, and some include London while others don't.
Should I be concerned about this lack of address consistency, or should I try to make the various versions an exact match?
-
Sounds like a good plan, Luke! Good luck with the work, and be sure the calendar is crawlable
-
Hi Luke,
It's a complex topic. I think you'll find this Matt McGee article from SmallBusinessSEM and this one from Marcus Miller at Search Engine Land extremely helpful. Both talk about how to optimize multi-location businesses, and both address data consistency specifically - including whether Google pays attention to slight variations like the ones you described in your question, where the addresses are never wrong, just "mixed up a little".
"... for the most part, the algo handles those minor discrepancies well. That being said, you don’t want to tempt fate."
-
Yes, sorry - it needed clarification and I was struggling to describe the issue. What you suggest sounds like a good idea, indeed - I will put a complete NAP only at the top of each of the 8 main landing pages, in Schema, along with a calendar on each landing page linking to the class descriptions. Many thanks for your help with this - much appreciated.
-
Ah, got it, Luke! Thanks for clarifying. It seems to me, then, that what you might need is some kind of a calendar on the main city landing page for each location that links to the different class descriptions. Would this be a way to format 38 different links so that customers can understand them easily and see what's available? Just a thought!
-
Hi Miriam - yes, the 38 pages have been created about the services offered from each specific location (in this case, health and fitness classes). The classes are unique to each location, so each run of 38 pages relates to a single location, and there would be a strong contextual relationship. Basically, the 38 pages cover classes unique to that location (in terms of times, tutors, and often type).
So I guess what was humming around in my brain was whether to give each locational section its own footer, with the single address relevant to the content above it, rather than all 8 business locations consistently in the footer.
I was originally thinking of adding all 8 business addresses consistently in the footer, though I thought location-specific addresses might be more user-friendly, and might even help Google understand the locational context.
-
Hi Luke,
Hmm ... that doesn't sound right to me. I may be missing something, but unless these 38 pages per location have genuinely been created about the location and things relating specifically to it, I would not stick the NAP on there just for the sake of putting it on a bunch of pages. What you're describing sounds like some kind of afterthought.
I also wouldn't change the footer around like that. It could create usability difficulties if it changes throughout the site. Rather, my preference would be a complete NAP only at the top of a single landing page per physical location, the NAP of all 8 businesses consistently in the sitewide footer, and, again, the NAP of all 8 on the Contact page. This is what I consider the normal structure.
As for what to do with those several hundred pages, are they of really high quality? Are they city-specific, or just generic to the business's topic? An example of city-specific might be a website for an arborist. He has a page for City A talking about how Dutch Elm Disease has hit that city. For City B, he has a page about birch tree borers that have affected that city's trees. So, from the main City A landing page, he could link to the Dutch Elm piece, and from the main City B landing page, he could link to the birch borer page, as additional resources.
But if the content is just generic and you're trying to divvy it up between the cities without a strong contextual relationship, then there isn't really a good reason for doing so.
-
Hi Miriam,
What I meant is that there are 8 business locations, and the site's 300-odd pages are divided among them (so each geographical location has around "38 pages" dedicated to that specific location and its services).
So what I was planning to do was simply put the correct location-specific NAP in the footer of each of the location-specific pages (so each run of location-specific "38 pages" will have the relevant [single] NAP in the footer of every page).
But my co-worker said only put the correct [single] NAP in the footer of the 8 location home(/landing) pages within the site, rather than on every page.
Hope that makes sense [it's been a long week ;-I]
-
(Miriam responding here, but signed into Mozzer Alliance right now)
Hi Luke,
If you mean in the footer and it's 10 or fewer locations, I'd say it's okay to put the NAP for the 8 businesses there, but not in the main body of the page.
My preferred method would be to put the complete NAP, in Schema, for Location A at the top of City Landing Page A, the complete NAP for Location B at the top of City Landing Page B, etc. I would not suggest putting all of this NAP anywhere else on the site but the Contact Page.
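In case it helps, here's a rough sketch of what such a Schema NAP block might look like in JSON-LD at the top of a city landing page. The business name, address, phone number, and URL below are invented placeholders, and `LocalBusiness` could be swapped for a more specific schema.org type (e.g. `HealthClub`) if it fits:

```html
<!-- Hypothetical NAP for "Location A" - every value here is a placeholder -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Fitness - Camden",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "10 Example Street",
    "addressLocality": "London",
    "postalCode": "NW1 8QL",
    "addressCountry": "GB"
  },
  "telephone": "+44 20 7946 0000",
  "url": "https://www.example.com/camden/"
}
</script>
```

One nice side effect: if all 8 landing pages use the same template with the same field order and spelling, the markup itself enforces NAP consistency.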
-
Thanks Miriam - it sure does. Their website is divided up by location, so I'm planning to put the relevant NAP at the bottom of every page throughout the website (8 locations and NAPs in total - 300 pages). A colleague suggested just putting the NAP on each of the 8 location homepages, though I suspect it would help more if the NAP was at the foot of every page (so long as it's the correct NAP on the correct page, ha!) - is that the right thing to do?
-
Hey Luke!
NAP consistency was judged to be the second most influential pack ranking factor on this year's Local Search Ranking Factors (https://moz.com/local-search-ranking-factors) so, yes, it's of major importance! Hope this helps.
-
When it comes to NAP, it should be as close to an exact match as you're able to achieve. Inconsistency in this area - while not the biggest detriment you can have - should be avoided.
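If you're auditing listings by hand, it can help to separate cosmetic variations (element order, punctuation, "London" present or absent) from genuinely different addresses. This is only an illustrative sketch of one way to do that - the helper names and the idea of treating "London" as an optional element are my own assumptions, not anything from the ranking factors survey:

```python
import re

# Terms some listings include and others omit (an assumption for this example)
NOISE_WORDS = {"london"}

def normalize_nap(address: str) -> frozenset:
    """Reduce an address string to an order-independent set of tokens."""
    # Lowercase, replace punctuation with spaces, split into tokens
    tokens = re.sub(r"[^\w\s]", " ", address.lower()).split()
    # Drop noise words so variants with and without "London" compare as equal
    return frozenset(t for t in tokens if t not in NOISE_WORDS)

def same_address(a: str, b: str) -> bool:
    """True if two address strings differ only cosmetically."""
    return normalize_nap(a) == normalize_nap(b)
```

For example, `same_address("10 High St, Camden, London, NW1 8QL", "10 High St, NW1 8QL, Camden")` returns `True`, while a different street number returns `False` - handy for flagging which of the "mixed up" listings actually need correcting.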