NAP - is lack of consistency in address elements an issue?
-
I've been looking at a local business in London with multiple sites, each showing several versions of the same address in its NAP. The addresses themselves are always correct; they just vary in the order of the address elements, and some include London while others omit it.
For example, one listing puts the postcode after the city district, another before it. Sometimes London appears in the address, though often not (the postal service doesn't include London in its "official" version of these addresses).
So the addresses are never wrong - the elements are just mixed up a little.
Should I be concerned about this lack of address consistency, or should I try to make the various versions an exact match?
-
Sounds like a good plan, Luke! Good luck with the work, and be sure the calendar is crawlable.
-
Hi Luke,
It's a complex topic. I think you'll find this Matt McGee article from SmallBusinessSEM and this one from Marcus Miller at Search Engine Land extremely helpful. Both cover how to optimize multi-location businesses, and both speak very specifically to data consistency and to whether Google pays attention to slight variations like the ones you described in your question, where the addresses are never wrong, just "mixed up a little":
"... for the most part, the algo handles those minor discrepancies well. That being said, you don’t want to tempt fate."
-
Yes, sorry - it needed clarification; I was struggling to describe the issue. What you suggest sounds like a good idea indeed. I will put a complete NAP only at the top of each of the 8 main landing pages, marked up in Schema, along with a calendar on each landing page linking to the class descriptions. Many thanks for your help with this - much appreciated!
-
Ah, got it, Luke! Thanks for clarifying. It seems to me, then, that what you might need is some kind of calendar on the main city landing page for each location that links to the different class descriptions. Would this be a way to format 38 different links so that customers can understand them easily and see what's available? Just a thought!
-
Hi Miriam - yes, the 38 pages per location have been created about the services offered from that specific location (in this case, health and fitness classes). The classes are unique to each location in terms of times, tutors and often type, so there would be a strong contextual relationship between each run of 38 pages and its location.
So what was humming around in my brain was whether to give each locational section its own footer, carrying the specific address relevant to the content above it, rather than listing all 8 business locations consistently in the footer.
I was originally thinking of adding all 8 business addresses consistently in the footer, though I thought location-specific addresses might be more user-friendly, and might even help Google understand the locational context.
-
Hi Luke,
Hmm ... that doesn't sound right to me. I may be missing something, but unless these 38 pages per location have genuinely been created about the location and things relating specifically to it, I would not stick the NAP on them just for the sake of putting it on a bunch of pages. What you're describing sounds like something of an afterthought.
I also wouldn't change the footer around like that; it could create usability difficulties if it changes throughout the site. Rather, my preference would be a complete NAP only at the top of a single landing page per physical location, plus the NAP of all 8 businesses consistently in the sitewide footer and, again, on the Contact page. This is what I consider to be the normal structure.
As for what to do with those several hundred pages, are they of really high quality? Are they city-specific or just generic to the business's topic? An example of city-specific might be something like a website for an arborist. He has a page for City A talking about how Dutch Elm Disease has hit that city. For City B, he has a page about birch tree borers that have affected that city's trees. So, from the main City A landing page, he could link to the Dutch Elm piece, and from the main City B landing page, he could link to the birch borer page, as additional resources.
But if the content is just generic and you're trying to divvy it up between the cities without a strong contextual relationship, then there isn't really a good reason for doing so.
-
Hi Miriam,
What I meant is that there are 8 business locations and the site's 300-odd pages are divided among them (so each geographical location has around 38 pages dedicated to that specific location and its services).
So my plan was simply to put the correct location-specific NAP in the footer of each location-specific page (so every page in each location's run of 38 pages would carry that location's single NAP in its footer).
But my co-worker said to put the single correct NAP only in the footer of the 8 location home (landing) pages within the site, rather than on every page.
Hope that makes sense [it's been a long week ;-I]
-
(Miriam responding here, but signed into Mozzer Alliance right now)
Hi Luke,
If you mean in the footer and it's 10 or fewer locations, I'd say it's okay to put the NAP for the 8 businesses there, but not in the main body of the page.
My preferred method would be to put the complete NAP, in Schema, for Location A at the top of City Landing Page A, the complete NAP for Location B at the top of City Landing Page B, etc. I would not suggest putting all of this NAP anywhere else on the site but the Contact page.
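To illustrate, here's a minimal sketch of what that per-location Schema markup could look like, generated with Python for clarity. The business name, address elements, and phone number are all hypothetical placeholders, not Luke's actual data:

```python
import json

def local_business_jsonld(name, street, locality, region, postcode, phone):
    """Build a schema.org LocalBusiness JSON-LD block for one location's
    city landing page. All field values passed in are placeholders."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": locality,
            "addressRegion": region,
            "postalCode": postcode,
        },
        "telephone": phone,
    }
    # Wrap in the script tag that sits near the top of the landing page.
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")

print(local_business_jsonld(
    "Example Fitness Studio - Camden",  # hypothetical location name
    "1 Example Street", "Camden", "London", "NW1 0AA",
    "+44 20 0000 0000"))
```

One block like this per landing page, each with that location's own details, keeps the markup tied to the page it describes.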
-
Thanks Miriam - it sure does! Their website is divided up by location, so I'm planning to put the relevant NAP at the bottom of every page throughout the website (8 locations and NAPs in total, across 300 pages). A colleague suggested just putting the NAP on each of the 8 location homepages, though I suspect it would help more if the NAP were at the foot of every page (so long as the correct NAP is on the correct page, ha!). Is that the right thing to do?
-
Hey Luke!
NAP consistency was judged to be the second most influential pack ranking factor on this year's Local Search Ranking Factors (https://moz.com/local-search-ranking-factors) so, yes, it's of major importance! Hope this helps.
-
When it comes to NAP, it should be as close to an exact match as you're able to achieve. Inconsistency in this area - while not the biggest detriment you can have - should be avoided.
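Checking for an exact match across dozens of listings by hand is tedious, so one way to audit for the harmless-but-inconsistent variations Luke describes is to normalise each address before comparing. A rough sketch, where the normalisation rules (lowercasing, sorting the elements, treating "London" as optional) are assumptions made for this example:

```python
import re

def normalize_nap(address):
    """Reduce an address string to a canonical form so that listings
    differing only in element order, punctuation, or an optional
    'London' are flagged as matching. The rules here are illustrative."""
    tokens = re.split(r"[,\s]+", address.lower())
    tokens = [t.strip(".") for t in tokens if t]
    tokens = [t for t in tokens if t != "london"]  # optional element
    return " ".join(sorted(tokens))

# Hypothetical variants of one listing, like those in the question.
listings = [
    "1 Example Street, Camden, London, NW1 0AA",
    "1 Example Street, Camden NW1 0AA",
    "NW1 0AA, Camden, 1 Example Street, London",
]
# All three normalise to the same canonical string, so they describe
# the same location despite the surface inconsistency.
canonical = {normalize_nap(a) for a in listings}
print(len(canonical))  # 1 under these assumptions
```

Any listing whose normalised form doesn't match the rest is the one worth correcting toward the exact-match version.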