Local SEO best practices for multiple locations
-
When dealing with local search for a business with multiple locations, I've always created an individual page for each location.
Aside from including the address and business name, I also make sure the title tag and other important markup feature the state/city/suburb, or, in hyper-local, hyper-competitive markets, information even more specific than that. It's worked very well so far.
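To illustrate (the business and suburb here are made up), the head of one of those location pages ends up looking something like this:

    <head>
      <!-- suburb/city in the title tag, as described above -->
      <title>Fitzroy Dry Cleaning | Acme Cleaners, Melbourne VIC</title>
      <meta name="description"
            content="Address, phone and opening hours for Acme Cleaners' Fitzroy store in Melbourne.">
    </head>
    <body>
      <h1>Acme Cleaners - Fitzroy, Melbourne</h1>
      <!-- unique content about this location: address, map, staff, directions -->
    </body>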
But the one thing you can always count on with Local is that the game keeps changing.
So I'd like to hear what you think... How do you deal with multiple locations these days?
Has Google (and others, of course) advanced far enough to not mess things up if you put multiple locations on the same page? (Do I hear snickers? Be nice now)
How does Schema.org fit into your tactics in this area, if at all?
Cheers
(Edit: dear SEOmoz, stop eating my line breaks)
-
Greetings, Bede,
So sorry about the line breaks. We've been having a little problem with that, and I hope my response to you isn't one big paragraph.
The process you have been using for your Local clients remains a best practice, in my experience. Not only does this separate the information out as cleanly as possible for bots, but it also signals to human users reaching such a landing page that they are looking at information about their own geographic location. You are so right that things keep changing in Local, but this is one area in which creating unique, rich content landing pages is still a smart choice.
As for Schema.org versus, for instance, hCard: the benefit of Schema is that Google, Yahoo & Bing have all agreed on it as a standard markup. However, definitely read this 2011 discussion at Mike Blumenthal's blog, in which Mike opines that rushing to switch from hCard to Schema may not be your #1 priority:
Through 2011, I continued to use hCard on my Local clients' sites without issue, but it is certainly smart to acquaint yourself with the features of Schema that apply to Local. The tool linked to from that post is a great place to begin.
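If it helps, here is a minimal sketch of what Schema.org LocalBusiness markup looks like on a location landing page (the business details are placeholders, so adapt them to your client):

    <div itemscope itemtype="http://schema.org/LocalBusiness">
      <span itemprop="name">Acme Cleaners - Fitzroy</span>
      <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
        <span itemprop="streetAddress">123 Example St</span>,
        <span itemprop="addressLocality">Fitzroy</span>,
        <span itemprop="addressRegion">VIC</span>
        <span itemprop="postalCode">3065</span>
      </div>
      <span itemprop="telephone">(03) 5550 0199</span>
    </div>

The same name/address/phone details are what you would wrap in hCard classes (vcard, fn, adr, tel) if you stay with microformats for now.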
Hope my feedback is useful!
Miriam
Related Questions
-
JavaScript redirects: what are the SEO pitfalls?
Q: Is it dangerous (SEO fallout) to use JavaScript redirects? Our tech team built a browser-side tool for me to easily redirect old/broken links. It's essentially a glorified 404 page: it pops a quick message that the requested page no longer exists and that we're automatically sending you to a page that has the content you are looking for. The tech team does not have the bandwidth to handle this via Apache, and this tool is what they came up with so I can deliver a better customer experience. Back story: it's a very large site, and I'm dealing with thousands of pages that could/should/need to be redirected. My issue is incredibly similar to what Rand mentioned way back in a post from 2009: Are 404 Pages Always Bad for SEO? We've also decided to let these pages 404 and monitor for anything that needs an Apache redirect. The tool mentioned above was tech's idea to give me "the power" to manage redirects. What do you think?
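For context, a browser-side redirect of this kind boils down to something like the sketch below (the URL is a placeholder). Note that, unlike an Apache 301, it sends no redirect status code for crawlers to act on:

    <script>
      // Show the "page no longer exists" message, then move the visitor on.
      // Crawlers see whatever HTTP status the page itself returned (200/404),
      // not a 301, so little or no link equity is passed.
      setTimeout(function () {
        window.location.replace("http://www.example.com/closest-matching-page");
      }, 3000); // short delay so the message can be read
    </script>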
-
Best Google Practice for Hacked Site: Shift Servers/IP or Disavow?
Hi - Over the past few months, I've identified multiple sites which are linking into my site and creating fake pages (below is an example, and there are 500K+ similar links from various sites). I've attempted to contact the hosting companies, etc., with little success. I was wondering which might be my best course of action at this point: A) switch servers (or IP address), B) use the Google disavow tool, or C) both. Example: http://aryafar.com/crossings/200-krsn-team-part19.html Thanks!!
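For reference, if the disavow route is chosen, Google's tool takes a plain-text file with one URL or domain per line, and # for comments; a minimal sketch (the second domain is a placeholder):

    # Hosts serving fake pages; removal requests went unanswered
    domain:aryafar.com
    domain:spam-example.com
    # Individual URLs can also be listed one per line
    http://spam-example.com/fake-page.html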
-
Multiple domain SEO strategy
Hi Mozzers, I'm an AM at a web dev agency. We're building a new site for a client who sells paint to different markets: paint for boats, paint for the construction industry, paint for... well, you get the idea! Would we be better off setting up separate domains (boatpaintxxx.com, housepaintxxx.com, etc.) and treating each as a separate microsite for standalone SEO activity, or having them as individual pages/subdomains of a single domain, paints4all.com or something? From what I've read today, including the excellent Beginners Guide, I'm guessing there's no definitive answer! Feedback appreciated! Thanks.
-
Best way to get SEO-friendly URLs on a huge old website
Hi folks, hope someone may be able to help with this conundrum: a client site runs on old tech (IIS 6) and has circa 300,000 pages indexed in Google. Most pages are dynamic, with a horrible URL structure such as http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888, and I have been trying to implement rewrites + redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/ I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl, and implementing all URLs would be 10x the volume of config. I am starting to wonder if there is a better way:
1. Upgrade to Win 2008 / IIS 7 and use the better URL rewrite functionality included.
2. Rebuild the site entirely (preferably on PHP with a decent URL structure).
3. Accept that the URLs can't be made friendly on a site this size and focus on other aspects.
4. Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live.
None of the options are great, as they either involve lots of work/cost or they mean keeping a site which performs well but could do so much better, with poor URLs. Any thoughts from the great minds in the SEOmoz community appreciated! Cheers, Simon
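To give a concrete picture of option 1, the IIS 7 URL Rewrite module keeps its rules in web.config; a 301 for the example URL above might look roughly like this sketch (the clean target URL pattern is invented for illustration):

    <system.webServer>
      <rewrite>
        <rules>
          <rule name="LegacySearchRedirect" stopProcessing="true">
            <match url="^search/results\.aspx$" />
            <conditions>
              <!-- capture the three legacy IDs from the query string -->
              <add input="{QUERY_STRING}" pattern="ida=(\d+)&amp;idb=(\d+)&amp;idc=(\d+)" />
            </conditions>
            <!-- 301 to a clean equivalent, dropping the old query string -->
            <action type="Redirect" url="products/{C:1}/{C:2}/{C:3}"
                    appendQueryString="false" redirectType="Permanent" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>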
-
How To SEO Mobile Pages?
Hello, I have finally put my first foot on the path of trying to learn and understand mobile SEO, and I have a few questions about how it works, so please help me out. I use WordPress for my site, and there is a nifty plugin called WPtouch: http://wordpress.org/extend/plugins/wptouch/ What it basically does is convert your desktop version into a mobile-friendly version. I wanted to know: if it does that, does it mean that whatever SEO I do for my regular website gets accomplished for my mobile version as well? Another simple question: if I search for the same term on my mobile phone versus on my desktop, how different will the SERPs be? Thanks, Moz peeps
-
Mobile site: robots.txt best practices
If there are canonical tags pointing to the web version of each mobile page, what should a robots.txt file for a mobile site have?
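For a concrete starting point, one common setup is to block nothing on the mobile host, since crawlers have to be able to fetch each mobile page to see its canonical tag; a minimal sketch:

    # Block nothing: crawlers must fetch each mobile page
    # to discover the rel=canonical pointing at the web version.
    User-agent: *
    Disallow: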
-
SEO LINKS
New to SEO, so excuse my naivety. I have made lots of new links, some of them paid for (e.g. Best of the Web), but I don't see any change in the latest competitive link analysis. Some of the links we have been accepted for just do not show. Also, the keywords we are trying to promote the most have disappeared off the radar for over 2 weeks now. I think we have followed the optimization suggestions correctly. Please could you enlighten me? Regards, Paul www.curtainpolesemporium.co.uk
-
Is Adobe Acrobat the best choice for making PDF documents, in terms of SEO and price?
As we add PDF documents to our website, I want to take it up a notch. In terms of SEO and software price, is Adobe Acrobat the only choice? Thanks! No Mac here. I should clarify that I can convert files to PDFs with Microsoft Word and add some basic info for the search engines, such as title, keywords, author, and links. This article inspired me: www.seomoz.org/ugc/how-to-optimize-pdf-documents-for-search I can add links back to the page when I create the PDF, but we also have specific product PDFs that suppliers let us copy and serve from our server (why use their bandwidth?). Much as you would stamp your name on a hard-copy brochure the vendor supplies, I want to add a link to our page from those PDFs, which makes me think I should ask our supplier to give me a version with a link to our page. Then there is the question: is that OK to do? In the meantime, I will check TriviaChicken's suggestions and dream about a Mac, Allan. Thanks