What strategies can you use when you're optimizing for 10 locations x 20+ services?
-
We have a client site (a dentist) that has 10 locations and 20+ services (braces, teeth whitening, cosmetic dentistry, etc.). We're trying to figure out the ideal approach to cover all of their locations and services, but each option we consider has drawbacks:
- Optimize service pages for service name + each location name (or at least the biggest locations), with the service name and location names in the title tag. That results in an overly long title tag, plus possible user confusion: someone searching for "braces richmond" sees a title that lists other cities, some of which are in a different state.
- Optimize service pages for service name + each location name, but don't include the locations in the page title. This is the current approach, but leaving the location name out of the page title appears to be hurting rankings at least a bit.
- Create a page for each service + location combo. That means 200+ pages, which will sit deeper in the site with less link juice.
- Create new domains for each location/state covered. But then we have to start over building link juice.
How have other sites dealt with this? What has worked best and what hasn't worked?
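To illustrate the title-tag length problem in the first option, here's a quick sketch. The city names, brand, and ~60-character limit are just placeholders and rules of thumb, not the client's real data:

```python
# Rough sketch: "Richmond" etc. and "Smith Dental" are stand-in names, and
# ~60 characters is a commonly cited safe display width in SERPs, not a
# hard rule from Google.
SERP_TITLE_LIMIT = 60

def build_title(service, cities, brand="Smith Dental"):
    """Build a title tag listing one service plus every city name."""
    return f"{service} in {', '.join(cities)} | {brand}"

cities = ["Richmond", "Norfolk", "Chesapeake", "Virginia Beach", "Hampton"]
title = build_title("Braces", cities)

# With even five cities, the title blows well past the display limit.
print(len(title), title)
print("Too long for SERPs!" if len(title) > SERP_TITLE_LIMIT else "OK")
```

With a single city the title fits comfortably; listing every location is what breaks it.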
-
Hi Adam,
My short and sweet answer to this scenario is:
A page for every city and a page for every service
So, you'd have a total of 30 pages to budget and plan for (one for each of the 10 cities and one for each of the 20 services).
Most small local businesses are not going to have the funding to develop 200 exceptional pages. What I've seen when small businesses try to develop a page for every possible service/city combo is that they end up with a collection of so-so pages at best and, at worst, thin or duplicate pages.
So, for a client like a dental practice, I believe that a sterling-quality page for every city and for every service is an achievable goal when structured over a reasonable contract time frame.
I definitely do not recommend developing a different website for each city. Build a powerhouse and keep working on improving it for the life of the business. Hope this helps!
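If it helps to see the page inventory laid out, here's a rough sketch. The city names, service names, and URL slugs are placeholders, not a recommended scheme:

```python
# Hedged sketch of the "page per city + page per service" plan. Names and
# slug patterns are illustrative only; swap in the client's real list.
cities = ["richmond", "norfolk", "chesapeake"]        # 10 in practice
services = ["braces", "teeth-whitening", "veneers"]   # 20+ in practice

city_pages = [f"/dentist-{city}" for city in cities]
service_pages = [f"/services/{service}" for service in services]

all_pages = city_pages + service_pages

# Pages grow additively (10 + 20 = 30) rather than multiplicatively
# (10 x 20 = 200) as they would with a page per combo.
print(len(all_pages), "pages instead of", len(cities) * len(services))
```

The point of the sketch is the arithmetic: the combined approach scales as cities + services, not cities x services.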
-
Since nobody has responded, I'll share what we are currently doing with only two locations and multiple services. It's option 3 on your list. The caveat here is that we're still implementing it, so the final results are not in. Here is what we're doing:
- Make sure you have a Google+ business page for each physical location so that Google knows you're "local" and you can pop up in their local snippet (hopefully!).
- On the contact us page or locations page (whichever you have), list each location with the physical/mailing address, phone number, and a link that says "Directions" pointing to the city-office page (or however you want to name it; atlanta-office, for example).
- On the city-office page, we have a nice write-up about the city and the office. We also include a Google map of the location, the full address, phone numbers, email, and the Google+ profile link for that specific location. Now here is the magic: below that, we have a list headed "Local [City] Services" that links each service to a page optimized for that city and service. For your client, the heading might be "Local Atlanta Dental Services," for example. You want each service listed to have the appropriate keywords/phrases in the anchor text.
- Create each service page per location and optimize it like a pro. WARNING: this method runs the risk of duplicate content once you have multiple cities with similar pages. It is therefore imperative that each page contains unique content. The "Atlanta Teeth Whitening" page, although identical in purpose to the "L.A. Teeth Whitening" page, must have content unique to its city. This is where the opportunity presents itself to create 10x content for each city (https://moz.com/blog/why-good-unique-content-needs-to-die-whiteboard-friday).
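To make the "Local [City] Services" list concrete, here's a small sketch that generates it with keyword-rich anchor text. The URL pattern, heading wording, and service names are assumptions for illustration, not a required scheme:

```python
# Hedged sketch: builds the per-city service link list described above.
# The /city/service URL pattern and the heading text are assumptions.
def service_link_list(city, services):
    """Return an HTML heading plus a keyworded link list for one city."""
    slug_city = city.lower().replace(" ", "-").replace(".", "")
    items = []
    for service in services:
        slug = service.lower().replace(" ", "-")
        href = f"/{slug_city}/{slug}"
        anchor = f"{city} {service}"  # keyword-rich anchor text
        items.append(f'<li><a href="{href}">{anchor}</a></li>')
    return (f"<h2>Local {city} Dental Services</h2>\n<ul>\n"
            + "\n".join(items) + "\n</ul>")

print(service_link_list("Atlanta", ["Braces", "Teeth Whitening"]))
```

Each anchor carries the city + service phrase, which is the internal-linking signal this step is after.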
I suggest you start with one major city at a time, measure results, make any necessary adjustments, and move on to the next city. The key is that the content is unique for each service in each city. Sure, they can follow the same format, but make sure you put in the time to make each service page somewhat unique to its city. It may seem like a bit of a gray line we're walking, but, in my opinion, it's logical for expansion. Again, the big risk is duplicate content, but that can be avoided if done correctly.
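One cheap way to sanity-check yourself as you roll out each city is to compare page copy programmatically. This sketch uses `difflib` from the Python standard library; the 0.8 threshold is an arbitrary value to tune, not any official duplicate-content cutoff:

```python
# Hedged sketch: flags near-duplicate city pages by comparing body text.
# The 0.8 threshold is an assumption to tune against your own pages.
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0.0-1.0 similarity ratio between two strings."""
    return SequenceMatcher(None, text_a, text_b).ratio()

atlanta = ("Our Atlanta office offers professional teeth whitening "
           "near Piedmont Park, with weekend appointments available.")
la = ("Our L.A. office offers professional teeth whitening "
      "near Echo Park, with weekend appointments available.")

score = similarity(atlanta, la)
print(f"similarity: {score:.2f}")
if score > 0.8:
    print("Too similar - rewrite one page with city-specific content.")
```

Swapping only the city name and one landmark leaves the pages nearly identical, which is exactly the find-and-replace trap to avoid.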
Hopefully this helps! I would love to see others chime in on this and give feedback as I'm sure we're not the only ones in the world with this problem.
Cheers!