What strategies can you use when you're optimizing for 10 locations x 20+ services?
-
We have a client site (a dentist) that has 10 locations and 20+ services (braces, teeth whitening, cosmetic dentistry, etc.). We're trying to figure out the best approach to cover all of their locations and services, but each option we consider has drawbacks:
- Optimize service pages for service name + each location name (or at least the biggest location names), with the service name and all the location names in the title tag. That results in an overly long title tag, plus possible user confusion, since someone searching for "braces richmond" sees a title tag that lists other cities, some of which are in a different state (see the title-length sketch after this list).
- Optimize service pages for service name + each location name, but don't include the locations in the page title. This is the approach we're using now, but not having the location name in the page title appears to be hurting rankings at least a bit.
- Create a page for each service + location combo. That would be 200+ pages, which means those pages sit deeper in the site and receive less link juice.
- Create new domains for each location/state covered. But then we have to start over building link juice.
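(For a rough sense of the numbers behind these options, here is a quick sketch; the city/service names, the "Example Dental" brand, and the ~60-character display limit are illustrative assumptions, not the client's actual data.)

```python
# Rough numbers behind the options above (illustrative names, not the client's data).
cities = ["Richmond", "Norfolk", "Chesapeake"]  # imagine 10 of these
services = ["Braces", "Teeth Whitening", "Cosmetic Dentistry"]  # imagine 20+ of these

# Option 1: one service page with every city name crammed into the title tag.
title = f"{services[0]} in {', '.join(cities)} | Example Dental"
print(len(title), title)  # 56 characters with only 3 of the 10 cities; the full list
                          # would blow well past the ~60 Google typically displays

# Option 3: a page for every service + location combo.
num_cities, num_services = 10, 20
print(num_cities * num_services)  # 200 pages to write, maintain, and link internally
```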
How have other sites dealt with this? What has worked best and what hasn't worked?
-
Hi Adam,
My short and sweet answer to this scenario is:
A page for every city and a page for every service
So, you'd have a total of 30 pages to budget and plan for (one for each of the 10 cities and one for each of the 20 services).
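To make that concrete, here is a hypothetical flat URL scheme for those 30 pages (the domain and slugs below are placeholders, not a recommendation of specific URLs):

```python
# Hypothetical flat URL scheme for the 30 pages (10 city pages + 20 service pages).
# Domain and slugs are placeholders, not a recommendation of specific URLs.
cities = ["richmond", "norfolk", "virginia-beach"]  # ...10 in total
services = ["braces", "teeth-whitening", "cosmetic-dentistry"]  # ...20+ in total

city_pages = [f"https://example-dental.com/locations/{c}/" for c in cities]
service_pages = [f"https://example-dental.com/services/{s}/" for s in services]

# Every page sits one click below its hub, so none of the 30 is buried deep in the site.
for url in city_pages + service_pages:
    print(url)
```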
Most small local businesses are not going to have the funding to develop 200 exceptional pages. What I've seen when small businesses try to build a page for every possible service/city combo is that they end up with a collection of so-so pages at best and, at worst, thin or duplicate pages.
So, for a client like a dental practice, I believe a sterling-quality page for every city and for every service is an achievable goal if structured over a reasonable contract time frame.
I definitely do not recommend developing a different website for each city. Build a powerhouse and keep working on improving it for the life of the business. Hope this helps!
-
Since nobody has responded, I'll share what we are currently doing with only two locations and multiple services. It's the third option on your list (a page for each service + location combo). The caveat here is that we're still implementing this, so the final results are not in. Here is what we're doing:
- Make sure you have a Google+ business page for each physical location so that Google knows you're "local" and you can pop up in their local results snippet (hopefully!).
- On the contact us page or locations page (whichever you have), we list each location with its physical/mailing address and phone number, plus a "Directions" link that navigates to that location's "city-office" page (or however you want to name it... atlanta-office, for example).
- On the city-office page we have a nice write-up about the city and the office. We also include a Google map of the location, the full address, phone numbers, email, and the associated Google+ profile link for that specific location. Now here is the magic: below that we have a list under the heading "Local [city] Services" that links each service to a page optimized for that city and service (see the sketch after this list). For your client the heading might be "Local Atlanta Dental Services," for example. You want each service listed to have the appropriate keywords/phrases in the anchor text.
- Create each service page per location and optimize it like a pro. WARNING: this method runs the risk of duplicate content once you have multiple cities with similar pages. It is therefore imperative that each page contains unique content. The "Atlanta Teeth Whitening" page, although identical in nature to the "L.A. Teeth Whitening" page, must have content unique to its city. This is where the opportunity presents itself to create 10x content for each city (https://moz.com/blog/why-good-unique-content-needs-to-die-whiteboard-friday)
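Here is a minimal sketch of how that "Local [city] Services" block might be generated. The slugs, URL pattern, and anchor-text template are my assumptions for illustration, not our actual markup:

```python
# Minimal sketch of the "Local [city] Services" link list described above.
# Slugs, URL pattern, and anchor-text template are assumptions, not actual markup.
cities = {"atlanta": "Atlanta", "los-angeles": "L.A."}
services = {"teeth-whitening": "Teeth Whitening", "braces": "Braces"}

def local_services_block(city_slug: str) -> str:
    city = cities[city_slug]
    links = "\n".join(
        f'  <li><a href="/{city_slug}/{slug}/">{city} {name}</a></li>'
        for slug, name in services.items()
    )
    return f"<h2>Local {city} Dental Services</h2>\n<ul>\n{links}\n</ul>"

# Each link's anchor text carries the city + service phrase, e.g. "Atlanta Teeth Whitening".
print(local_services_block("atlanta"))
```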
I suggest you start with one major city at a time, measure results, make any necessary adjustments, and move on to the next city. The key here is that the content is unique for each service in each city. Sure, the pages can follow the same format, but make sure you put in the time to make each service page somewhat unique to its city. It may seem like a bit of a gray line that we're walking, but in my opinion it's a logical way to expand. Again, the big risk is duplicate content, but that can be avoided if done correctly.
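One quick way to sanity-check that risk is to diff the body copy of parallel city pages before publishing. A rough sketch, where the sample copy and the 0.8 threshold are placeholders rather than any published standard:

```python
# Quick near-duplicate check between parallel city pages before publishing.
# The sample copy and the 0.8 threshold are placeholders, not a published standard.
from difflib import SequenceMatcher

atlanta_page = "Our Atlanta office offers in-chair teeth whitening a short walk from Midtown..."
la_page = "Our Los Angeles office offers in-chair teeth whitening a short walk from Echo Park..."

similarity = SequenceMatcher(None, atlanta_page, la_page).ratio()
print(f"similarity: {similarity:.2f}")
if similarity > 0.8:
    print("Too close to the other city's page -- rewrite before publishing.")
```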
Hopefully this helps! I would love to see others chime in on this and give feedback as I'm sure we're not the only ones in the world with this problem.
Cheers!