Best way to set up a site with multiple brick-and-mortar locations across Canada
-
I have a client who is expanding his business from 2 cities to 3, and working towards having 10+ locations across Canada. Right now we're building location-based landing pages for each city, as well as keyword-targeted landing pages for each city. For example, landing pages for "Vancouver whatever clinic" and "Calgary whatever clinic", as well as for "Vancouver specific service" and "Calgary specific service". This means a lot of landing pages will need to be created to target each of the 10 or so desirable "service" keywords for each city's location. I have no issue with this; however, I was wondering how other companies go about it. What's the best way to be relevant for certain "service"-based keyword searches in each city?
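To make the scale concrete, here's a minimal sketch of the page matrix this approach implies (the city and service names are made-up placeholders, not the client's actual keywords):

```python
# Sketch of the city x service landing-page matrix described above.
# Each city gets one location page plus one page per service keyword.
from itertools import product

cities = ["vancouver", "calgary", "toronto"]
services = ["physiotherapy", "massage-therapy"]  # ~10 in practice

def landing_page_slugs(cities, services):
    """Return one slug per city (location landing page) plus one per
    city/service combination (keyword-targeted landing page)."""
    slugs = [f"/{city}/" for city in cities]
    slugs += [f"/{city}/{service}/" for city, service in product(cities, services)]
    return slugs

pages = landing_page_slugs(cities, services)
# 3 location pages + (3 cities x 2 services) = 9 slugs in this toy example;
# at 10+ cities and ~10 services it grows past 100 pages quickly.
```

With 10+ locations and ~10 service keywords, the combination count is what makes the architecture question worth settling up front.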
-
Many of the "service" keywords are 'localized', meaning they will show Google Places results for local brick-and-mortar businesses in each location. I'm quite good at optimizing locally for this type of thing. However, many of the "service" keywords are not yet 'localized' by Google, and I'd want my client's pages to rank well in the SERPs for these 'non-localized' "service" keywords as well.
-
The new site will be built in WordPress.
-
-
If they are opening 3-10 physical locations, then it is probably worth the extra funds to have a developer build an architecture suited to the site's requirements, rather than stock WordPress. It would then be simple to carry 'state' from the landing page and use it to sculpt the information presented on deeper pages into the local version. So, if a user follows through to the services page from a landing page, they will see the content from the main services page along with any local content required.
It is also important to look at page speed in generic templated designs, to ensure the site can meet content delivery demands across a country as its traffic increases with each new branch opening.
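The "carry state from the landing page" idea could be sketched roughly like this (all content keys and strings here are hypothetical, just to show the fallback-plus-override pattern):

```python
# Minimal sketch: deeper pages serve the parent site's content,
# layered with any local overrides for the visitor's location.

MAIN_CONTENT = {
    "services": "Generic description of our services.",
}

# Local overrides keyed by (city, page); only defined where needed.
LOCAL_CONTENT = {
    ("calgary", "services"): "Calgary-specific hours, staff, and directions.",
}

def render_page(page, city=None):
    """Combine the main site's content with local content, if any."""
    sections = [MAIN_CONTENT[page]]
    local = LOCAL_CONTENT.get((city, page))
    if local:
        sections.append(local)
    return "\n\n".join(sections)

# A visitor who arrived via the Calgary landing page sees the main
# services copy plus the Calgary-specific block; everyone else sees
# just the main copy.
```

The point is that local pages inherit the parent content by default, so you only author the genuinely local pieces per city.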
-
I've no issue with this, however I was wondering how other companies go about this?
What's the best way to be relevant for certain "service" based keyword searches in each city?
These are the two questions you asked, which I thought I answered.
Are you asking what is the suggested URL structure?
site.com/web-design-san-diego.html
Although this will work, I think I would prefer, and have noticed to work well, a directory structure such as:
site.com/san-diego/web-design/
and then optimize keywords accordingly.
If that still is not your question, then please re-type it, as I am missing it. Sorry.
-
I don't think you understand what I'm asking; however, I do appreciate your suggestions. I know how to optimize. What I'm really looking for are site architecture suggestions that will help with the issue(s) I noted in my original question.
-
Well, some are easier than others. Website design is easy, as the design links point back to the local page, so you build PA (Page Authority) for that keyword. Private tutoring also, as you could have inbound links from local schools or personal social networks. I don't know anyone promoting ditch digging.
The parent site needs to rank for "website design", while the local page ranks for "website design in [local place]". The overall weight of the parent site will help the local pages rank well.
Or maybe I do not understand what is meant by "many of the 'service' keywords are not yet 'localized' by Google".
-
"services" broadly speaking, keywords like "website creation", "ditch digging", "private tutoring"...
-
Please give examples of these keywords.
-
Thanks, Richard.
However... this is the tricky part:
"I'm quite good at optimizing locally for this type of thing. However, many of the 'service' keywords are not yet 'localized' by Google, and I'd want my client's pages to rank well in the SERPs for these 'non-localized' service keywords as well."
-
A landing page for each location is great! You also need to point inbound links at those pages, and try to get those links from local sources. I believe you can also have multiple Google Places pages, one for each physical location.
You need to treat each landing page as its own site regarding link building, social networking, blogging, etc.