Best way to set up a site with multiple brick and mortar locations across Canada
-
I have a client who is expanding his business locations from 2 cities to 3, and working towards having 10+ locations across Canada. Right now we're building location-based landing pages for each city, as well as keyword-targeted landing pages for each city. For example, landing pages for "Vancouver whatever clinic" and "Calgary whatever clinic", as well as for "Vancouver specific service" and "Calgary specific service". This means a lot of landing pages will need to be created to target each of 10 or so desirable "service" keywords for each city's location. I've no issue with this, however I was wondering how other companies go about this? What's the best way to be relevant for certain "service" based keyword searches in each city?
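To make the scale of that plan concrete, here is a minimal sketch (in TypeScript purely for illustration, with made-up city and service names since the actual lists aren't given here) of how the city × service landing-page matrix could be driven from a single data source, so it stays manageable as the client grows toward 10+ locations:

```typescript
// Hypothetical data; swap in the client's real cities and services.
const cities = ["vancouver", "calgary", "toronto"];
const services = ["teeth-whitening", "dental-implants", "invisalign"]; // ~10 in practice

interface LandingPage {
  city: string;
  service?: string; // undefined means the city "clinic" page itself
}

// One "city clinic" page per city, plus one page per city/service pair.
const pages: LandingPage[] = cities.flatMap((city) => [
  { city },
  ...services.map((service) => ({ city, service })),
]);

// 3 cities x (1 city page + 3 service pages) = 12 pages here;
// 10 cities x (1 + 10 services) = 110 pages once fully built out.
console.log(pages.length);
```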
-
Many of the "service" keywords are 'localized' meaning they will show Google Places results for local brick and mortar businesses for each location. I'm quite good at optimizing locally for this type of thing. However, many of the "service" keywords are not yet 'localized' by Google, I'd want to have my client webpages show well in the SERP's. for these 'non-localized' "service keywords" as well.
-
The new site will be built in WordPress.
-
If they are opening 3-10 physical locations, then it is probably worth the extra funds to have a developer build an architecture to suit the site requirements rather than WordPress. It would then be simple to use 'state' from the landing page to shape the information presented on deeper pages to reflect the local version. So, if a user follows through to the services page from the landing page, they will see content from the main services page along with any local content required.
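As a rough sketch of that "state from the landing page" idea (an Express-style example in TypeScript for illustration only; the route shape and content fields are assumptions, and the same pattern could equally be done in WordPress/PHP with rewrite rules): the location becomes part of the URL, and the deeper service page merges the shared service copy with any city-specific content.

```typescript
import express from "express";

const app = express();

// Hypothetical content store; in practice this would come from a CMS or database.
const mainServices: Record<string, string> = {
  "teeth-whitening": "General teeth-whitening copy shared by every location.",
};
const localExtras: Record<string, Record<string, string>> = {
  vancouver: { "teeth-whitening": "Vancouver pricing, staff, and directions." },
};

// e.g. /vancouver/services/teeth-whitening
app.get("/:city/services/:service", (req, res) => {
  const { city, service } = req.params;
  const shared = mainServices[service];
  if (!shared) {
    res.status(404).send("Unknown service");
    return;
  }
  // Deeper page = main services content + any local content required.
  const local = localExtras[city]?.[service] ?? "";
  res.send(`<h1>${service} in ${city}</h1><p>${shared}</p><p>${local}</p>`);
});

app.listen(3000);
```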
It is also important to look at page speed in a generic, templated design to ensure it can meet content-delivery demands across the country as the site's traffic increases with each new branch that opens.
-
"I've no issue with this, however I was wondering how other companies go about this?"
"What's the best way to be relevant for certain 'service' based keyword searches in each city?"
These are the two questions you asked, which I thought I answered.
Are you asking what the suggested URL structure is?
site.com/web-design-san-diego.html
Although this will work, I think I would prefer (and have noticed to work well) a different structure, and would optimize keywords accordingly.
If that still is not your question, then please re-type it, as I am missing it. Sorry.
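On the URL structure point: the preferred alternative isn't spelled out above, so treat the following as an assumption rather than the answerer's actual recommendation. The two shapes most often weighed for multi-city service pages are a flat keyword slug and a city subfolder, sketched here with hypothetical slugs:

```typescript
// Two common URL shapes for a city + service page (illustrative only).
function flatSlug(service: string, city: string): string {
  // e.g. "/web-design-san-diego.html"
  return `/${service}-${city}.html`;
}

function citySubfolder(service: string, city: string): string {
  // e.g. "/san-diego/web-design/" -- groups everything for one city under one path,
  // so adding the next location never touches existing URLs.
  return `/${city}/${service}/`;
}

console.log(flatSlug("web-design", "san-diego"));      // /web-design-san-diego.html
console.log(citySubfolder("web-design", "san-diego")); // /san-diego/web-design/
```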
-
I don't think you understand what I'm asking; however, I do appreciate your suggestions. I know how to optimize. What I'm really looking for are site architecture suggestions that will help with the issue(s) I noted in my original question.
-
Well, some are easier than others. Website design is easy, as the design links point back to the local page, so you build PA for that keyword. Private tutoring too, as you would possibly have inbound links from local schools or personal social networks. I don't know anyone promoting ditch digging.
The parent site needs to rank for "website design", while the local page ranks for "website design in [local place]". The overall weight of the parent site will help the local site rank well.
Or maybe I do not understand what is meant by "However, many of the 'service' keywords are not yet 'localized' by Google".
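One way to picture that parent/local relationship (a hypothetical hub-and-spoke sketch; the page paths are made up): the parent service page links out to every city variant, and each city variant links back to the parent, so the parent page's weight can flow down to the local pages.

```typescript
// Hypothetical hub-and-spoke internal linking for one service (illustration only).
const service = "website-design";
const cities = ["vancouver", "calgary", "toronto"];

// The parent (hub) service page links out to every local variant...
const parentPageLinks = cities.map((city) => `/${city}/${service}/`);

// ...and each local page links back to the hub and to its own city landing page.
const localPageLinks = (city: string): string[] => [`/${service}/`, `/${city}/`];

console.log(parentPageLinks);           // ["/vancouver/website-design/", ...]
console.log(localPageLinks("calgary")); // ["/website-design/", "/calgary/"]
```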
-
"services" broadly speaking, keywords like "website creation", "ditch digging", "private tutoring"...
-
Please give examples of these keywords.
-
Thanks Richard,
however... this is the tricky part:
"I'm quite good at optimizing locally for this type of thing. However, many of the 'service' keywords are not yet 'localized' by Google, and I'd want my client's webpages to show well in the SERPs for these 'non-localized' 'service' keywords as well."
-
A landing page for each location is great! You also need to point inbound links to those pages, and try to get those links from local sources. I believe you can have multiple Google Places pages, one for each physical location, as well.
You need to treat each landing page as its own site regarding link building, social networking, blogging, etc.
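Building on the Google Places point, one extra step worth considering (not mentioned above, so treat it as an added suggestion, with entirely fictional location details) is putting schema.org LocalBusiness structured data on each location's landing page, generated from the same per-location data that drives the page:

```typescript
// Hypothetical location record; real data would come from the client.
interface ClinicLocation {
  name: string;
  street: string;
  city: string;
  province: string;
  postalCode: string;
  phone: string;
  url: string;
}

// Build schema.org LocalBusiness JSON-LD for one location's landing page.
function localBusinessJsonLd(loc: ClinicLocation): string {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    name: loc.name,
    url: loc.url,
    telephone: loc.phone,
    address: {
      "@type": "PostalAddress",
      streetAddress: loc.street,
      addressLocality: loc.city,
      addressRegion: loc.province,
      postalCode: loc.postalCode,
      addressCountry: "CA",
    },
  });
}

// Example (fictional details):
console.log(
  localBusinessJsonLd({
    name: "Whatever Clinic Vancouver",
    street: "123 Example St",
    city: "Vancouver",
    province: "BC",
    postalCode: "V5K 0A1",
    phone: "+1-604-555-0100",
    url: "https://example.com/vancouver/",
  })
);
```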