Best way to target multiple geographic locations
-
Hello Mozzers!
If you are a service provider wanting to target geographic locations outside of the region where you're physically located, what's the best approach?
For example, I have a service provider whose main market is not where they're located: they're based in Devon, UK, yet their main markets are London, Birmingham, Newcastle, and Edinburgh. They have clients in all these cities, so I could definitely provide content relevant to each city - perhaps a page for each city detailing work and services (and possibly listing clients).
However, does the lack of a physical presence (and local phone number) in these cities make such city pages virtually impossible to rank these days? Does Google require a physical presence/phone number?
Thanks in advance, Luke
-
It sure was right on target too, Miriam - a superb piece. Can't think of any further questions!
-
So glad it was timely, Luke!
-
Thanks Miriam - that's fantastic timing
-
Hi Luke,
We've just published a Moz blog post on precisely this topic:
http://moz.com/blog/local-landing-pages-guide
Give it a read-through, and if any of your questions haven't been answered by it, please let me know what they are. Hope you'll find it right on target!
Related Questions
-
What should be the SEO strategy for a very big target?
Currently I am doing SEO for an Arabic website, and I need to optimize it for the GCC region. The target is very big: 1 million unique organic visitors per month. The domain is new, meaning it has no domain authority right now. What would be the best strategy in this scenario?
Intermediate & Advanced SEO | | sohail10 -
Best practice to prevent pages from being indexed?
Generally speaking, is it better to use robots.txt or a noindex meta tag to prevent duplicate pages from being indexed?
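One mechanical detail worth noting here: robots.txt blocks crawling, not indexing, so a disallowed URL can still end up indexed via external links, and a noindex directive on a blocked page will never be seen by the crawler. For duplicates, the commonly suggested pattern is a meta robots tag or a canonical link; a minimal sketch, using a hypothetical preferred URL:

```html
<!-- In the <head> of a duplicate page you want kept out of the index
     (the page must remain crawlable for this tag to be seen): -->
<meta name="robots" content="noindex, follow">

<!-- Alternatively, consolidate near-duplicates by pointing at the
     preferred version instead of de-indexing: -->
<link rel="canonical" href="http://www.example.com/preferred-page">
```

Use one approach or the other on a given page, not both, since they send conflicting signals.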
Intermediate & Advanced SEO | | TheaterMania0 -
Interlinking sites in multiple languages
I am working on a project where the client has a main .com site and the following additional sites, which are all interlinked:
.com site targeting US
.com site targeting China
.HK site targeting Hong Kong
All sites contain similar information (although the Chinese site is translated). They are not identical copies, but being shopping sites they contain a lot of similar product information. Webmeup software (now defunct) showed that the inbound links to the main site from the additional domains are considered risky; Linkrisk shows them as neutral. The client wants the sites interlinked and would not want to remove the additional domains, as they get a good amount of traffic. In addition, the messages and products for each country domain have been tailored to a degree to suit that audience. We can rewrite the content on the other domains, but obviously this is a big job. Can anyone advise whether this is causing a problem SEO-wise and, if so, whether the best way to resolve it is to rewrite the content on the US and Hong Kong sites? Alternatively, would it be better to integrate the whole lot together? (They will soon be rebuilding the main site, so it would be an appropriate time to do this.)
Intermediate & Advanced SEO | | rachelmanning888
-
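Not an answer to the link-risk question itself, but country-targeted near-duplicates like the ones described above are usually annotated with hreflang so Google serves the right version to each market. A minimal sketch with hypothetical URLs (the same block would appear on all three sites, each listing every alternate plus itself):

```html
<link rel="alternate" hreflang="en-us" href="http://www.example.com/">
<link rel="alternate" hreflang="zh-cn" href="http://www.example-cn.com/">
<link rel="alternate" hreflang="zh-hk" href="http://www.example.hk/">
<link rel="alternate" hreflang="x-default" href="http://www.example.com/">
```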
One Website, Multiple Locations, One Blog?
There's definitely not going to be a "right" answer to this question, but I think it can lead to a great discussion. We are building a website for a client who has two locations, and we are going to use a URL structure similar to this:
www.Brand.com (a landing page where users select a location)
www.Brand.com/Atlanta
www.Brand.com/Boston
However, we still want to focus on local SEO, so our deeper URL structure will be:
www.Brand.com/Atlanta/Auto-Accident-Lawyer
www.Brand.com/Atlanta/Motorcycle-Accident-Lawyer
www.Brand.com/Boston/Auto-Accident-Lawyer
www.Brand.com/Boston/Motorcycle-Accident-Lawyer
The content on those pages will be unique and target local keywords. Each "version" of the website will have a navigation specific to that location. For example, once a user clicks into the Boston website, all of the navigation items will pertain to Boston.
However, we run into an issue with the blog. Both locations will be using the same blog content, which ends up looking something like this:
www.Brand.com/Atlanta/Blog/Blog-Article
www.Brand.com/Boston/Blog/Blog-Article
This obviously creates duplicate content. We could do something such as this:
www.Brand.com/Blog/Blog-Article
However, as noted above, each local version of the website has a separate navigation (this keeps a user in Boston on the Boston version of the website). So a centralized blog is far from ideal unless navigation for both locations is included, which would allow users to return to their local website.
From my understanding, duplicate content doesn't necessarily "hurt" your SERPs; it simply keeps one of the duplicated pages from ranking. So the question comes down to this: is duplicate content a big enough issue to restructure the website around a centralized blog?
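One option often suggested for exactly this setup: keep both local blog URLs for navigation, but mark a single version as canonical so only it is indexed. A minimal sketch, assuming the Brand.com URLs above and an arbitrary choice of preferred copy:

```html
<!-- In the <head> of BOTH
     www.Brand.com/Atlanta/Blog/Blog-Article and
     www.Brand.com/Boston/Blog/Blog-Article,
     pointing at whichever copy (or a central /Blog/ URL) is preferred: -->
<link rel="canonical" href="http://www.Brand.com/Atlanta/Blog/Blog-Article">
```

This preserves the location-specific navigation for users while consolidating ranking signals on one URL.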
Intermediate & Advanced SEO | | McFaddenGavender0 -
Best Tool For Finding Related Keywords?
What is the best tool for finding keywords related to the primary keyword we are targeting? Cheers
Intermediate & Advanced SEO | | webguru20140 -
Optimizing Product Catalogs for Multiple Brick & Mortar Locations
We're working on a project for a retail client who has multiple (5+) brick-and-mortar store locations in a given geographical area. They're regional, so they have locations in multiple states. We're optimizing their content (coupons, events, products, etc.) across their site, but we're running into the issue of ranking well for specific products in one location, but not as well (or not at all) in others. The keywords we would like to rank for generally aren't super competitive; we're dealing with commodity products in local retail markets, so in most cases good on-page optimization is enough to rank in the top couple of results.
Our current situation (specific examples are fictitious but representative):
Title: My Company | Dogwood Trees - Fredericksburg, VA, Rocky Mt, NC, Rock Hill, SC…
Url: http://mycompany.com/catalog/product/dogwood-trees
The content on the page is generally well optimized. We've claimed all the locations in Google Places, and we've deployed schema.org markup on the product page for each location that carries the item. We have specific location pages that rank well for the company name or company name + location, but the actual goal is to have the product page come up in each location. In the example above, we would rank #1 for "Dogwood Trees Fredericksburg VA" (or just "Dogwood Trees" if the searcher is in or around Fredericksburg), on the first page for "Dogwood Trees Rocky Mt, NC", but not at all for any other locations. As these aren't heavily linked-to pages, this indicates the title tag + on-page content is probably our primary ranking factor, so as Google discounts keyword relevance toward the tail of the title tag, the location keywords stop helping us.
What is the proper way to do this? A proposed solution we're discussing is subfoldering all the locations for location-specific content. For example:
My Company | Dogwood Trees - Fredericksburg, VA, Rocky Mt, NC, Rock Hill, SC…
http://mycompany.com/catalog/product/dogwood-trees
becomes:
My Company | Dogwood Trees - Fredericksburg, VA
http://mycompany.com/fredericksburg-va/product/dogwood-trees
My Company | Dogwood Trees - Rocky Mt, NC
http://mycompany.com/rocky-mt-nc/product/dogwood-trees
My Company | Dogwood Trees - Rock Hill, SC
http://mycompany.com/rock-hill-sc/product/dogwood-trees
Of course, this is the definition of duplicate content, which concerns me. Is there a "Google approved" way to actually do this? It's the same exact tree being sold by the same company in multiple locations. Google is essentially allowing us to rank well for whichever location we put first in the title tag, but not the others. Logically, it makes complete sense that a consumer in Rock Hill, SC should have the same opportunity to find the product as one in Fredericksburg, VA. In these markets, the client is probably one of maybe three possible merchants for this product within 20 miles. As I said, it's not highly competitive; they just need to show up. Any thoughts or best practices on this would be much appreciated!
Intermediate & Advanced SEO | | cballinger
-
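For reference, the per-location schema.org markup mentioned in the question above is typically one LocalBusiness block per store. A minimal JSON-LD sketch with made-up business details (microdata works equally well):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "My Company - Fredericksburg",
  "telephone": "+1-540-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Fredericksburg",
    "addressRegion": "VA",
    "addressCountry": "US"
  }
}
</script>
```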
Google Places: Multiple company listings. How to rank the HQ page over a branch location.
Hi Moz experts! I have a client with Google Places listings for multiple branch locations, and for some reason the fully SEO-optimized Head Office listing is being beaten by an unoptimized branch listing. The HQ listing gets a tonne of traffic, whereas the unoptimized branch listing doesn't, yet the branch is the main listing when searching through Google. Any help would be greatly appreciated. Thanks
Intermediate & Advanced SEO | | Jon_bangonline1 -
What is the best way to hide duplicate, image embedded links from search engines?
Hello! Hoping to get the community's advice on a technical SEO challenge we are currently facing. [My apologies in advance for the long-ish post. I tried my best to condense the issue, but it is complicated and I wanted to make sure I also provided enough detail.]
Context: I manage a human anatomy educational website that helps students learn about the various parts of the human body. We have been around for a while now, and recently launched a completely new version of our site using 3D CAD images. While we tried our best to design the new site with SEO best practices in mind, our daily visitors dropped by ~15% soon after we flipped the switch, despite drastic improvements in our user interaction metrics. SEOMoz's Website Crawler helped us uncover that we may now have too many links on our pages and that this could be at least part of the reason behind the lower traffic, i.e. we are not making optimal use of links and are potentially 'leaking' link juice.
Since students learn about human anatomy in different ways, most of our anatomy pages contain two sets of links:
1. Clickable links embedded via JavaScript in our images. This allows users to explore parts of the body by clicking on whatever object interests them. For example, if you are viewing a page on muscles of the arm and hand and want to zoom in on the biceps, you can click on the biceps and go to our detailed biceps page.
2. Anatomy terms lists (to the left of the image) that list all the different parts of the body shown in the image. This is for users who might not know where on the arm the biceps actually is. Such a user can simply click on the term "Biceps" and get to our biceps page that way.
Since many sections of the body have hundreds of smaller parts, many of our pages have 150 links or more each. And to make matters worse, in most cases the links in the images and in the terms lists go to the exact same page.
My Question: Is there any way we could hide one set of links (preferably the anchor-text-less image-based links) from search engines, such that only one set of links would be visible? I have read conflicting accounts of different methods, from using JavaScript to embedding links in HTML5 tags. And we definitely do not want to do anything that could be considered black hat. Thanks in advance for your thoughts! Eric
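One frequently discussed (though not officially endorsed) approach to keeping the terms list as the single crawlable set of links is to render the image hotspots as plain elements with a JavaScript click handler rather than as <a href> anchors. A hedged sketch with hypothetical class and attribute names; note that Googlebot does execute JavaScript, so this reduces duplicate anchors rather than guaranteeing the second set is invisible:

```html
<!-- Crawlable link, in the terms list: -->
<a href="/biceps">Biceps</a>

<!-- Image hotspot: no <a href>, so it is not parsed as a second link;
     navigation is handled by the script below. -->
<span class="hotspot" data-target="/biceps">biceps region</span>

<script>
  // Attach click navigation to every hotspot element.
  document.querySelectorAll('.hotspot').forEach(function (el) {
    el.addEventListener('click', function () {
      window.location.href = el.getAttribute('data-target');
    });
  });
</script>
```

Since both link sets point to the same URLs anyway, another angle is simply to accept the duplication: repeated links to the same destination on one page are generally consolidated rather than penalized.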
Intermediate & Advanced SEO | | Eric_R0