Attracting custom from 3 cities - Is this the best way to optimize?
-
Hi, I'm working for a client that draws custom from 3 nearby cities. I was thinking of creating a new page for each of 2 of the cities, reachable from within the website and not simply doorway pages.
Each new page would include: (1) general info; (2) info relevant to the city in question, where relevant to the client - perhaps well-known customers already coming from that city; (3) transport from the city - directions.
Is it OK to do this, or could Google see it as manipulative, given that the business is not geographically located in all 3 cities? (In actual fact the business has just one location, within the official borders of one city; it is in another city for some administrative services, and 40 miles away from the third.)
Thanks in advance, Luke
-
Hi Luke, This is a common practice for service area businesses (like plumbers, electricians, carpet cleaners, etc.) that are located in one city but serve clients within a larger radius beyond their location city's limits. It sounds like what you are describing is a bit different: a client to whom customers come from a variety of cities. I do not believe Google would have any problem with what you are doing, provided that you follow through on your plan to make the content for these city pages unique. A nice thing to do on these pages would be to add some testimonials from customers who come to the business from these other locations.

Now, whether these pages will greatly impact your client's ability to rank well for the service+city keywords is up in the air. It really depends on the competitiveness of the industry and locale. If the client is in a situation of modest competition, these new pages could achieve some new visibility and drive some new, qualified traffic, but if the client is in a dog-eat-dog vertical, the new pages may not help much. It's really one of those 'it depends' situations.

Bottom line, though: in my experience, Google does not view such content as manipulative in intent if the content has a real reason for existing.
-
It's only OK to do this if the client has an address in each city. If each site does not have a crawlable NAP (name, address, phone number), you risk getting hit with duplicate content issues.
Also, if you want to be future-proof, make sure the copy is as unique as possible, along with unique meta tags.
One method that works really well for my clients is using subdomains for the different city pages. Example:
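A minimal sketch of the pattern, with invented subdomains, cities, and copy (nothing below comes from an actual client); it just shows each city getting its own subdomain URL with its own unique title and meta description:

```python
# Hypothetical sketch: one subdomain per city, each with unique meta tags.
# All names below (cities, domain, service) are invented for illustration.

CITIES = {
    "city-a": {"name": "City A", "miles_away": 0},
    "city-b": {"name": "City B", "miles_away": 12},
    "city-c": {"name": "City C", "miles_away": 40},
}

BASE_DOMAIN = "example.com"   # placeholder domain
SERVICE = "Carpet Cleaning"   # placeholder service


def page_meta(slug: str) -> dict:
    """Build a unique URL, title, and meta description for one city page."""
    city = CITIES[slug]
    return {
        "url": f"https://{slug}.{BASE_DOMAIN}/",
        "title": f"{SERVICE} for {city['name']} Customers | Example Co.",
        "description": (
            f"Directions, transport options and {SERVICE.lower()} details "
            f"for customers travelling from {city['name']} "
            f"({city['miles_away']} miles away)."
        ),
    }


if __name__ == "__main__":
    for slug in CITIES:
        meta = page_meta(slug)
        print(meta["url"], "->", meta["title"])
```

The point is simply that every city page ends up with its own URL, title, and description, which is the "unique meta tags" requirement mentioned above.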
Related Questions
-
We have a site with a lot of international traffic - can we split the site some way?
Hello, We have a series of sites and one, in particular, has around 75,000 (20%) monthly users from the USA, but we don't currently offer them anything, as our site is aimed at the UK market. The site is a .com, and though we own the .co.uk, the .com is the primary domain. We have had a lot of success moving other sites to have the .co.uk as the primary domain for UK traffic. However, in this case, we want to keep both the UK traffic and the US traffic, and if we split it into two sites, only one can win, right? What could we do? It would be cool to have a US version of our site, but without affecting traffic too much. On the other sites, we simply did 301 redirects from the .com page to the corresponding .co.uk page. Any ideas?
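The "301 redirects from the .com page to the corresponding .co.uk page" approach mentioned above boils down to a one-to-one URL mapping. A minimal sketch with placeholder domains (the actual redirect would normally be configured in the web server, not in application code):

```python
# Sketch of the per-page 301 mapping described above: each .com URL
# points to the same path and query on the .co.uk domain. The domains
# are placeholders for illustration.

from urllib.parse import urlsplit, urlunsplit

UK_HOST = "www.example.co.uk"  # placeholder .co.uk domain


def corresponding_uk_url(com_url: str) -> str:
    """Return the .co.uk counterpart of a .com URL (same path and query)."""
    parts = urlsplit(com_url)
    return urlunsplit(
        (parts.scheme, UK_HOST, parts.path, parts.query, parts.fragment)
    )


print(corresponding_uk_url("https://www.example.com/pricing?ref=home"))
# -> https://www.example.co.uk/pricing?ref=home
```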
White Hat / Black Hat SEO | AllAboutGroup
-
Correct way to block search bots momentarily... HTTP 503?
Hi, What is the best way to block Googlebot etc. momentarily? For example, if I am implementing a programming update to our Magento ecommerce platform and am unsure of the results and the potential layout/file changes that may impact SEO (Googlebot continuously spiders our site), how can you block the bots for 30 minutes or so? Thanks
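For what it's worth, the standard signal for a planned, temporary outage is exactly what the title suggests: answer every request with HTTP 503 plus a Retry-After header, then remove it when the update is done. A minimal sketch using Python's built-in wsgiref, purely illustrative; a Magento site would normally do this at the Apache/nginx level or via its maintenance flag:

```python
# Minimal sketch: a WSGI app that answers everything with HTTP 503 and
# a Retry-After header, telling crawlers the outage is temporary.
# Illustrative only; real sites would configure this in the web server.

from wsgiref.simple_server import make_server

RETRY_AFTER_SECONDS = "1800"  # ~30 minutes, matching the question


def maintenance_app(environ, start_response):
    start_response(
        "503 Service Unavailable",
        [
            ("Content-Type", "text/plain; charset=utf-8"),
            ("Retry-After", RETRY_AFTER_SECONDS),
        ],
    )
    return [b"Down for maintenance, please check back shortly.\n"]


if __name__ == "__main__":
    with make_server("", 8000, maintenance_app) as server:
        server.serve_forever()
```

Because the status is 503 rather than 404 or 200, Googlebot treats the outage as temporary and retries later instead of dropping or re-indexing the pages.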
White Hat / Black Hat SEO | bjs2010
-
Is Best of the Web a good directory to pay to be listed on?
We are currently paying to have a listing in the directory Best of the Web. Should I be paying to renew our listing in this directory?
White Hat / Black Hat SEO | djlittman
-
In (or In-between) 2 cities - and mentioning both cities in title tags
Hi, just wondering what your thoughts are on this one - several businesses I work for are located in in-between places. For example, one is in one city for its address, but in another city's council (/state) area. Another is in a rural area, almost exactly equidistant between 2 cities (about 10 miles either way). Both businesses mention both cities on several pages of their websites, including in title tags (including homepage title tags), and it seems to be working OK in terms of rankings (i.e. they're ranking well for keyphrases for both cities). Is it acceptable practice to mention both cities in a single title tag, though? That's my question. (Some of this confusion dates back to UK local authority boundary/name changes in 2009.)
White Hat / Black Hat SEO | McTaggart
-
SEO for location outside major city
Hello, I'm hoping to get some opinions on optimising a site for a client based 30 minutes outside of Dublin. Obviously there is a higher search volume for "x in Dublin" than "x in small town". What do you think the best strategies are for incorporating "Dublin" into keywords? For example, is it OK to use phrases like "x near Dublin" or "x in Greater Dublin", or do you think this is a bit misleading? The client in question sells goods online, so the customer wouldn't physically have to visit the store. Thanks!
White Hat / Black Hat SEO | gcdtechnologies
-
EMD with 3.3 million broad match searches got hit hard by Panda/Penguin
OK, so I run an ecommerce website with a kick-ass domain name: one keyword (plural), with 3.3 million broad match searches (local monthly), 3.2 million phrase match, and 100k exact match. At the beginning of March I got a warning in GWT about unnatural links. I feel pretty certain it's the result of an ex-employee using an ALN listing service to drip spun article links onto splogs. This was also done for another site of mine, which received the same warning but bounced back much sooner (from #3 for an EMD with 100k broad, 60k phrase, and 12k exact searches on a singular keyword phrase). I did file a reinclusion request for the 2nd (smaller) domain: I received the unnatural links warning on 4/13 and sent the reconsideration on 5/1 (the tone of the letter being "I have no clue what is up; I paid someone $50 and now I'm banned"). As of this morning, I am not ranking for any of my terms (it had bounced back to spot #30 for the main keyword after being pushed down from #4). Now back to the interesting site: this other domain was bouncing between #8 and #12 for its main keyword (EMD) before we used ALN. Once we got the warning, we did nothing. Once rankings started to fall, we filed a reinclusion request... rankings fell more, and we filed another, more robustly written request (we got denials within 1 week after each request), until about 20 days ago, when we fell off the face of the earth. 1 - Should I take this as some sort of sandbox? We are still indexed, and are #1 for a search on our domain name. Also still #1 in Bing (big deal). 2 - I've done a detailed analysis of every link they provide in GWT and reached out to whatever splog people I could get in touch with, asking them to remove the articles. I was going to file another request if I didn't reappear 31 days after I fell off completely. Am I wasting my time? There is no doubt that sabotage could be committed by competitors blasting a site with spam links (previously I believed these would just be ignored by Google, to prevent sabotage from becoming part of the job for most SEOs). Laugh at me, gasp in horror with me, or offer some advice... I'm open to chat and would love someone to tell me about a legit solution to this problem if they've got one. Thanks!
White Hat / Black Hat SEO | SwissNinja
-
Best Link Building Practices to Avoid Over Optimizing
With all the new over-optimization talk, one of the things mentioned is having the same anchor text linking to a page over and over without variation. Is there a good estimate of how many external linking keywords should be exact match versus how many should be variations? Also, keeping the value of the linking pages in mind, would it be best to use the [exact] phrase for the higher-PR sites or the more relevant, higher-traffic sites, and save the long-tail or keyword-variation text for the lesser-valued sites? When to use the exact phrase and when to use the long tail is my question/discussion. I always stay relevant in my link building, and all my links are linking within context, because I know that relevancy has been an important factor. After watching this video from Matt Cutts, http://youtu.be/KyCYyoGusqs, I assume relevancy is becoming even more of an important factor.
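As a quick way to see where a page currently stands, here is a throwaway sketch that tallies what share of a backlink profile's anchor text is exact match versus variations. The sample anchors and target keyword are invented; real data would come from a backlink export such as GWT's:

```python
# Throwaway sketch: tally exact-match anchors versus variations for one
# target phrase. Sample data is invented for illustration.

from collections import Counter

EXACT_PHRASE = "blue widgets"  # hypothetical target keyword

anchors = [
    "blue widgets",
    "blue widgets",
    "buy blue widgets online",
    "widgets in blue",
    "blue widgets",
    "this widget shop",
]

counts = Counter(anchors)
exact = counts[EXACT_PHRASE]
total = sum(counts.values())

print(f"exact match: {exact}/{total} ({exact / total:.0%})")
for anchor, n in counts.most_common():
    print(f"{n:3d}  {anchor}")
```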
White Hat / Black Hat SEO | SEODinosaur
-
From page 3 to page 75 on Google. Is my site really so bad?
So, a couple of weeks ago I started my first CPA website, just as an experiment and to see how well I could do out of it. My rankings were getting better every day, and I've been constantly producing unique content for the site to improve my rankings even more. 2 days ago my rankings went straight to the last page of Google for the keyword "acne scar treatment", but Google has not banned me or given my domain a minus penalty. I'm still ranking number 1 for my domain, and they have not dropped the PR, as my keyword is still in the main index. I'm not even sure what has happened. Am I not allowed to have a CPA website in the search results? The best information I could find on this is: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=76465 But I've been adding new pages with unique content. My site is www.acne-scar-treatment.co. Any advice would be appreciated.
White Hat / Black Hat SEO | tommythecat