Attracting custom from 3 cities - Is this the best way to optimize?
-
Hi, I'm working for a client that draws custom from three nearby cities. I'm thinking of creating a new page for two of those cities, reachable from within the website rather than standing alone as doorway pages.
Each new page would include: (1) general info; (2) info relevant to the city in question, where relevant to the client (perhaps well-known customers already coming from that city); and (3) transport and directions from the city.
Is it OK to do this, or could Google see it as manipulative, given that the business is not physically located in all three cities? (In fact the business has just one location: it sits within the official borders of one city, falls under another city for some administrative services, and is 40 miles away from the third.)
Thanks in advance, Luke
-
Hi Luke, This is a common practice for service-area businesses (plumbers, electricians, carpet cleaners, etc.) that are located in one city but serve clients within a larger radius beyond their city limits. It sounds like what you are describing is a bit different: a client to whom customers come from a variety of cities.
I do not believe Google would have any problem with what you are doing, provided you follow through on your plan and make the content of these city pages unique. A nice thing to do on these pages would be to add testimonials from customers who come to the business from those other locations.
Now, whether these pages will greatly improve your client's ability to rank for service + city keywords is up in the air. It really depends on the competitiveness of the industry and locale. If the client faces modest competition, the new pages could achieve some new visibility and drive new, qualified traffic; if the client is in a dog-eat-dog vertical, they may not help much. It's really one of those 'it depends' situations.
Bottom line, though: in my experience, Google does not view such content as manipulative in intent when the content has a real reason for existing.
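If it helps, here is a minimal sketch of how a business can also declare the cities it serves in structured data, using schema.org LocalBusiness markup with areaServed. The business name and city names below are placeholders; markup like this supplements, rather than replaces, the unique on-page content discussed above.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business Ltd",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Hometown",
    "addressCountry": "GB"
  },
  "areaServed": [
    { "@type": "City", "name": "Hometown" },
    { "@type": "City", "name": "Second City" },
    { "@type": "City", "name": "Third City" }
  ]
}
</script>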
-
It's only OK to do this if the client has an address in each city. If each page does not have a crawlable NAP (name, address, phone number), you risk getting hit with duplicate content.
Also, if you want to be future-proof, make sure the copy is as unique as possible, along with unique meta tags.
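As a hedged illustration of what unique meta tags per city page might look like (the business, service, and city names are all placeholders):

<!-- London city page (placeholder names throughout) -->
<title>Widget Repairs for London Customers | Example Business</title>
<meta name="description" content="Directions from London, parking, and the services our London customers use most.">

<!-- Manchester city page: same template, genuinely different copy -->
<title>Widget Repairs for Manchester Customers | Example Business</title>
<meta name="description" content="How to reach us from Manchester, plus testimonials from our Manchester customers.">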
One method that works really well for my clients is using a separate subdomain for each city's pages.
Example:
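As a hypothetical illustration (placeholder domains), the structure being described might look like this:

london.example-business.co.uk
manchester.example-business.co.uk

Each subdomain would then carry its own unique copy and meta tags.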
Related Questions
-
What would be the best course of action to nullify the negative effects of our website's content being duplicated (negative SEO)?
Hello, everyone. About three months ago I joined a company that manufactures transportation and packaging items. Once I started digging into the website, I noticed that a lot of its content had been "plagiarized". I use quotes because it really hadn't been: the site seems to have been hit with a negative SEO campaign last year in which its content was taken and posted across at least 15 different websites. Literally every page on the website had the same problem, and some of the content was even company-specific (going as far as using the company's very unique name). In all my years of working in SEO and marketing I have never seen anything on this scale. Sure, there are always spammy links here and there, but this seems very deliberate. In fact, some of the duplicate content was posted on legitimate websites that may have been hacked or compromised (some examples include charity websites). I am wondering if there is anything I can do besides contacting the webmasters of these websites and nicely asking them to remove the content? Or does this duplicate content not hold as much weight as it used to, especially since our content was posted years before the duplicates started popping up? Thanks,
White Hat / Black Hat SEO | Hasanovic
-
What are effective ways of finding people to link to my blog post?
So I've spent ages creating amazing content; there's loads of interest in it from my social media, and people visiting my site are reading deep into it. But so far I have not been able to get anyone to link to it. What am I doing wrong?
White Hat / Black Hat SEO | Johnny_AppleSeed
-
Killed by Penguin 3.0
So with the Penguin 3.0 update last week, we have noticed that some clients were hit significantly. How do we rectify the situation for the poor links pointing at the site? We have used Open Site Explorer and Google Webmaster Tools to try to identify the bad links to remove. We can spot that some inbound links are from directories that may be perceived as low-value or spam, but we cannot be sure which ones are affecting the ranking. The vast majority of these links are historical, from before we recently inherited this client, so we do not have any logins to remove the links (if logins even exist); they appear to have been placed by outsourced teams in India. We suspect that no site owner would spend the time removing links from their site anyway. So how do we recover from the Penguin hit? Is it just a case of identifying the links we suspect could be perceived as spam and submitting them to Google's disavow tool? Do we contact all the sites to ask for removal, and/or do we just push ahead with more engaging, white-hat social SEO methods? Are we likely to recover in the short term, or are we permanently hit? The site is for a small business with no more than 800 monthly hits, so this fall from very good front-page positions is going to hit our client very hard, even if the sins are from a previous business. Any thoughts and suggestions would be very welcome. Please help!
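For reference, a minimal sketch of the disavow file format mentioned above, following Google's documented syntax; the domains and URLs are placeholders:

# Lines starting with # are comments.
# Disavow one specific URL:
http://spammy-directory.example/our-listing
# Disavow every link from a domain:
domain:low-quality-directory.example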
White Hat / Black Hat SEO | smartcow
-
What's the right way to gain the benefits of an EMD but avoid cramming the title?
Hi guys, Say I'm (completely hypothetically) building weddingvenuesnewyork.com, and right now I'm organizing the tags for each page. What's the best layout so that I can optimize for "wedding venues new york" as much as possible without it becoming spammy? Right now I'm looking at something like "Wedding Venues New York: Wedding Receptions and Ceremony Venues" for the title, to get other strong keywords in there too. Is there a better layout or structure? And is having the first words of the homepage title match the domain name going to strengthen the ranking for that term, or will it look spammy to Google and be a bad move? This is a new site being built.
White Hat / Black Hat SEO | xcyte
-
Abused SEO unintentionally, now need a way out
Hello, I have been working with an SMO specialist to optimize my site for search engines and social media sites. My site had been doing great for the last four years, but suddenly it started dropping in the rankings, so I joined SEOmoz Pro to find a way out. I was advised to categorize content in the form of subdomains; well, that took a huge toll on my rankings. Thanks to suggestions here, I have 301-redirected them to subdirectories. Now another huge question arises: I found out that my SMO guy was buying artificial votes (or whatever you call them) on Twitter, Facebook, and Google+. The Twitter and Facebook ones are understandable, but I am starting to think the votes on Google+ might have affected my site's ranking. Here is a sample URL: http://www.designzzz.com/cutest-puppy-pictures-pet-photography-tips/ If you scroll down you will see 56 Google +1s. Now the big question: I have been creating genuine content, but now that I am stuck in this situation, how do I get out of it? Changing URLs will be bad for readers. Will a 301 fix it, or is there another method? Thanks in advance
White Hat / Black Hat SEO | wickedsunny1
-
How best to do location-specific pages for ecommerce post-Panda update?
Hi, We have an eCommerce site and currently have a problem with duplicate content. We created location-specific landing pages for our product categories, which initially did very well until the recent Google Panda update caused a big drop in rankings and traffic. For example: http://xxx.co.uk/rent/lawn-mower/London/100 http://.xxx.co.uk/rent/lawn-mower/Manchester/100 Much of the content on these location pages is the same or very similar, apart from a different H1 tag and title tag and, in some cases, slight variations in the on-page content; but given that these items can be hired from 200 locations, it would take years to write unique content for every location in each category. We did this originally in April. We can't compete nationally, but we found it easier to compete locally, hence the creation of the location pages, and they did do well for us until now. My question is this: since the last Google Panda update our traffic has dropped 40%, our rankings have gone through the floor, and we are stuck with this mess. Should we get rid of (301) all of the location-specific pages for each category, or keep, say, the 10 most popular locations and either noindex/nofollow the others or 301 them? What would people recommend? The only examples I can see on the internet of sites handling multiple locations use a store-finder type approach, but you can't rank for the individual product/category that way. If anyone has any advice, or good examples of sites that employ a good location-specific URL method, please let me know. Thanks Sarah
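For reference, a hedged sketch in markup of the two on-page options being weighed (placeholder URLs; these are standard directives, not a recommendation for this particular site):

<!-- Option A: keep a thin location page for users but drop it from the index -->
<meta name="robots" content="noindex,follow">

<!-- Option B (an alternative, not to be combined with A): point near-duplicate
     variants at a single preferred version of the page -->
<link rel="canonical" href="http://xxx.co.uk/rent/lawn-mower/London/100">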
White Hat / Black Hat SEO | SarahCollins
-
How do you optimize a page with Syndicated Content?
Content is syndicated legally (licensed). My questions are: What is the best way to approach this situation? Is there any chance of competing with the original site/page for the same keywords? Is it okay to do so? Will there be any negative SEO impact on my site?
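For reference, a minimal sketch of the cross-domain canonical tag often suggested for licensed syndicated content (placeholder URL). Note that it concedes ranking credit to the original page, which bears directly on the second question above.

<!-- Placed on the syndicated copy, pointing at the original article: -->
<link rel="canonical" href="https://original-publisher.example/original-article">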
White Hat / Black Hat SEO | StickyRiceSEO
-
What on-page/site optimization techniques can I utilize to improve this site (http://www.paradisus.com/)?
I used a search engine spider simulator to analyze the homepage, and I think my client is using black-hat tactics such as cloaking. Am I right? Any recommendations on how to improve the top navigation under the Resorts pull-down? Each of the six resorts listed is part of the Paradisus brand, but each resort has its own subdomain.
White Hat / Black Hat SEO | Melia