Best way to structure a site for multiple regions
-
I have a client who offers 2 primary services in 4 regions.
He does mold removal and water damage repair.
He serves Cincinnati, Dayton, Columbus, and Indianapolis.
Before hiring my company, he had around 30 keyword-based domains and tons of fake Google Places listings. He actually got a lot of traffic that way.
However, I will not tolerate that kind of stuff and want to do things the right way.
First of all, what is the best site approach for this? He wants a site for each service and for each city:
indy mold
cincy mold
dayton mold
dayton water
etc etc etc
In the end he will have 8 sites and wants to expand into other services and regions.
I feel like this is not the right way to handle it, especially since he also has another, more generic site.
To me, the best way to do this is a generic domain with a locations page and a page for each city.
Then, for Places, he would get one account with a hidden address (since he goes to customer locations) and multiple city-defined service regions.
He does have an office-like address in each city. So should I make him a Places listing for each city, or just the one? And of course, how should the actual sites be organized?
Thanks
-
I do wish you luck, Web Feat! Show the client the Google Places Quality Guidelines. Then, it's not just coming from you - it's coming from Google. Also, heads up: Google just changed the language in the guidelines again. I wrote about this today, in case you are interested; it might help you in talking with the client:
http://www.solaswebdesign.net/wordpress/?p=1205
Have a good rest of your weekend!
-
Thanks so much. That is all pretty much what I was thinking. I now get to try to convince the business owners to do things the right way. Wish me luck!
-
Hi WebFeat,
You've taken on 2 tough things here - a spammed record and a business owner who may not want to do things correctly. Hopefully, you will be able to help him see the light on this. Let me copy some of your remarks and respond to them individually, please, so that I'm sure I'm covering what you want to know.
Before hiring my company, he had around 30 keyword-based domains and tons of fake Google Places listings. He actually got a lot of traffic that way.
Bad, bad, bad. Yes...you can game the system, but this is the type of account that eventually gets banned, and getting back into Google's good graces after that can be something even the best Local SEO on earth will be unable to accomplish. The client is in an emergency situation right now. If you can get the record cleaned up before punishment occurs, you're saving his neck.
However, I will not tolerate that kind of stuff and want to do things the right way.
Thumbs up for you and 'boo' to the previous company who taught the client to engage in these practices.
In the end he will have 8 sites and wants to expand into other services and regions.
Some business owners do take this approach of having a separate site for each of their cities or services. The main argument for such a practice is that a) the exact match domain name can give a ranking boost and b) links coming into the domain will have matching primary keywords because of the URL. One can choose to do this, but I don't consider it a best practice for several reasons.
The first is that it hints at a single entity being multiple entities, which is not really true. The second is that it makes management (SEO, marketing, webmastering) incredibly complicated. The third is that I believe it is better to build the authority of a single domain with tons of great content and links than to spread it thin, like a scrape of butter over a whole loaf of bread. Of these three statements, the last is really just my opinion as a Local SEO. It's not proven fact or anything like that, but like you, I just think it's not a best practice. I would advise my own client that a single legal business should equal a single, really authoritative website they can develop.
To me, the best way to do this is a generic domain with a locations page and a page for each city.
I'm not quite sure what you mean by a generic domain, but at any rate, I agree with you about a single domain with a landing page for each city and each service. Then, I would build beyond this on the website, perhaps with blogging, covering every service he offers in each city, one at a time. This is somewhat akin to the process your client thinks is best - having a different domain for each city and each service in each city - but instead of doing this on a ton of different domains, it's all under one roof, under the name of his business, on a single, powerhouse website.
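To make that concrete, a single-domain structure might look something like this (the domain and paths here are hypothetical, just for illustration):
hisbusiness.com/mold-removal/
hisbusiness.com/water-damage-repair/
hisbusiness.com/locations/cincinnati-mold-removal/
hisbusiness.com/locations/cincinnati-water-damage-repair/
hisbusiness.com/locations/dayton-mold-removal/
...and so on for each city/service combination, with a blog supporting it all.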
Then, for Places, he would get one account with a hidden address (since he goes to customer locations) and multiple city-defined service regions.
He does have an office-like address in each city. So should I make him a Places listing for each city, or just the one? And of course, how should the actual sites be organized?
If the client has a REAL location in each of his main cities (and you should be sure of this, because what he's told you are locations could turn out to be just virtual offices), then yes, he is allowed to have a Place Page for each of those cities. If he has only one legit location, then he should have only one Place Page.
It's good that you're hip to the recent 'hide address' guideline changes. These are still early days. My interpretation of the new guideline, at this point, is this:
Type A: Your business is brick-and-mortar and serves all customers at its location. Show your address.
Type B: Your business is home-based and serves some customers at your home and some on the road. Show your address and use the Service Radius tool.
Type C: Your business is home-based and does not serve any customers at your home. Hide your address.
*See http://www.seomoz.org/blog/why-you-may-need-to-hide-your-google-places-address-asap
You'll need to figure out which one of those fits your client's business model. I would also recommend that you read this post on the subject by Mike Blumenthal, which points out some of the vagueness in the guidelines that has yet to be adequately resolved by Google:
Google has really done a poor job with clarity on the new guideline. Hopefully, you can figure out the right move for your client on this and he will abide by your advice.
So, the bottom line on this is that my professional preference would be for a single domain and whatever number of Place Pages matches the legitimate offices of the client. I would focus on building out content rather than building out domains.
That being said...if the client has built an empire of domains that are getting him business, it may be necessary to maintain those, but whatever he decides on that, he should be informed that by spamming Google Places he is risking his total visibility in Google. Additionally, if he ends up with just 1-2 Place Pages and 20 domains, it's going to be complicated deciding which URL the legit Place Page or Pages can point to. Again, another reason to put everything in one basket. If he wants to keep the domains, you can go with that, though it's not ideal - but insist on him cleaning up his act in Google Places, or tell him you can't work with him, as he's heading for a train wreck sooner or later.
Hope this helps!
Miriam