Local Listing Question
-
We will be starting local SEO efforts on a medical practice that has 4 locations & 15 doctors at each location (so 60 doctor listings in total). I will submit each doctor & each location to InfoGroup, LocalEze, Acxiom & Factual. I will also submit each location (but not the individual doctors) to Google. The problem I'm seeing is that every listing would have the same phone number - all calls go to one main routing center. What kind of problems could come of this? Do we need separate phone numbers for each of the four locations (at the very least)?
-
Miriam, you are a rockstar! I just want to add my two cents.
Most citation sources match listings by phone number, so more than one listing sharing the same phone number will generally get rejected or merged by those sources. I strongly agree that each of your locations should have a separate number. Your NAP (name, address, phone number) consistency is also a ranking factor. (Here is Miriam's post about local SEO ranking factors: http://moz.com/blog/top-20-local-search-ranking-factors-an-illustrated-guide)
If you are building citations for each of your doctors, I recommend separate numbers for them too (though it's not required). I use a company called ifbyphone.com. They have a basic service plan at $49 per month, plus $2/month per phone number, plus minutes. You can have the numbers forward to one main line if you want, but it's a good way to get around the issue. That works out to about $200/month plus minutes, and you can use these numbers for other things like marketing too (e.g. AdWords, billboards, radio commercials, etc.).
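To put rough numbers on that (my assumption on the count: one tracking number per doctor plus one per location, so 64 numbers in total): 64 × $2 = $128 in per-number fees, plus the $49 base plan, comes to roughly $177/month before call minutes - which is in the same ballpark as the ~$200/month figure once usage is included.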
That being said, Google's Map Maker guidelines say:
- "Some doctors may share the same office address with other doctors. If the listings have different doctor names, they are not duplicates, even if they have the same phone number. The same goes for lawyers, insurance agents, etc." https://support.google.com/mapmaker/answer/1731387?hl=en
The reason I recommend different numbers is for third-party citation sources. If you can justify the $200 per month expense, I would highly recommend using separate numbers for each doctor; you'll be able to build stronger rankings that way. I always worry about Google changing its policy in this area, so I think separate numbers are the safer bet.
-
Hi JohnWeb12,
Sounds like an exciting project! So, here's the deal. The guidelines you are dealing with are these:
- _Individual practitioners may be listed individually as long as those practitioners are public-facing within their parent organization. Common examples of such practitioners are doctors, dentists, lawyers, and real estate agents. The practitioner should be directly contactable at the verified location during stated hours. A practitioner should not have multiple listings to cover all of his or her specializations._
- _Departments within businesses, universities, hospitals, and government buildings may be listed separately. These departments must be publicly distinct as entities or groups within their parent organization, and ideally will have separate phone numbers and/or customer entrances._ See: https://support.google.com/places/answer/107528?hl=en
-
The 4 separate locations MUST have unique phone numbers, because they are going to be associated with unique physical addresses.
-
With the practitioners, this is a subject surrounded by grey area. I highly recommend that you read the discussion on Mike Blumenthal's post, in which many interesting points surrounding this very topic are raised in the comments:
I recommend you read the entire thread to pick up on some of the nuances surrounding the doctors having an identical phone number and the concerns about merging. Mike's advice on this:
"I think having the same phone number is OK by the rules but unfortunately the algo may merge the listings. I have some that are the same that have stayed intact and others that have merged."
My feeling is that each of the 4 offices should have a unique phone number, and that the guidelines do not require each doctor to have his or her own number within the practice, but that there is some risk of merging if they don't. The guidelines are not 100% clear on this, as is the case with many points surrounding Google+ Local. I hope the post I've linked to will help you consider the ins and outs of this topic!
Related Questions
-
Technical 301 question
Howdy all, this has been bugging me for a while and I wanted to know the community's ideas on this. We have a .com website which has a little domain authority and is growing steadily. We are a UK business (but have a US office which we will be adapting to soon). We are ranking better within google.com than we do on google.co.uk, probably down to our TLD. Is it a wise idea to 301 our .com to .co.uk for en-gb enquiries only? Is there any evidence that this will help improve our position? Will all the link juice passed from the 301s go to our .co.uk only if we are still applying the use of .com in the US? Many thanks, and I hope this isn't too complicated! Best wishes, Chris
Intermediate & Advanced SEO | TVFurniture
-
Question about Syntax in Robots.txt
So if I want to block any URL from being indexed that contains a particular parameter, what is the best way to put this in the robots.txt file? Currently I have:
Disallow: /attachment_id
Where "attachment_id" is the parameter. Problem is I still see these URLs indexed, and this has been in robots.txt now for over a month. I am wondering if I should just do Disallow: attachment_id or Disallow: attachment_id= but figured I would ask you guys first. Thanks!
Intermediate & Advanced SEO | DRSearchEngOpt
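For reference, a minimal sketch of the wildcard syntax this question is asking about, assuming "attachment_id" appears as a query-string parameter (the exact pattern is an assumption, not taken from the asker's site):

User-agent: *
Disallow: /*attachment_id=

Worth noting as well: robots.txt only blocks crawling, not indexing, so URLs that were already indexed (or that are linked from elsewhere) can stay in the index; a noindex directive or parameter handling in Webmaster Tools is usually what gets discussed for actually removing them.
-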
To merge or not to merge? That is the question.
I am planning to do something I never did, and I am wondering if it's really a good idea or not. I have four websites, all for the same company, each one with a different domain and different content:
- one has been the main official site for 16 years: 200 uniques per month, indexed for 134 keywords, Domain Authority 17, 13 linking root domains
- one was used as the main site from 2003 to 2006, focused on a specific business they have since discontinued, still online, no update since 2006: 500 uniques per month, indexed for 92 keywords, Domain Authority 13, 8 linking root domains
- another was built in 2010 and maintained for less than a year, focused on a business they never really started, still online, no update since 2010: 3000 uniques per month, indexed for 557 keywords, Domain Authority 25, 84 linking root domains
- a fourth one was also built in 2010 and focused on a business never really started, still online, no update since 2010: 100 uniques per month, indexed for 4 keywords, Domain Authority 6, 3 linking root domains
Each website has traffic and links, all links being natural; they never tried to gain links in any way, they never did on-page optimization, they never even thought about SEO. The sites are not even interlinked. So, my idea is to merge all of them, putting websites 2, 3 and 4 as subfolders of the main site and replicating the old content there, because those sites have traffic - incredibly, one of the abandoned sites gets 3000 uniques per month, while the main site gets just 200! My doubts are:
- does it make sense to merge everything from an SEO perspective?
- apart from doing the 301s correctly, what else should I be careful to do or not do?
- website number 4 is really outdated, its content and structure are not easy to merge with the rest, and its traffic is really small - is it worth spending the time to merge it?
Finally, I also have a problem: the customer didn't want to merge them. They agreed to, but they don't want visitors of the main site to be able to navigate to the old ones, so once moved and redirected I would have to put them in the sitemap of the main site but avoid linking to them on the actual "main" site. As far as I know, the Google crawler doesn't like to find pages in sitemaps which are not reachable through a linking path on the website - is that correct? Is that going to make all the merging work useless? Should I convince the client to at least put small links in the footer or on a page linked from the footer?
Intermediate & Advanced SEO | max.favilli
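A minimal sketch of the kind of 301 rule the merging above would rely on, assuming the old domain stays pointed at an Apache server with mod_rewrite, and using oldsite.com / mainsite.com as placeholder names (each old URL is redirected to the matching path under a subfolder of the main site):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mainsite.com/oldsite/$1 [R=301,L]
-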
Redirection question
How would I redirect this URL: http://www.members.mysite.com/ to this URL: http://www.mysite.com/ ? I can't figure it out.
Intermediate & Advanced SEO | JohnPeters
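One possible sketch for this redirect, assuming an Apache server with mod_rewrite and that every path on the members subdomain should 301 to the same path on the main domain (mysite.com is the placeholder from the question):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?members\.mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
-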
Google Places Question: Two Businesses, Same Address
I am working with a client who runs a personal training business. He shares a fitness studio with another personal trainer to minimise costs. My issue is that the other guy has 'taken' the Google Places listing address as his business, rather than my client's. The gym itself is not a business; it is simply a workspace shared by two personal trainers - in the same way as a shared office space might be the address of several businesses. This presents a bit of a problem with Google Places verification. Is it best to:
- 'alter' the address slightly so it appears to be a separate premises (e.g. 51 Something Street --> 51A Something Street), then use that address in all my citations, or
- leave the address itself the same, but rely on the fact that there are separate domains, phone numbers and business names?
Any thoughts on this?
Intermediate & Advanced SEO | Pretty-Klicks
-
.htaccess question/opinion/advice needed
Hello, I am trying to achieve 3 different things in my .htaccess. I just want to make sure I am doing it the right or best way, because I don't have much experience working on this kind of file. I am trying to:
a) Redirect www.mysite.com/index.html to www.mysite.com so I don't get a duplicate content/tag error.
b) Redirect mysite.com to www.mysite.com.
c) Get rid of the file extensions: www.mysite.com/stuff.html to www.mysite.com/stuff.
This is the code that I'm currently using and it seems to work fine; however, I would like someone with experience to take a look so I can avoid internal server errors and other kinds of issues. I grabbed each piece of code from different posts and tutorials.
Options +FollowSymlinks
RewriteEngine on
# Index rewrite
RewriteRule ^index.(htm|html|php) http://www.mysite.com/ [R=301,L]
RewriteRule ^(.*)/index.(htm|html|php) http://www.mysite.com/$1/ [R=301,L]
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html
Options +FollowSymlinks
RewriteEngine on
Rewritecond %{http_host} ^mysite.com [nc]
Rewriterule ^(.*)$ http://www.mysite.com/$1 [r=301,nc]
Thanks a lot!
Intermediate & Advanced SEO | Eblan
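For what it's worth, here is one way those three pieces could be consolidated into a single, tidier block - a sketch only, assuming Apache with mod_rewrite and keeping mysite.com as the placeholder domain from the question. The duplicated Options/RewriteEngine lines are dropped, the dots in the patterns are escaped so they match literal dots, and an [L] flag is added to the extension rule:

Options +FollowSymlinks
RewriteEngine On

# b) Force www: redirect mysite.com to www.mysite.com
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

# a) Redirect /index.html (or .htm/.php) requests to the folder root
RewriteRule ^index\.(htm|html|php)$ http://www.mysite.com/ [R=301,L]
RewriteRule ^(.*)/index\.(htm|html|php)$ http://www.mysite.com/$1/ [R=301,L]

# c) Serve extensionless URLs from the underlying .html files
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]
-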
Mission Possible? You have 3 hours to do Local SEO. Which top 5 sites do you go to for social bookmarking, local search engine submission and directory listing?
Mission Possible? Here is a test. Suppose you had 3 hours (okay, 7) to go and submit links, etc., on social bookmarking sites, local search engines and directories - which top 5 or more of each would you do? (Assuming your on-page is already sweetened.) I just got 2 more clients and I need to get started on a few things for each. Thankful for all your advice.
Intermediate & Advanced SEO | greenhornet77
-
Question about HTTP Vary for Mobile
I'm reviewing https://developers.google.com/webmasters/smartphone-sites/redirects and wondering where exactly to add the HTTP Vary header:
- should a desktop request for a page that has a mobile equivalent get "Vary: User-Agent" added to the response header, or
- should "Vary: User-Agent" be added to the response header only when the request comes from a mobile device?
Intermediate & Advanced SEO | nicole.healthline
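For reference, a minimal sketch of how the Vary header is commonly added on Apache (an assumption - the question doesn't say which server is in use), applied to the desktop URLs whose responses change or redirect based on the user agent:

<IfModule mod_headers.c>
  # Tell caches and crawlers that this response varies by User-Agent
  Header append Vary User-Agent
</IfModule>

Generally the header is described as belonging on the desktop URLs that dynamically serve or redirect mobile users, rather than only on responses to requests that came from a mobile device.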