Google Places
-
My client offers training from many locations within the UK.
These locations/venues are not owned by them; however, I see no problem in setting up a separate listing for each location in Google Places.
At the end of the day, if a user searches for "Training London" they are looking for somewhere in their local area where they can book a course. As my client has a "venue" there, I think there is a good argument that the listing would be valid.
What are your thoughts?
-
The fact they don't "own" the location doesn't matter. Many small businesses don't "own" their locations; they lease them. I'll bet the client in this case leases space to hold their training classes. It would be appropriate to have a Places listing for each location. In the addresses they can just create arbitrary suite numbers to indicate that they may not be the ONLY business in that "place."
-
Nice trick
-
This is something that interests me as well. One of my sites has a very similar setup to yours, and I have considered doing the same (submitting all of the venues to Google Places with the company name and head office phone number).
I have refrained from doing this so far though, and my reasoning is as follows. If the venue (in your case, a training location) is already registered, will Google mind? Can you have multiple businesses registered at one address?
The second reason I've not done it is that it feels a little spammy. The business doesn't necessarily own the venues (training locations), so why should you be listed for them?
I wonder how this works for serviced/shared offices?
-
They would use a Head Office telephone number, same for each listing.
I have seen other companies with multiple listings using the same telephone number, so I am presuming that Google allows this.
-
Does your client have a specific phone number for each of these places? If not, I'm not sure you can register a place for each of their "venues".
Related Questions
-
JavaScript encoded links on an AngularJS framework...bad idea for Google?
Hi Guys, I have a site where we're currently deploying code in AngularJS. As part of this, on the page we sometimes have links to 3rd party websites. We do not want to have followed links on the site to the 3rd party sites as we may be perceived as a link farm since we have more than 1 million pages and a lot of these have external 3rd party links. My question is, if we've got javascript to fire off the link to the 3rd party, is that enough to prevent Google from seeing that link? We do not have a NOFOLLOW on that currently. The link anchor text simply says "Visit website" and the link is fired using JavaScript. Here's a snapshot of the code we're using: Visit website Does anyone have any experience with anything like this on their own site or customer site that we can learn from just to ensure that we avoid any chances of being flagged for being a link farm? Thank you 🙂
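The code snapshot referenced above didn't come through, but for anyone picturing the setup, a JavaScript-fired outbound link on an AngularJS 1.x page typically looks something like the sketch below. The module name, controller, and URL are invented placeholders, not the poster's actual code.

```html
<!-- Hypothetical sketch of a JavaScript-fired outbound link (placeholders, not the poster's markup). -->
<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.8.2/angular.min.js"></script>

<div ng-app="demoApp" ng-controller="PartnerCtrl">
  <!-- No href pointing at the third-party site; the URL lives only in the click handler. -->
  <a href="" ng-click="visitPartner()">Visit website</a>
</div>

<script>
  angular.module('demoApp', [])
    .controller('PartnerCtrl', ['$scope', '$window', function ($scope, $window) {
      // Placeholder third-party URL; in practice this would come from the page's data.
      var partnerUrl = 'https://www.example.com/';
      $scope.visitPartner = function () {
        // Open the external site in a new tab when the link is clicked.
        $window.open(partnerUrl, '_blank');
      };
    }]);
</script>
```

Worth bearing in mind that Googlebot does execute JavaScript, so a URL that appears anywhere in the rendered page can still be discovered; if the aim is simply to stop these links passing equity, rel="nofollow" is the explicit mechanism for that.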
White Hat / Black Hat SEO | AU-SEO
-
Homepage not ranking for branded searches after Google penalty removal
Hi all, A site I work on was hit with a manual action penalty some time ago for spammy links built by a former SEO agency. It was a partial match penalty so only affected some pages - most likely the homepage. We carried out a lot of work cleaning up links and disavowed suspicious links which we couldn't get removed. Again, most of these were to the homepage. The disavow file was uploaded to Google last Friday and our penalty was lifted this Tuesday. Since uploading the disavow file, our homepage does not show up at all for branded searches. I've carried out the obvious checks - robots.txt, making sure we're not accidentally noindexing the page or doing anything funky with canonicals etc and it's all good. Have any of you guys had a similar experience? I'm thinking Google simply needs time to catch up due to all the links we've disavowed and sitting tight is the best option but could do with some reassurance! Any past experiences or advice on what I might be missing would be great. Thanks in advance, Brendan.
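For anyone running the same diagnostics, the on-page directives worth ruling out on the homepage look roughly like the illustrative snippet below (example.com is a placeholder, not the poster's site); the equivalent crawl-level check is a blanket Disallow: / in robots.txt.

```html
<!-- Illustrative only: directives that would keep a homepage from ranking for anything. -->

<!-- A stray noindex keeps the page out of the index entirely: -->
<meta name="robots" content="noindex, nofollow">

<!-- A canonical pointing at a different URL tells Google to consolidate ranking signals there instead: -->
<link rel="canonical" href="https://www.example.com/some-other-page/">
```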
White Hat / Black Hat SEO | Brendan-Jackson
-
How does Google know if rich snippet reviews are fake?
According to https://developers.google.com/structured-data/rich-snippets/reviews, all someone has to do is add some HTML code and write the review. How does Google do any validation on whether these reviews are legitimate or not?
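For reference, the markup the question is talking about is simply structured data the site owner writes themselves; a minimal, entirely made-up example of aggregate rating markup looks something like this.

```html
<!-- Hypothetical example of schema.org rating markup in JSON-LD; the product and figures are invented. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```

Nothing in that markup is verified at the point it's parsed; Google's safeguards are algorithmic checks and manual spam actions, and it can simply stop showing the stars for a site whose structured data looks implausible or doesn't match the visible page content.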
White Hat / Black Hat SEO | wlingke
-
SERP dropping along with competitors - Google algorithm mix up?
I am hoping someone will have some insight as our recent ranking drop has been driving me crazy trying to figure out what happened. Our site is www.dgrlegal.com. We've been building links by creating quality content and getting others to link to it. We've seen our rankings rise to 3 for a number of keywords. Suddenly around March we saw a pretty drastic drop but only for certain keywords (maybe a Penguin hit?). For example, "new jersey process service" still has us ranked 3rd but "new jersey process server" sees us much lower around 19. I've noticed several competitors have dropped while one has risen so is this negative SEO? Probably not as our backlink profile doesn't seem suspicious but it has me very confused. We've received no warnings or notices from Google. The only thing I see is that our indexed pages went from 13 to 98 in January and have been now steadily increasing to 129, although I thought this would be a positive. Any suggestions or thoughts? I thought maybe things would shake out but it hasn't happened as of yet - we just keep dropping.
White Hat / Black Hat SEO | amandadgr
-
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site -- Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of a single home page, ~50 "category" pages, and ~425 "book list" pages. (Those 50 and 425 numbers both started out much smaller and grew over time, but the count has been around 425 for the last year or so as I've focused my time elsewhere.)

On Friday, June 15 we made a pretty big change to the site -- we added a page for every author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ...but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116

Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (OK, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off for the last week, dropping lower and lower every day as if they realized it was repurposed content from elsewhere on our site...)

Here's the problem: for the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic I'm VERY eager to solve this problem... So:

1. Do you think the drop is related to my upping my pagecount 8-fold overnight?
2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that only list one author (which would be the vast majority)?
3. Have you ever heard of a situation like this? Where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Norah Ephron books" it's better they land on a page of ours that pulls together the 4 books we have than on a page that happens to have just one book on it among 5 or 6 others by other authors.

What else? Thanks so much, help is very appreciated. Peter
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉
White Hat / Black Hat SEO | petestein
-
Google Penguin for non-English queries?
Does anybody know if non-English queries were also 'hit' by the Google Penguin update? All Penguin horror stories out there are from sites focusing on English queries, and in some (Dutch) industries I'm monitoring, some sites with spammy backlink profiles are still ranking.
White Hat / Black Hat SEO | RBenedict
-
Google penalizes too much SEO
I just read this interesting article about a new Google penalty coming in the upcoming weeks/months, with Google making changes to the algorithm. The penalty will be targeted at websites that are over-optimized or over-SEO'ed. What do you think about this? Is this a good thing or not a good thing for us as SEO marketers? Here's the link: SEL.com/to-much-seo I'm really curious as to your points of view. Regards, Jarno
White Hat / Black Hat SEO | JarnoNijzing
-
Penalized in Google?
Hello guys. I'm terribly sad because we did an amazing SEO job for this client: www.medabcn.com, and the website was hacked. Message from the hosting platform: "It would appear that malicious individuals have found a way to upload spam pages as well as backdoors to your site(s). We have disabled the page(s) in question (via removing their permissions, e.g. chmod) until you are able to address this matter." Result: we lost all our SERPs. Has anyone been in a similar situation? Notes: I checked Google Webmaster Tools and everything seems to be normal. The domain was relatively new; maybe a late sandbox effect? Thanks a lot for your help. Matias
White Hat / Black Hat SEO | maty