Google Places
-
My client offers training from many locations within the UK.
These locations/venues are not owned by them; however, I see no problem in setting up a separate listing for each location in Google Places.
At the end of the day, a user searching for "Training London" is looking for somewhere in their local area where they can book a course. As my client has a "venue" there, I think there is a good argument that the listing would be valid.
What are your thoughts?
-
The fact they don't "own" the location doesn't matter. Many small businesses don't "own" their locations; they are leased. I'll bet the client in this case leases space to hold their training classes. It would be appropriate to have a Places listing for each location. In the addresses they can simply create arbitrary suite numbers to indicate that they may not be the ONLY business in that "place."
-
Nice trick
-
This is something that interests me as well. One of my sites has a very similar setup to yours, and I have considered doing the same (submitting all of the venues to Google Places with the company name and head-office phone number).
I have refrained from doing this so far, though, and my reasoning is as follows. If the venue (in your case, the training location) is already registered, will Google mind? Can you have multiple businesses registered at one address?
The second reason I've not done it is that it feels a little spammy. The business doesn't necessarily own the venues (training locations), so why should you be listed for them?
I wonder how this works for serviced/shared offices?
-
They would use a Head Office telephone number, the same for each listing.
I have seen other companies with multiple listings sharing the same telephone number, so I am presuming that Google allows this.
-
Does your client have a specific phone number for each of these places? If not, I'm not sure you can register a Place for each of their "venues".