Google + Local Pages
-
Hi,
If I have a company with multiple addresses, do I create a separate Google+ page for each area?
-
Hmm, what about a G+ business page that is not verified? I.e., if I want to post something in a G+ community as the service company, I cannot do that without a company page... Is it really best to delete the company G+ page? Let's say we add a G+ button to our site that links to the G+ page, correct? Do we not have the same privileges other sites do because we are a service?
-
Hi Bryan, If you have a Google+ BUSINESS page, yes, Google is saying to delete it now, and yes, manage multiple locations via the old Places dashboard. For more on this deletion issue, read this: http://productforums.google.com/forum/#!category-topic/business/technical-issue/MI5yJcP1pJ8 It won't affect every business, but it will affect service-radius businesses that built a Google+ Business page. Take care!
-
Hi Miriam,
Thanks for the response.
In short, if we have a multi-location business, it's best to use the old Places dashboard, and if it's a services-only niche, such as a cleaning service, it's best to delete the Google+ Places page. Correct?
-
Hi Bryan,
Unfortunately, at this point, multi-location businesses are still not supported by the new Google+ Local system. There is no way to link multiple Google+ Local pages within a single brand, and you should not attempt to create a Google+ Local page for each location or try to merge them. Clark has linked you to an early article on this from Mike Blumenthal from around the time Google rolled out Google+ Local, but read one of Mike's more recent posts on the huge number of issues going on with this before you make a move in any direction:
http://blumenthals.com/blog/2012/11/28/having-issues-with-your-glocal-social-merge-my-advice-dont/
So, for multi-location businesses, you can still create a listing for each unique address via the old Places dashboard, but you should not try to create a Google+ Local page for each of them. At some point, this will be possible, but it hasn't happened yet, so you will need to stay tuned to the news on this, which is constantly being updated. Just last week, for example, Google announced that all service-radius businesses have to delete their Google+ Business page (if they have one). This is a brand-new rule that will adversely affect countless businesses if they don't happen to hear about it. Point being: with all things related to the whole Google+ system, and especially with Google+ Local, you've got to stay on top of the news on a daily basis.
Hope this helps!
-
This is a post from Blumenthal on the subject --> http://blumenthals.com/blog/2012/08/03/step-by-step-guide-to-the-google-businesslocal-merge-verification-process/
This is from Google Product Forums --> http://productforums.google.com/forum/#!category-topic/business/cLdShAg9xYs[1-25-false]
In short, yes, I have found you will need to create separate pages for each location. This is a pain, in my opinion, and something I wish G+ would clean up.