Google Places vs. position-one ranking above the Places results.
-
Hi Guys,
Will creating a new Google Places listing for a business have any effect on their current position-one spot for their major geo-location keyword?
E.g. "restaurants perth" - say they are ranking no. 1 above all the Places listings. If they set up a Places listing, would they lose that position and merge with all the other Places accounts?
Or would they keep that organic listing as well as the Places listing?
I have been advised it could be detrimental to set up the Places account. If this is the case, does anyone know any ways around this issue, as the business really needs a Places page for Google Maps etc.?
Appreciate some guidance
Thanks.
BC
-
I have a client where we put the specific local listing page URL (example.com/locations/phoenix/location1) in the Google Places URL field. It works out really well: we get the home page ranking organically (depending on the query) and the specific Places result locally. Sometimes they are combined and other times they are not, but we are almost always in the mix somewhere.
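In case it's useful, here is a minimal sketch of the kind of structured data such a location landing page might carry so that the page is unambiguously about that one location. All business details, the phone number, and the address below are hypothetical; the URL pattern just mirrors the example above.
<!-- Hypothetical markup for a location landing page such as
     https://example.com/locations/phoenix/location1 -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Company - Phoenix",
  "url": "https://example.com/locations/phoenix/location1",
  "telephone": "+1-602-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 N Central Ave",
    "addressLocality": "Phoenix",
    "addressRegion": "AZ",
    "postalCode": "85004",
    "addressCountry": "US"
  }
}
</script>
Keeping the Places listing, the landing page URL, and the on-page details (name, address, phone) consistent is the main point; the exact markup format is secondary.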
-
Curious if any of you guys have experience pointing the Places listing to a URL other than the homepage?
I have read a few articles that reported various outcomes, some mentioning that it didn't affect their organic result but that it was harder to rank the Places URL. Just curious about your findings!
-
Hi Bodie,
Yes, I think this is playing in the grey area. If the business owner actually wants to make his used and new car dealerships two companies with completely separate legal business names or DBAs, addresses with separate walk-in entrances, phone numbers and websites with completely unique content, then yes, you'd be talking about two different businesses, but that seems like an awful lot of real-world trouble to go to just to get a second Place page, eh? Chances are, a car dealership with both used and new cars is simply a single business with different specialties and should only be running a single website with a single Place/+ Local page.
What would happen if you went ahead with this plan anyway, without the company actually being two legally separate entities? Honestly, you might be able to get away with it for a while. Google is often not super sharp about upholding their policies, and iffy stuff can ride for a long time. But... the risk is big. Should Google ever decide that they don't like what they are seeing, they could penalize the listing or remove it from the index, and if there is any association at all between the two listings, they could penalize the whole profile. This isn't a risk I would take for my clients, and for a business model like the car dealership you're describing, I would not advise the hypothetical approach you are considering. Rather, I would recommend that the client build the strongest local profile he can for his business and then consider other forms of marketing, such as social media, video marketing, new content development, etc., to continue to build additional visibility.
Hope this helps!
-
Think more along the lines of a car dealership with a 'new' and a 'used car' department.
Would I be pushing it? My question to you is: how would the association be made between the pages and businesses if the new site was branded differently and had a new address and a unique, non-associated domain? The only way I can think of is if they were interlinked, but many non-associated sites are linked. Is this playing in a grey area?
Thanks again
-
Hi Bodie,
My pleasure. Are you saying that you work at a large business that has more than one front entry door for clientele (like a hospital with an emergency room and a separate radiology department)? If so, then you are allowed to create more than one listing for the business under the following Google Places quality guideline:
Departments within businesses, universities, hospitals, and government buildings may be listed separately. These departments must be publicly distinct as entities or groups within their parent organization, and ideally will have separate phone numbers and/or customer entrances.
If this is an accurate description of your business model, then I would simply have a single website with unique landing pages for the different public offices and tie these pages to the distinct Place Pages/+ Local Page for the business. Anything that doesn't really fit the above would not be a good idea.
I would not recommend associating an identical business name with two different websites and Place Pages if it is really the same business. What Google wants is for you to make a totally realistic representation of your business on the web; not to try to appear like you are larger, more diverse, or different than you really are in real life. I know how important it is to do all you can to gain the broadest visibility, but I believe that all efforts must be founded on an authentic presentation of any business, and this appears to be Google's view, too. Hope this helps!
-
Thanks for your response. Would it be deemed black hat to set up a new site specifically for the Google Places listing if it had a strong geo-location in the URL and was attached to a different address?
E.g. the website is Hillarysrestaurant.com.au (Hillarys being the suburb), and I register Perthrestaurant.com.au and attach it to a different address. The restaurant takes up three blocks (6-10), so I run the real website as it always was on number 6, set up the new site as a push site/squeeze page on number 10, and use it just for Google local.
I really hope this makes sense. Thanks again for your help and SEO wisdom!
P.S. It's not a restaurant; I'm just using this as an example.
-
We have the same experience as Cody. Google Places is like ADDING another listing to the SERP. From what I understand, the Places results are supposed to rotate around, but your #1 or #2 spot should stay firm - unless you get knocked off by a competitor! We have several clients that are at #1, in Google Places, and then at #4 or #5 - so it is possible to take up quite a bit of real estate on a SERP.
-
Hi BC,
Yes, you can typically expect the organic rank to be subsumed into the Places rank if you create a Google Places/+ Local page for the client. This is a very common outcome, and it remains uncommon, though not impossible, for a business to have more than one result per SERP.
-
I work with around 50 companies, and that's typically what I see. My #1 listing will just get changed to a Places listing, but it will still be in the #1 position.
-
In my experience, I had a client with positioning like yours. We created the Places account and it just went into the local/maps results. The good news was that the SERP didn't contain any other organic listings at the top. If you have prominent and consistent rankings and are confident in your strategy, then you might not need to create a Places account. Just be aware that moving down one spot could really be 8 or 9 spots in terms of SERP real estate: moving down to #2 organically could mean sitting below the entire local results block. You will need to judge the risk/reward. Hope that helps.