Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Google Places vs. a position-one ranking above the Places listings
-
Hi Guys,
Will creating a new Google Places listing for a business have any effect on their current position-one spot for their major geo-location keyword?
E.g. "restaurants perth" - say they are ranking #1 above all the Places listings. If they set up a Places listing, would they lose that position and merge in with all the other Places accounts?
Or would they keep that listing as well as the Places listing?
I have been advised it could be detrimental to set up the Places account. If this is the case, does anyone know any ways around this issue? The business really needs a Places page for Google Maps etc.
Appreciate some guidance
Thanks.
BC
-
I have a client for whom we put the specific local listing page URL (example.com/locations/phoenix/location1) in the Google Places URL field. It works out really well, as we get the home page ranking organically (depending on the query) and the specific Places result locally. Sometimes they are combined and other times they are not, but we are almost always in the mix somewhere.
-
Curious if any of you has experience pointing the Places listing to a URL other than the homepage?
I have read a few articles that reported various outcomes, some mentioning that it didn't affect their organic result but made it harder to rank the Places URL. Just curious about your findings!
-
Hi Bodie,
Yes, I think this is playing in the grey area. If the business owner actually wants to make his used and new car dealerships two companies with completely separate legal business names or DBAs, addresses with separate walk-in entrances, phone numbers and websites with completely unique content, then yes, you'd be talking about two different businesses, but that seems like an awful lot of real-world trouble to go to just to get a second Place page, eh? Chances are, a car dealership with both used and new cars is simply a single business with different specialties and should only be running a single website with a single Place/+ Local page.
What would happen if you went ahead with this plan, anyway, without the company actually being two legally separate entities? Honestly, you might be able to get away with it for a while. Google is often not super sharp about upholding their policies, and iffy stuff can ride for a long time. But... the risk is big. Should Google ever decide that they don't like what they are seeing, they could penalize or remove the listing from the index, and if there is any association at all between the two listings, they could penalize the whole profile. This isn't a risk I would take for my clients, and for a business model like the car dealership you're describing, I would not advise the hypothetical approach you are considering. Rather, I would recommend that the client build the strongest local profile he can for his business and then consider other forms of marketing, such as social media, video marketing, new content development, etc., to continue to build additional visibility.
Hope this helps!
-
Think more along the lines of a car dealership with a "new" and a "used car" department - would I be pushing it? My question to you is: how would the association be made between the pages and businesses if the new site was branded differently and had a new address and a unique, non-associated domain? The only way I can think of is if they were interlinked, but many non-associated sites are linked. Is this playing in a grey area? Thanks again.
-
Hi Bodie,
My pleasure. Are you stating that you work at a large business that has more than one front entry door for clientele (like a hospital with an emergency room and a separate radiology department)? If so, then you are allowed to create more than one listing for the business under the following Google Places quality guideline:
Departments within businesses, universities, hospitals, and government buildings may be listed separately. These departments must be publicly distinct as entities or groups within their parent organization, and ideally will have separate phone numbers and/or customer entrances.
If this is an accurate description of your business model, then I would simply have a single website with unique landing pages for the different public offices and tie these pages to the distinct Place Pages/+ Local Page for the business. Anything that doesn't really fit the above would not be a good idea.
I would not recommend associating an identical business name with two different websites and Place Pages if it is really the same business. What Google wants is for you to make a totally realistic representation of your business on the web, not to try to appear larger, more diverse, or different than you really are in real life. I know how important it is to do all you can to gain the broadest visibility, but I believe that all efforts must be founded on an authentic presentation of the business, and this appears to be Google's view, too. Hope this helps!
-
Thanks for your response. Would it be deemed black hat to set up a new site specifically for the Google Places listing if it had a strong geo-location in the URL and was attached to a different address?
E.g. the website is Hillarysrestaurant.com.au (Hillarys being the suburb). Say I register Perthrestaurant.com.au and attach it to a different address - the restaurant takes up three street numbers (6-10) - so I run the real website as it always was at number 6, and set up the new site as a push site/squeeze page at number 10, using it just for Google local?
I really hope this makes sense. Thanks again for your help and SEO wisdom!
P.S. It's not a restaurant; I'm just using this as an example.
-
We have the same experience as Cody. Google Places is like ADDING another listing to the SERP. From what I understand, the Google Places results are supposed to rotate, but your #1 or #2 spot should stay firm - unless you get knocked off by a competitor! We have several clients that rank #1 organically, appear in Google Places, and then show up again at #4 or #5 - so it is possible to take up quite a bit of real estate on a SERP.
-
Hi BC,
Yes, you can typically expect the organic rank to be subsumed into the Places rank if you create a Google Places/+ Local page for the client. This is a very common outcome, and it remains uncommon, though not impossible, for businesses to have more than one result per SERP.
-
I work with around 50 companies, and that's typically what I see. My #1 listing will just get changed to a Places listing, but it will still be in the #1 position.
-
In my experience, I had a client with positioning like yours. We created the Places account and it just moved into the local/maps results. The good news was that the SERP didn't contain any other organic listings at the top. If you have prominent and consistent rankings and are confident in your strategy, then you might not need to create a Places account. Just be aware that moving down one spot could really be eight or nine spots in terms of SERP real estate. Moving down to #2 organically could mean being below the entire local results block. You will need to judge the risk/reward. Hope that helps.
Related Questions
-
What to do with internal spam URLs Google indexed?
I have been in SEO for years but have never met this problem. I have a client whose web page was hacked, and hundreds of links were posted on it. These links have been indexed by Google. The links are not in comments but are normal external URLs. What is the best way to remove them - use the Google disavow tool, or just redirect them to some page? The web page is new, but it ranks well on Google and has a domain authority of 24. I think these spam URLs improved rankings too 🙂 What would be the best strategy to solve this? Thanks.
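For reference, the disavow tool mentioned above takes a plain-text file uploaded through Google Webmaster Tools / Search Console, and it applies to inbound links pointing at your site. A minimal sketch of the file format - all domains and URLs below are made-up placeholders:

```
# Hypothetical disavow.txt for spam links like those described above.
# Lines starting with # are comments and are ignored by Google.
# Disavow every link from an entire domain:
domain:spammy-directory-example.com
domain:link-network-example.net
# Or disavow one specific URL:
http://blog-example.com/some-spam-post.html
```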
White Hat / Black Hat SEO | AndrisZigurs
-
Do Google and other search engines crawl meta tags if we render them using React.js?
We have a site which has only one URL, and all other pages are its components, not separate pages. Whichever page we click, React.js renders that component, and the meta title and meta description change accordingly. Will using React.js this way be good or bad for SEO? Website: http://www.mantistechnologies.com/
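For illustration, a minimal sketch of the pattern this question describes - per-component meta tags in a React single-page app - using the react-helmet library (the component name and copy are hypothetical, not taken from the site in question):

```
import React from "react";
import { Helmet } from "react-helmet";

// Hypothetical page component: each route/component sets its own
// title and description in the document head when it renders,
// which is the behavior the question describes.
function ProductPage() {
  return (
    <div>
      <Helmet>
        <title>Product Name | Example Site</title>
        <meta name="description" content="Hypothetical description for this view." />
      </Helmet>
      <h1>Product Name</h1>
    </div>
  );
}

export default ProductPage;
```

Whether crawlers see tags set this way depends on whether they execute JavaScript: Google generally renders JS, but other engines may not, which is why server-side rendering or prerendering is commonly suggested for single-page apps.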
White Hat / Black Hat SEO | RobinJA
-
Good vs Bad Web directories
Hi, in this blog post Rand mentions a list of bad web directories. I asked a couple of years ago if there is an updated list, as some of these (Alive Directory, for example) do not seem to be blacklisted anymore and are coming up in Google searches. It seems, due to the old age of the blog post (seven years ago), the comments are not responded to. Would anyone be able to advise which of these directories are good to use? https://moz.com/blog/what-makes-a-good-web-directory-and-why-google-penalized-dozens-of-bad-ones
White Hat / Black Hat SEO | IsaCleanse
-
Real Vs. Virtual Directory Question
Hi everyone. Thanks in advance for the assistance. We are reformatting the URL structure of our very content-rich website (thousands of pages) into a cleaner stovepipe model, so our pages will have a URL structure something like http://oursite.com/topic-name/category-name/subcategory-name/title.html. My question is: is there any additional benefit to having the path /topic-name/category-name/subcategory-name/title.html literally exist on our server as a real directory? Our plan was to just use .htaccess to point that URL to a single script that parses the URL structure and builds the page appropriately. Do search engine spiders know the difference between these two models, and do they prefer one over the other? From our standpoint, managing a single .htaccess file and a handful of page-building scripts would be infinitely easier than a huge, complicated directory structure of real files. And while this makes sense to us, the .htaccess model wouldn't be considered some kind of black hat scheme, would it? Thank you again for the help, and looking forward to your thoughts!
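For reference, a minimal sketch of the .htaccess model described above (the script name index.php and the query parameter are assumptions for illustration, not from the original post):

```
# Hypothetical .htaccess sketch: route clean URLs to one script.
# Requires Apache with mod_rewrite enabled.
RewriteEngine On

# If the request is not a real file or directory on disk...
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d

# ...hand the whole path to a single builder script, which can parse
# topic-name/category-name/subcategory-name/title.html and render the page.
RewriteRule ^(.*)$ index.php?path=$1 [L,QSA]
```

Because the rewrite happens server-side, a crawler only ever sees the clean URL and the response; it cannot tell whether a real directory tree or a front-controller script produced the page.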
White Hat / Black Hat SEO | ClayPotCreative
-
How does Google determine if a link is paid or not?
We are currently doing some outreach to bloggers to review our products and provide us with backlinks (preferably followed). The bloggers get to keep the products (usually about $30 worth). According to Google's link schemes policy, this is a no-no. But my question is: how would Google ever know if the blogger was paid or given freebies for their content? This is the "best" article I could find on the subject: http://searchenginewatch.com/article/2332787/Matt-Cutts-Shares-4-Ways-Google-Evaluates-Paid-Links The article tells us what qualifies as a paid link, but it doesn't tell us how Google identifies whether links were paid. It also says that "loans" are okay, but "gifts" are not. How would Google know the difference? For all Google knows (maybe everything?), the blogger returned the products to us after reviewing them. Does anyone have any ideas on this? Maybe Google watches for terms like "this is a sponsored post" or "materials provided by x". Even so, I hope that wouldn't be enough to warrant a penalty.
White Hat / Black Hat SEO | jampaper
-
What's up with Google scrapping keyword metrics?
I've done a bit of reading on Google now "scrapping" the keyword metrics from Analytics, and I am trying to understand why the hell they would do that. To force people to run multiple AdWords campaigns to set up different keyword scenarios? It just doesn't make sense to me... If I am a blogger, or I run an ecommerce site, and I get a lot of visits to a particular post through a keyword people clicked on organically, why would Google want to hide this from us? It's great data for us to carry on writing relevant content that appeals to people and therefore serves the needs of those same people. There is the idea of doing white hat SEO and focusing on getting strong links and great content etc., but how do we know we have great content if we are not seeing what appeals to people in terms of keywords and how they found us organically? Is Google trying to squash SEO as a profession? What do you guys think?
White Hat / Black Hat SEO | theseolab
-
Can one business operate under more than one website?
Is it possible for a business to rank organically for the same keyword multiple times with different web addresses? Say I sell car keys and I wanted to rank for "buy new car keys", and I set up two different websites, say ibuycarkeys.com and carkeycity.com, and then operated under both of these - would Google frown upon this?
White Hat / Black Hat SEO | steve215
-
Interesting case of an IP-wide Google penalty - what is the most likely cause?
Dear SEOmoz Community,
Our portfolio of around 15 internationalized web pages received a significant, seemingly IP-wide, Google penalty starting in November 2010 and has yet to recover from it. We have undergone many measures to lift the penalty, including reconsideration requests, without luck, and are now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgement on what else we can try to lift the penalty.
As quick background information:
- The sites in question offer sports results data and are translated into several languages. Each market (equals language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com.
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains - all copy is translated, content reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports-betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had been ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached; the same behavior can be observed across domains.
Our questions are:
1. Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
2. What is the most likely cause of our penalty, given the background information? Since the drops started already in November 2010, we doubt the Panda updates had any correlation to this issue.
3. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far were reducing external links, on-page links, and C-class internal links.
4. Are there any other factors/metrics we should look at to help troubleshoot the penalties?
5. After all this time without resolution, should we move to two new domains and forward all content as 301s to the new pages? Or are there other things we need to try first?
Any help is greatly appreciated. SEOMoz rocks. /T
White Hat / Black Hat SEO | tomypro