Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Google Places vs. a position-one ranking above the Places listings
-
Hi Guys,
Will creating a new Google Places listing for a business have any effect on their current position-one spot for their major geo-location keyword?
E.g. "restaurants perth": say they are ranking no. 1 above all the Places listings. If they set up a Places listing, would they lose that position and merge in with all the other Places accounts?
Or would they have that listing as well as the places listing?
I have been advised it could be detrimental to set up the Places account. If this is the case, does anyone know any ways around the issue? The business really needs a Places page for Google Maps etc.
Appreciate some guidance
Thanks.
BC
-
I have a client where we put the specific local listing page url (example.com/locations/phoenix/location1) in the Google Places URL field. It works out really well as we get the home page ranking organically (depending on the query) and the specific places result locally. Sometimes they are combined and other times they are not, but we are in the mix somewhere almost always.
-
Curious if any of you guys have experience pointing the Places listing to a URL other than the homepage?
I have read a few articles that reported various outcomes, some mentioning that it didn't affect their organic result but made it harder to rank the Places URL. Just curious about your findings!
-
Hi Bodie,
Yes, I think this is playing in the grey area. If the business owner actually wants to make his used and new car dealerships two companies with completely separate legal business names or DBAs, addresses with separate walk-in entrances, phone numbers and websites with completely unique content, then yes, you'd be talking about two different businesses, but that seems like an awful lot of real-world trouble to go to just to get a second Place page, eh? Chances are, a car dealership with both used and new cars is simply a single business with different specialties and should only be running a single website with a single Place/+ Local page.
What would happen if you went ahead with this plan anyway, without the company actually being two legally separate entities? Honestly, you might be able to get away with it for a while. Google is often not super sharp about upholding their policies, and iffy stuff can ride for a long time. But... the risk is big. Should Google ever decide that they don't like what they are seeing, they could penalize the listing or remove it from the index, and if there is any association at all between the two listings, they could penalize the whole profile. This isn't a risk I would take for my clients, and for a business model like the one you're describing, like a car dealership, I would not advise the hypothetical approach you are considering. Rather, I would recommend that the client build the strongest local profile he can for his business and then consider other forms of marketing, such as social media, video marketing, new content development, etc., to continue to build additional visibility.
Hope this helps!
-
Think more along the lines of a car dealership with a 'new' and a 'used car' department. Would I be pushing it? My question to you is: how would the association be made between the pages and businesses if the new site was branded differently and had a new address and a unique, non-associated domain? The only way I can think of is if they were interlinked, but many non-associated sites are linked. Is this playing in a grey area?
Thanks again
-
Hi Bodie,
My pleasure. Are you saying that you work at a large business that has more than one front entry door for clientele (like a hospital with an emergency room and a separate radiology department)? If so, then you are allowed to create more than one listing for the business under the following Google Places Quality Guideline:
Departments within businesses, universities, hospitals, and government buildings may be listed separately. These departments must be publicly distinct as entities or groups within their parent organization, and ideally will have separate phone numbers and/or customer entrances.
If this is an accurate description of your business model, then I would simply have a single website with unique landing pages for the different public offices and tie these pages to the distinct Place Pages/+ Local Page for the business. Anything that doesn't really fit the above would not be a good idea.
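As an aside, if you do go the single-site-with-landing-pages route, you can also make the department relationship explicit to search engines with structured data. This wasn't discussed in the thread, so treat it as an optional sketch using schema.org's `department` property on an organization type; every name, URL, and phone number below is hypothetical.

```json
{
  "@context": "https://schema.org",
  "@type": "Hospital",
  "name": "Example General Hospital",
  "url": "https://www.example-hospital.com/",
  "telephone": "+1-555-0100",
  "department": [
    {
      "@type": "MedicalClinic",
      "name": "Example General Hospital - Emergency Room",
      "url": "https://www.example-hospital.com/emergency/",
      "telephone": "+1-555-0101"
    },
    {
      "@type": "MedicalClinic",
      "name": "Example General Hospital - Radiology",
      "url": "https://www.example-hospital.com/radiology/",
      "telephone": "+1-555-0102"
    }
  ]
}
```

Each department points at its own landing page and phone number, mirroring the "publicly distinct as entities or groups" requirement in the guideline quoted above.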
I would not recommend associating an identical business name with two different websites and Place Pages if it is really the same business. What Google wants is for you to make a totally realistic representation of your business on the web; not to try to appear like you are larger, more diverse, or different than you really are in real life. I know how important it is to do all you can to gain the broadest visibility, but I believe that all efforts must be founded on an authentic presentation of any business, and this appears to be Google's view, too. Hope this helps!
-
Thanks for your response. Would it be deemed black hat to set up a new site specifically for the Google Places listing if it had a strong geo-location in the URL and was attached to a different address?
E.g. the website is Hillarysrestaurant.com.au (Hillarys being the suburb), and I register Perthrestaurant.com.au and attach it to a different address. The restaurant takes up three blocks, i.e. numbers 6-10, so I'd run the real website as it always was on number 6 and set up the new site as a push site/squeeze page on number 10, using it just for Google local?
I really hope this makes sense. Thanks again for your help and SEO wisdom!
P.S. It's not a restaurant; I'm just using this as an example.
-
We have the same experience as Cody. Google Places is like adding another listing to the SERP. From what I understand, the Google Places results are supposed to rotate around, but your #1 or #2 spot should stay firm, unless you get knocked off by a competitor! We have several clients that are at #1, in Google Places, and then at #4 or #5, so it is possible to take up quite a bit of real estate on a SERP.
-
Hi BC,
Yes, you can typically expect the organic rank to be subsumed into the Places rank if you create a Google Places/+ Local page for the client. This is a very common outcome, and it remains uncommon, though not impossible, for businesses to have more than one result per SERP.
-
I work with around 50 companies, and that's typically what I see. My #1 listing will just get changed to a Places listing, but it will still be in the #1 position.
-
In my experience, I had a client with positioning like yours. We created the Places account and it simply moved into the local/maps results. The good news was that the SERP didn't contain any other organic listings at the top. If you have prominent and consistent rankings and are confident in your strategy, then you might not need to create a Places account. Just be aware that moving down one spot could really mean eight or nine spots of SERP real estate: moving down to #2 organically could mean sitting below the entire local results block. You will need to judge the risk/reward. Hope that helps.