Organic listing & map listing on 1st page of Google
-
Hi! In the past, a company could get multiple listings on the first page of the SERP: one in the Google Maps area, plus the homepage or internal pages in the organic search results.
But lately, I've noticed that Google now merges the Maps and organic listings.
A couple of SEO people confirmed this observation, and I thought it made sense. But then I stumbled on the keyword phrase "bmw dealership phoenix" and saw that www.bmwnorthscottsdale.com has separate listings in Google Places and the organic results.
Any idea how this company did this?
Please see the attached image
-
Thanks man!
-
If a site can build up enough trust signals with its on-site SEO and off-site references, and also optimizes for local listings, it is possible to be found in both. There is no single formula or threshold because every competitive niche is unique, but I've seen it happen and have had clients reach that point over time.
Related Questions
-
Google Country Redirection Change
Analytics is showing a substantial decrease in referring traffic from Google's regional domains (.ca, .co.uk, .de, etc.) versus an uptick from .com, starting in March 2018. Did anyone note when Google stopped directing traffic to its regional domains? Was there any press about it? (I couldn't find any.) Using a VPN for different countries, I compared regional-domain SERPs against .com and they're pretty much identical. Thanks!
Algorithm Updates | | Bragg1 -
Is there any way to prevent Google from using structured data on specific pages?
I've noticed that Google is now serving what look like host-specific video cards on mobile for our site. Is there any way to control which videos are included in these lists without removing the structured data on those clip pages or user pages? We don't want to noindex those pages, but we also don't want content from them to appear as video cards.
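If the goal is to keep these pages indexed while limiting how their video content is surfaced, one documented robots directive that may be worth testing is `max-video-preview:0`. This is a hedged suggestion, not a confirmed fix for host-specific video cards; it limits video previews generally rather than targeting that one feature:

```html
<!-- Asks Google not to show video previews from this page.
     The page itself stays indexed; only video snippets are restricted. -->
<meta name="robots" content="max-video-preview:0">
```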
Algorithm Updates | | Garrett570 -
Ranking dropped with no page changes
My rank for a keyword went from #1 to #22. The page grade for this keyword is A, and there were no site-structure changes. The only thing I can see is that Tumblr, Reddit, and other sources are now listed for this keyword, and its difficulty went from the mid-to-low teens to 28%. Even given that, I do not see a reason for this keyword alone to fall so far. It was giving us a ton of traffic; in fact, most of our organic search visits came from this term for nearly two months. Then, two weeks ago, for no apparent reason, we were pushed to page 3. Has anyone else had similar experiences? How do you counter it, and what can we do?
Algorithm Updates | | mozmemberanon0 -
PPC vs Organic CTR
Hello, I found two studies that seem to contradict each other about PPC vs organic CTR:
http://searchenginewatch.com/article/2200730/Organic-vs.-Paid-Search-Results-Organic-Wins-94-of-Time
http://brandongaille.com/google-organic-click-through-rate-statistics/
Which one is true? Thank you
Algorithm Updates | | Cornel_Ilea0 -
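Both studies can be right at once if they sample different query mixes; the organic share of clicks swings heavily depending on whether the sampled queries are informational or high-intent commercial. A toy sketch with hypothetical CTR numbers (not taken from either study):

```python
# Illustrative only: hypothetical click-through rates showing how the
# organic-vs-paid click split depends on the query mix a study samples.
def organic_click_share(ctr_organic, ctr_paid):
    """Share of all result clicks that go to organic listings."""
    return ctr_organic / (ctr_organic + ctr_paid)

# Informational queries: ads are rarely clicked.
informational = organic_click_share(ctr_organic=0.47, ctr_paid=0.03)

# High-intent commercial queries: paid clicks take a much bigger slice.
commercial = organic_click_share(ctr_organic=0.30, ctr_paid=0.20)

print(round(informational, 2))  # 0.94
print(round(commercial, 2))     # 0.6
```

A study weighted toward informational queries lands near the "organic wins 94% of the time" figure, while one weighted toward commercial queries reports a far larger paid share, so the two headlines need not contradict each other.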
What is the point of XML site maps?
Given how Google uses PageRank to pass link juice from one page to the next, if Google can only find a page through an XML sitemap, that page will have no link juice and will appear very low in search results, if at all.

The priority field in XML sitemaps also seems pretty much irrelevant to me. Google determines the priority of a page based on the number of inbound links to it, and if your site is designed properly, the most important pages will have the most links.

The changefreq field could maybe be useful if you have existing pages that are updated regularly, though it seems to me Google tends to crawl sites often enough that it isn't needed. Plus, for most of the web, the significant content of an existing page doesn't change regularly; instead, new pages are added with new content.

This leaves the lastmod field as potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls the pages whose lastmod date is newer than its last crawl of the site, its crawling could be much more efficient. The sitemap would not need to contain every single page of the site, just the ones that have changed recently.

From what I've seen, most sitemap generation tools don't do a great job with the fields other than loc. If Google can't trust the priority, changefreq, or lastmod fields, they won't put any weight on them.

It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it). So, what's the point of XML sitemaps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
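Following the reasoning above, if only loc and lastmod carry trustworthy information, a sitemap generator can emit just those two fields and skip priority and changefreq entirely. A minimal Python sketch, where the URLs and dates are hypothetical placeholders:

```python
# Minimal sketch of an XML sitemap generator that emits only the fields
# argued to be trustworthy: <loc> and <lastmod>.
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(pages):
    """pages: list of (url, last_modified_date) tuples."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical recently-changed pages; a real generator would pull these
# from the CMS, keeping the sitemap small and the lastmod values honest.
xml = build_sitemap([
    ("https://www.example.com/", date(2013, 1, 20)),
    ("https://www.example.com/blog/latest-post", date(2013, 1, 25)),
])
```

Feeding only recently changed pages into such a generator matches the "just the ones that have changed recently" idea: the sitemap becomes a crawl hint rather than a full site inventory.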
Algorithm Updates | | pasware0 -
Boosting Organic Search
Hi there, I have been analysing the performance of my keywords through SEOMoz reports for some time now, and I am trying to understand why I rank highly for certain keywords but do not receive any organic search visits from them. My pages are tagged with the keywords, and my content, including new content on my blog, pushes those words. These keywords are industry standards that I know people search for and that other companies and competitors use, and yet my site receives few, if any, visits despite ranking in the top 5 or 10. Any help or advice would be much appreciated!
Algorithm Updates | | sparkit0 -
Impressions & Traffic WAY Down. Where to start?
Beginning around November 1st, I noticed a continual, gradual drop in impressions and traffic. During the holiday season we typically see a decline in business, so I initially wrote it off as that, but there has been no rebound and I'm really confused about where to begin looking. Daily impressions have now dropped from 20,000 all the way down to 5,000, and it has taken a major toll on the business (see the attached graph).
Some background information:
- The site has been very static for the past 8 months (since April '12). Admittedly overly static, with little added other than the occasional blog post. However, during these 8 months traffic jumped 30%, so we were riding that wave and feeling confident that our past efforts had built a great foundation.
- I'm not aware of anything even remotely black-hat that has ever been done. Everything is very much on the up and up and done with the user in mind.
- I'm unable to trace anything to a Panda update, given the consistent, gradual nature of the decline. However, with some important search queries completely falling off the map, it feels like we are being penalized or affected by a permanent algorithm change.
- In GWMT there are a variety of important search queries that show a change of -100%. These terms do show an average position, but when I manually search for them they are nowhere to be found in Google's results. This is very strange to me; it feels like we've been blacklisted for some of our more important keywords.
- We had a major site relaunch on January 20th (a week ago). However, the downward trend was in place well before this.
The site is www.mycreativeshop.com. To sum it up, I'm extremely confused and very concerned about what this drop is doing to the company. I've never been in this position; we've worked very hard to lay a solid foundation and have always seen a continual, positive traffic increase. Then it seemed to just start turning downward one day and won't stop.
If anybody has suggestions for how to get to the bottom of this and learn what is really taking place, it would be greatly appreciated! Thanks, -J
Algorithm Updates | | cre80 -
Would Google Remove Pages for Inactivity?
Hi, I've been watching the total indexed number for 4 domains that I work with for the last few months. In Google Webmaster Tools, three of them were holding steady up until August-September, when they suddenly started declining by hundreds of thousands of URLs a week. I've asked my IT department, and they say they haven't done anything technically different in the last few months that would affect indexation. I've also searched Google and search marketing blogs to see if anyone else has experienced this, to no avail. As you can see in the image, the "Not Selected" pages have not increased, so it appears this is not due to duplicate content (of which we have a lot). However, the "Ever Crawled" number is increasing. The only reasonable conclusion I can draw is that Google is now de-indexing inactive URLs. Anyone have a better answer?
Algorithm Updates | | OfficeFurn0