Local search vs. Organic Listings
-
Hi ~ I was interested to see whether anyone feels there might be an advantage to keeping a business out of Google's local search listings, or at least out of the 7-pack display. It seems to me that sites that are not listed in the 7-pack can often rank above the maps/7-pack area in the regular organic listings.
Also, is there any way for a homepage to be listed on the first page in both the local search and organic listings? Thanks!
-
With respect to local search, Google is providing the most relevant search results relative to location. So, when considering optimizing for local vs. organic, wouldn't the correct answer be that it depends on the type of business? For example, in a moderately sized metro area like Concord, California, a sandwich shop should weigh heavily in favor of doing everything it can to rank locally, because Google will serve up the most local results when people are looking for a sandwich shop (in the immediate area). A law firm, however, certainly wants clients who are close to its office, BUT it can also take clients from throughout the city. So if a firm is optimized for local search at the expense of organic, wouldn't it be losing out on all of the other prospective clients who search from outside the "local" (immediate) area that Google deems close to the law office? Very few will drive across Concord to get a sandwich (unless it's Togo's….I LOVE Togo's! ;-), but many will make the drive for an attorney if they feel that attorney is the best fit for their complex legal matter.
I have been holding off on local search optimization for this reason for my law firm clients. They rank very strongly for vanity searches, while the "7 pack" sits underneath, all bundled together and competing with each other. Plus, as I suspect and hopefully someone can confirm, as with the example above, my clients show strongly wherever the searcher's location is throughout Concord, while the others (the 7-pack) show only in the immediate proximity of where the query was made. Is that a fair/correct statement?
-
Hi Billy,
I agree with the comments members have left to the tune of the many variables in display. Your search, for example, may show you 2 organic listings followed by 7 local listings followed by several more organic listings, but your client's same search could be showing him a different display. If your business meets guidelines for local inclusion, then I would always recommend participation to the fullest.
Regarding a double local/organic listing, this is a topic that comes and goes. In the past, it was common for dominant businesses to have multiple page one rankings, but around the time of the Venice Update, this became very rare. This was followed by some Local SEOs experimenting with techniques that did sometimes enable them to obtain double page 1 rankings:
http://www.nightlitemedia.com/2012/05/organic-and-google-places-ranking-on-page-1/
These days, I most commonly see double rankings for searches that relate to geographic areas and/or industries where there is low competition. For example, a bakery in a rural area with few or no other local choices may get multiple rankings on page 1, including both local and organic spots. Check out the 2 posts I've linked to for theories on achieving this in more competitive verticals, though.
At the end of the day, though, yes, you are correct that one of Google's common displays at this time puts 1-2 organic listings above the local pack, but I would not see this as a reason not to participate in Local if your business model is eligible.
-
It really varies, as searches are tailored to the user more and more. Consequently, ranking has become less of a horn tooter: when someone tells me they rank for such and such, I ask "where do you rank, and for whom?" Because you may not rank the same way for me. I wouldn't shy away from Local, as Semantic Search is fast becoming Hyper-Local, unless there was an immense amount of data supporting otherwise.
-
That really depends on the area, and how many people outwith the local area search for the keyword.
If your Google location is set to your area (mine is Glasgow), then I get the snippet of 7 sites; however, if I set my location to Edinburgh and search for my keyword in Glasgow, then no Google Places results come up, and your site is likely to appear in a different position.
I actually have a client who ranks mid-pack in Google Places within Glasgow, and top of page 1 for their keyword if you search from outwith Glasgow.
This has been the case for a few months now; it is slightly odd.
I can see your point. However, depending on the area and on the visitors who could potentially search for you from outwith it, you would be holding back your website, which means you could possibly hamper your rankings for someone who searches for your products or services outwith your local area.
Just thought that would be worth mentioning.