What Mystery Local SEO Factors Are At Play Here?
-
I'm absolutely perplexed by the ranking factors for Google Maps (and hence also the 3-pack in normal search results).
We're seeing search queries that return a 3-pack and organic results like this, and wondering why these sites are getting 3-pack preference. Note that sites 2 and 3 are no closer to the test user's location than Site 4. All 4 sites have a street address showing.

3-pack result:
#1 - Site 1 - No reviews. Same distance from the user as Site 4.
#2 - Site 2 - 1 review, 1 star. Farther from the user than Site 4.
#3 - Site 3 - 2 reviews, 5-star average. Farther from the user than Sites 1, 2, and 4.
#4 (not shown in 3-pack) - Site 4 - 6 reviews, 5-star average. Closer to the user than Sites 2 and 3.

Organic results below the 3-pack:
#1 - Site 4
#2 - Site 4
#3 - Other site
#4 - Site 1

Sites 2 and 3 are not in the top 10 organic (non-map) results.

So what are the most likely ranking factors keeping Sites 1-3 ranked above Site 4 in the 3-pack/map results? If on-page and backlink factors were at play, you'd expect to see Sites 1, 2, and 3 ranking higher than Site 4 organically as well, and Sites 2 and 3 at least appearing in the top 10 of the organic results. All sites were a similar distance from the user.
-
Well, there's another 'mystery listing' in the same search now. Same case: the business is not in close proximity, has no reviews, and has poor organic rank. It is starting to look like Google does indeed rotate in a random listing, sort of like how it gives newer advertisers/ads some exposure in the AdWords auction to build analytics data and see how effective the ad is (i.e., to see if they can make some money off it).
This sort of makes sense from the 3-pack standpoint, because businesses listed there will obviously get a higher CTR, which would then be self-perpetuating, so to speak: if the 3-pack were based solely on reviews, organic rank, CTR, and other such aspects, the businesses in the 3-pack would almost never change. So Google needs to add some sort of random rotational function to give other businesses a "chance" to demonstrate their relevance. One of the 3-pack spots may therefore be rotating in newer listings despite their having little or poor local ranking signals such as organic rank and reviews. Just my educated guess based on lots of observations.
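To be clear, this is pure speculation about Google's internals, but the rotation described above maps neatly onto a classic epsilon-greedy explore/exploit scheme. A minimal sketch, assuming invented listing names and scores, of how one 3-pack slot could occasionally go to a lower-ranked listing to gather CTR data:

```python
import random

def pick_three_pack(listings, epsilon=0.1):
    """Illustrative epsilon-greedy rotation for the last 3-pack slot.

    listings: list of (name, score) pairs, where score stands in for
    whatever blend of proximity, reviews, and organic strength the
    engine uses. With probability epsilon, the third slot goes to a
    random listing outside the top 3 so it can accumulate CTR data.
    """
    ranked = sorted(listings, key=lambda pair: pair[1], reverse=True)
    pack, rest = ranked[:3], ranked[3:]
    if rest and random.random() < epsilon:
        # Explore: rotate a lower-ranked listing into the last slot.
        pack[2] = random.choice(rest)
    return [name for name, _ in pack]

# Invented scores; run this a few times and the third slot rotates.
sites = [("Site A", 0.9), ("Site B", 0.7), ("Site C", 0.6), ("Site D", 0.4)]
print(pick_three_pack(sites, epsilon=0.3))
```

Run it repeatedly and the third slot occasionally shows a listing the scores alone would never surface, which is exactly the "mystery listing" behavior described above.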
-
In addition, the schema on the contact page uses the address:
2310 Central Ave, Irwindale, CA 91010 USA
Also not Los Angeles.
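For anyone following along, LocalBusiness markup with an address block like that generally looks something like the sketch below. This is a hypothetical reconstruction (the business name is a placeholder), built in Python just to show the shape of the JSON-LD:

```python
import json

# Hypothetical LocalBusiness markup using the Irwindale address quoted
# above; the business name is a placeholder, not the actual company.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Drone Company",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "2310 Central Ave",
        "addressLocality": "Irwindale",
        "addressRegion": "CA",
        "postalCode": "91010",
        "addressCountry": "US",
    },
}
print(json.dumps(markup, indent=2))
```

The point being: whatever locality sits in that addressLocality field is a signal Google can read directly, and here it says Irwindale, not Los Angeles.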
-
I found that the Wild Rabbit company at one point had (and may still have) an address in Duarte, about 20 miles E/NE of Los Angeles.
Domain is registered in San Gabriel.
Business license has Woodland Hills and San Gabriel addresses.
If it's a proximity-to-center-point thing, I would guess their verified address is NOT one of these addresses.
-
Another thing I noticed about the original search is that there is heavy filtering going on at the automatic zoom level of the map. Once you zoom in, tons of other companies appear. So, this could point to Google lacking confidence in these results.
I found this pack interesting enough to share with Mike Blumenthal, who smartly pointed out that Google has no category for "drone company". Just a theory, but this could possibly be forcing Google to rely on the signal of what is in the business title, and the company ranking #1 has added the keyword "drone" to their title (though it doesn't appear to be part of their legal business name, and is therefore, of course, not allowed). So this could have something to do with the mysteriousness of this pack.
To see the centroid of a city, look it up in Google and click on the map. The spot where Google has placed the city name is the centroid. In this case, the centroid of LA is in the extreme east of the city borders. The company we're looking at lists no address on its GMB listing or website. The website just shows a map of LA. The GMB listing describes the business as being in Glendale, which is a bit to the north of the centroid. You could compare this to the revealed locations of the other two companies and see what you think. It's a good question you've raised.
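If you'd rather approximate a centroid programmatically than eyeball the map pin, the coordinates a geocoder returns for the bare city name are a rough stand-in. A sketch using OpenStreetMap's public Nominatim endpoint (the center point Google actually uses may well differ):

```python
import requests

def city_centroid(city):
    """Return the (lat, lon) a geocoder assigns to a bare city name.

    This only approximates a 'centroid'; the point Google uses
    internally may differ, so treat it as a rough reference.
    """
    resp = requests.get(
        "https://nominatim.openstreetmap.org/search",
        params={"q": city, "format": "json", "limit": 1},
        headers={"User-Agent": "centroid-demo"},  # Nominatim requires a UA
        timeout=10,
    )
    resp.raise_for_status()
    top = resp.json()[0]
    return float(top["lat"]), float(top["lon"])

print(city_centroid("Los Angeles, CA"))
```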
At any rate, there seems to be a lack of Google confidence in these results.
-
Yes, that's an interesting observation.
Try searching: drone companies in los angeles ca
White Rabbit is still #2 but at least you see a more representative set of listings in the maps results.
Maybe the stark difference in map results between two very similar searches gives us a clue as to what's going on, but I've yet to figure it out.
One thought is that for any city search, Google has to use some specific location as the "center point" to determine proximity (for those of us users not physically in Los Angeles). Maybe the actual verified address of White Rabbit is nearest to the point Google is using as the center of Los Angeles?
I wonder if there is a way to determine what Google is using as the center point?
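One crude way to test the center-point hypothesis, once you have candidate coordinates (for instance from the geocoding sketch earlier in the thread), is to compare each business's straight-line distance to them. A quick haversine sketch with made-up coordinates:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # Earth's mean radius ~3959 miles

# Made-up center point for LA and two hypothetical business locations:
center = (34.05, -118.24)
for name, loc in {"Business A": (34.14, -118.25),
                  "Business B": (34.03, -117.97)}.items():
    print(name, round(haversine_miles(*center, *loc), 1), "miles from center")
```

Whichever business's verified address minimizes that distance would be the one favored by a pure proximity signal.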
-
Hey, that is a good mystery pack! Something seems odd about it. Do you notice that even when you click through on the map, there are only 3 companies, total, showing in the local finder view? Are there really only 3 drone companies in LA? I find that very hard to believe. For some reason I can't identify, Google is acting like it only knows of 3 such companies that match the query. I was expecting to see dozens of them upon clicking through to the local finder view. So, something is odd there.
-
Okay, for those that want an example, I found a good one.
Search: Los Angeles Drone Companies
Why the heck is Wild Rabbit listed #1 in the 3-pack?
They are at position 13 in the organic SERPs. They have no reviews. They aren't showing their physical address (so there's no pin on the map). They are in the HUGE market of Los Angeles. They don't have the words 'drone' or 'company' in their page title or content (only in their meta description). And they aren't in any of the major directories (other than Yelp) like yellowpages.com or superpages.com.
Baffling
-
Hi SEO1805,
Without seeing the actual result, this is shooting in the dark, but I'd look at filters (Possum), factors like domain authority, and the possibility of spam either positively or negatively impacting the results.
If you can share the SERP you're looking at, that might help us dig down a bit deeper on this.
I also recommend doing a complete competitive analysis between the site ranking #1 and the one you are marketing. (See: https://moz.com/blog/basic-local-competitive-audit)
-
Yes, we all realize there are most likely hundreds of ranking factors, although I would guess the 80/20 rule applies: 20% of the ranking factors make up 80% of the "weight" in the ranking algorithm.
One thing we know for sure is that Google's objective is to provide the most relevant search results given the user's intent. So those of us who are intimately familiar with a specific business or subject-area niche and all the players can compare the results to our human evaluation of the real-world situation. You may know that Company A is the leader in the category, with the best service and value, a long-standing history, great customer kudos, etc. So the results should steer you toward that company.
In my 17 years of experience, I find it remarkable how on the mark Google's organic results are. They really put Bing and other search engines to shame. However, I guess the point of this thread, speaking in general terms now, is that I'm not seeing that same AI ability transferred over to the local rankings in the 3-pack and Maps search results.
To my mind, it's really not rocket science. Their organic algorithm IS rocket science, in my opinion, but tweaking it for local results is a far simpler task by comparison: (a) take advantage of your existing algorithm and make it a large part of your local ranking; (b) make proximity to the user's location intent much stronger; (c) make backlinks from authoritative local directories or organizations stronger (BBB, Dun & Bradstreet, Chamber of Commerce, etc.); (d) add a bit more importance to user reviews.
What other factors could be as important, or more important, than those from a local search standpoint? This should be a fairly straightforward exercise in simple logic.
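Purely to illustrate the kind of blend proposed in (a) through (d) above (the weights and inputs are invented, not anything Google has published), a toy weighted-sum scorer might look like this:

```python
def local_score(organic, proximity, local_links, reviews,
                weights=(0.35, 0.30, 0.20, 0.15)):
    """Toy weighted blend of the four factors proposed above.

    All inputs are assumed normalized to 0..1; the weights are
    invented for illustration and are not Google's.
    """
    w_org, w_prox, w_links, w_rev = weights
    return (w_org * organic + w_prox * proximity
            + w_links * local_links + w_rev * reviews)

# Hypothetical profiles loosely echoing the thread's Site 4 vs. Site 1:
print(local_score(organic=0.9, proximity=0.8, local_links=0.7, reviews=0.9))
print(local_score(organic=0.5, proximity=0.8, local_links=0.2, reviews=0.0))
```

Under these made-up weights, a profile like Site 4's easily outscores one like Site 1's, which is exactly why the observed pack order is so puzzling.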
To me it looks like Google has not invested the same brain power in tweaking its local rankings that it has in its normal organic ranking algorithm, so going forward I would expect comparatively more significant changes to the local search algorithm.
-
First, sorry for the typos. I did come up with one difference I know of: citation age. Site 4 is a newer business, but it is in all the aggregators and has proper local schema markup.
There's no significant pattern regarding page length. That seems to me like it would be another factor used in the regular organic results, so it wouldn't make sense for Site 4 to rank so much better organically if it were being demoted in the 3-pack due to page length. Site 4 does beat out 2 of the 3 sites in the 3-pack for many other similar searches, though. So citation and/or domain age can't be that big of a factor.
I was always under the impression that closeness to the user's location was #1, normal organic ranking factors were the second most important, and reviews were last.
I guess another explanation could be that they do some random round-robin, similar to the AdWords auction, in order to test the CTR of newer ads.