Local Map Pack: What's the best way to handle twin cities?
-
Google is increasingly cracking down on bad local results. However, many regions of the US have twin cities or cities that sit right next to each other, like Minneapolis-Saint Paul or Kansas City. According to Google's guidelines, your business should only be listed in the city where it is physically located.
However, we've noticed that results just outside of the local map pack will still rank, especially for businesses that service the home. For example, let's say you run ACME Plumbing in Saint Paul, MN. If you search for "Plumbing Minneapolis," you'll typically see local Minneapolis plumbers first, followed by Saint Paul outliers. Usually the outliers are in the next city over or just outside of the Google map centroid.
Are there any successful strategies to increase the rank of these "Saint Paul outliers" so they compete with local Minneapolis results, or will they always lag behind for the sake of perceived accuracy? We're having to compete against some local competitors that are using very black-hat techniques to rank multiple sites locally (in the map results). They rank multiple sites for the same company under different company names and UPS store addresses. It's pretty obvious, especially when you see a UPS store in the Street View of the address!
We're not looking to bend the rules, but rather to compete safely. Can anything be done in this service-based scenario?
-
Hi,
Two key factors are location and competition. If you are already ranking for your primary area, great. If you are ranking for other locations as an outlier, even better. Getting the outliers to rank higher is tricky, precisely because of location and competition: if there are many other businesses in the area where you are an outlier, location becomes a much stronger signal than most others. If competition were low, your chances would be better.
Here are some ideas:
-
Go after all the local citation directories, plus creative placements like citations in your relevant YouTube videos and other assets. Basically, build a stellar citation profile (there's a small consistency-check sketch after this list). If this helps only a little or not at all, it means location and competition are in the way and are giving precedence to other businesses.
-
If you are showing up in local results as an outlier, eyeballs are already on your listing. You may rank lower than other businesses, but if you can collect significantly more high-quality reviews, that could set you apart and win the conversion despite the outlier position.
-
Try ranking in the organic results for the city + industry you are going for. It might not be too competitive, and that could be a better position than an outlier spot in the map pack.
-
Increase social signals to your website and your Google local page.
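
Following up on the citation idea above: below is a minimal sketch of one way to spot-check whether your NAP (name, address, phone) appears consistently across the directories where you've built citations. The directory URLs, phone number, and street address are hypothetical placeholders, and the verbatim string match is deliberately naive (it won't catch formatting variations), so treat it as a starting point rather than a full citation audit.

```python
# Minimal sketch: spot-check NAP (name, address, phone) consistency across
# citation pages. URLs, phone number, and address are hypothetical placeholders.
import requests

NAP = {
    "name": "ACME Plumbing",
    "phone": "(651) 555-0100",  # hypothetical number
    "address": "123 Example St, Saint Paul, MN 55101",  # hypothetical address
}

# Hypothetical list of citation/directory listing URLs to audit.
CITATION_URLS = [
    "https://www.example-directory.com/acme-plumbing",
    "https://www.another-directory.com/listing/acme-plumbing",
]

def check_listing(url: str) -> dict:
    """Fetch a listing page and report which NAP fields appear verbatim."""
    html = requests.get(url, timeout=10).text.lower()
    return {field: value.lower() in html for field, value in NAP.items()}

if __name__ == "__main__":
    for url in CITATION_URLS:
        try:
            results = check_listing(url)
        except requests.RequestException as exc:
            print(f"{url}: fetch failed ({exc})")
            continue
        missing = [field for field, found in results.items() if not found]
        status = "consistent" if not missing else f"check fields: {', '.join(missing)}"
        print(f"{url}: {status}")
```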
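
On whether anything can be done safely in a service-based scenario: one thing that stays inside the guidelines is LocalBusiness structured data, where the address remains in the city where you are physically located and areaServed declares the nearby cities you serve. This is a minimal sketch with hypothetical business details; it makes your service area explicit to search engines, but I wouldn't expect structured data on its own to move map-pack rankings.

```python
# Minimal sketch: generate LocalBusiness JSON-LD for a service-area business.
# The address stays in the physically located city (per Google's guidelines),
# while areaServed declares nearby cities served. Details are hypothetical.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "Plumber",  # a LocalBusiness subtype
    "name": "ACME Plumbing",
    "telephone": "(651) 555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "Saint Paul",
        "addressRegion": "MN",
        "postalCode": "55101",
    },
    "areaServed": [
        {"@type": "City", "name": "Saint Paul"},
        {"@type": "City", "name": "Minneapolis"},
    ],
}

# Paste the printed block into the page <head> inside
# <script type="application/ld+json"> ... </script>
print(json.dumps(local_business, indent=2))
```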
Hope this helps!
First one is to remove my inner link that's pointing to my homepage with "SEO" anchor, and hope for the best. Second one is to completely remove/delete those two articles and wait for Google to reindex the website and hopefully remove my ban. Do you guy have some other ideas how can I fix this or remove / disavow those +1 or somehow explain to the Google crew / algo that I'm just a humble SEO without any evil thoughts? 🙂 Thank you in advance.0