Local search ranking tips needed
-
Hi there, I've been working on my client's website for a while now. About a month ago I created a local business listing in Google for him. I was wondering if there are any new tips to move his business up the rankings in local search? I've researched and only really found information relevant to the old way Google displayed local results.
-
Hi Jo Ann,
Local is not about how many websites you have, but how many physical locations a business has. So, one business location = one local listing. If you've got 2 business locations and they each have a unique phone number, that = 2 listings. You can create these listings via the old Google Places dashboard, and that will automatically generate Google+ Local pages, but multi-location businesses cannot merge the social features of a Google+ Business Page with their Google+ Local page at this point. Hope this helps!
-
Hi Associate, If a company has more than one site, would you put all the sites under one Google Plus page or have different Google Plus pages?
-
Hey Alex, in the U.S. at least, a good trick for cementing Google's understanding of your client's locations is to manually add each place in Google MapMaker. EVEN IF IT'S THERE ALREADY. Just go through the process and try to add it; when it asks if it's a duplicate, accept that option. It'll then say it's discarding your changes, but click OK and then Save. Mike Blumenthal told me at lunch at SMX Advanced this year that he recommends doing that every couple of months even if there don't seem to be changes. If I remember correctly, he said Google trusts the MapMaker updates a LOT. Pretty sure David Mihm told me the same thing. And from my own clients' little projects, I know that they're manually reviewing these, so that makes sense.
See if that helps...it'll just take you 5 minutes to do it, and then a week or two before they review your update.
Google's confidence in a business' current physical location seems to be pretty important to their algo (and it makes sense, as they look foolish anytime they direct a consumer to an empty office!).
MC
-
No problem, good questions! You can include the schema information any way you want, but you'll want to make sure it's nested correctly; see http://schema.org/Place for examples. If you'd like to format itemprops/itemtypes via CSS instead of doing it in-line, that's perfectly fine.
Search engines generally prefer that schema info isn't hidden. There are some fringe exceptions, but the idea is that you only want search engines reading the local info when it's also relevant for users to do so. Miriam's suggestion to show the location data on each page is a good one for small local businesses.
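As a rough illustration of both points, here's a minimal sketch of correctly nested Place markup, styled through a stylesheet rather than hidden (the venue name and address are invented, so swap in real details):

```html
<!-- Hypothetical example: nested Place markup kept visible to users.
     The address sits inside the Place as its own PostalAddress item. -->
<div class="location-info" itemscope itemtype="http://schema.org/Place">
  <span itemprop="name">Example Venue</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="addressLocality">Sunderland</span>,
    <span itemprop="addressRegion">Tyne and Wear</span>
  </div>
</div>

<style>
  /* Style the marked-up info rather than hiding it with display:none */
  .location-info { font-size: 0.9em; color: #555; }
</style>
```

The itemprops don't render any differently on their own, so a class on the wrapper is all you need to make the block fit your design.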
-
Also, is there a certain way I should display this schema info, or can I just hide it with CSS? Is it fully controllable via CSS? Sorry for all the questions.
-
Hi Alex,
It's my pleasure, and thank you for using Q&A!
-
This is all such great advice. I can't thank you enough. I will get at it this week and see if there is any improvement.
Thanks
-
Hi Alex,
So sorry for the acronym.
NAP stands for Name-Address-Phone Number. These are the core factors in your local business data. Everything in local hangs on NAP. It's a standard Local SEO best practice to put the complete NAP of each location on a Contact Page and, typically, in the website footer sitewide.
To further strengthen the signals your NAP is sending to the bots, you can choose to encode your NAP using Schema. Here is the definition of Schema from Schema.org:
What is Schema.org?
This site provides a collection of schemas, i.e., html tags, that webmasters can use to markup their pages in ways recognized by major search providers. Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right web pages.
Many sites are generated from structured data, which is often stored in databases. When this data is formatted into HTML, it becomes very difficult to recover the original structured data. Many applications, especially search engines, can benefit greatly from direct access to this structured data. On-page markup enables search engines to understand the information on web pages and provide richer search results in order to make it easier for users to find relevant information on the web. Markup can also enable new tools and applications that make use of the structure.
A shared markup vocabulary makes it easier for webmasters to decide on a markup schema and get the maximum benefit for their efforts. So, in the spirit of sitemaps.org, search engines have come together to provide a shared collection of schemas that webmasters can use.
Local businesses can use Schema to mark up their NAP. Here is a generator to help you with this:
*Be sure to choose the 'Organizations' tab from the left menu.
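As a concrete illustration, a hand-coded NAP block using Schema.org microdata might look like the sketch below (the address and phone number are invented for illustration, so substitute your client's real NAP):

```html
<!-- Hypothetical footer NAP marked up as a LocalBusiness.
     Name, address, and phone are the three core NAP signals. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Hi Performance 1</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Example Road</span>,
    <span itemprop="addressLocality">Sunderland</span>,
    <span itemprop="postalCode">SR1 1AA</span>
  </div>
  <span itemprop="telephone">0191 000 0000</span>
</div>
```

For a multi-location business, you'd repeat one block like this per location, each with its own unique phone number.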
You might also like this:
http://raventools.com/blog/free-schema-creator/#explanation
Hope this helps!
-
Thanks for your response, it was great. So I guess I learn something new every day... today's lesson: what is a Local Business Schema (NAP)!
Many thanks again
-
Hi Alex,
Thanks for sharing your client's identity. My honest opinion on this is that you need to find a top notch Local SEO in the UK to work with, because a full assessment of this is going to be necessary, and I can't really provide that in the scope of Q&A (plus, I'm most familiar with North American Local SEO). But, here's a few things I've noticed:
-
The website really needs some help!
-
It lacks traditional Local 'hooks', so far as I can see. What are these?
-
There is no NAP in the footer. All 3 addresses should be in the footer, preferably coded in hCard or Schema.
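For reference, the hCard version of one location's NAP would look something like this sketch (the street address and phone number are invented placeholders):

```html
<!-- Hypothetical hCard (microformats) markup for one location's NAP -->
<div class="vcard">
  <span class="fn org">Hi Performance 1</span>
  <div class="adr">
    <span class="street-address">1 Example Road</span>,
    <span class="locality">Sunderland</span>,
    <span class="postal-code">SR1 1AA</span>
  </div>
  <span class="tel">0191 000 0000</span>
</div>
```

One block per location in the footer, each with that location's own details.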
-
I know you've got the 3 city landing pages, but you should have a traditional Contact Us page, as well. My opinion is that the bots look for this, and you should have all 3 NAPs again, coded in hCard or Schema.
-
The content on this site is very thin: only a few words on the homepage, a few on the Roker location page, and none on the 2 other location pages. This truly deserves work.
-
From a traditional SEO and UX viewpoint, the navigation on the site needs quite a bit of work. Any time I land on a page, I have to hit the back button to return to the menu on the homepage, which is not good.
-
I'm also puzzled that of the 11 items in that menu on the homepage, only 3 are links. More evidence that a copywriter needs to be engaged, or the owner needs to get writing, so that the site has good, unique, rich content for every service.
-
Regarding your question - no, having 3 locations should not have any effect on your rankings, provided you are taking the proper steps to distinguish them from one another. Right now, with the lack of NAP and content, you're not making ideal efforts to do so.
This is just a start, typed up from taking a brief look at the client's site. A full investigation of the client's site, plus citations and other factors, is definitely advisable. And, you might also like to check out 51 Blocks' competitive analysis tool for Local Search, though it's brand new and I'm not sure whether Michael Borgelt (the creator) has included the UK yet. Check it out:
http://www.51blocks.com/online-marketing-tools/free-local-analysis/
I hope my straightforward response on the issues I'm seeing at first glance gives you and your client hope. There are far more efforts that can be made here, giving you a real possibility of moving up in the rankings if you assess all areas of possible improvement, implement the new work, and see if it's enough to nudge you up. Good luck!
-
-
So there's still been no improvement in this. There are 3 business locations within my client's company: Hi Performance 1 in Sunderland, UK, Hi Performance 2 in The Barnes, UK, and Hi Performance 3 in Seaham, UK.
Do you think this is having a negative effect on the listing? The main listing should be Hi Performance 1. I've tried all sorts but seem to be going further and further down the list now.
-
Hi Alex,
Though the display has changed from Places to +, the work is basically the same. The aspects of Local SEO that have the most impact and over which you have total control are:
-
The strength and optimization of your website
-
Citation building
-
Social Media participation
-
Linkbuilding
-
Avoiding violations of various guidelines
The important aspect over which you have some control is:
- Review acquisition
Other important ranking aspects over which you have little or no control are:
-
Age of domain/citations/links
-
Proximity to centroid
-
Competitors' efforts
If the goal is to move from #3 to #1, you can work to make a superior effort in the first 2 sections of my list compared to your competitors' efforts and hope this pays off, but the third section is not something you can control. If competitors are older, closer to the centroid, or simply working harder, outranking them becomes quite difficult.
While these recommendations are more or less the same as I would have given prior to the changeover to Google+, people are blogging about their findings as Local changes and grows. If you want to catch up on some of these issues, there are 3 blogs I would recommend you peruse. Go through the posts on these blogs for the past 3-4 months:
http://marketing-blog.catalystemarketing.com/
http://www.ngsmarketing.com/blog/
These are my top 3 picks for really good coverage of the issues, and the three authors also happen to be Google And Your Business Forum Top Contributors, so they not only have their eyes on the ball, they have a special perspective because of their interactive relationship with Google itself.
Hope this helps!
-
-
Thanks for that Matt. Some good resources there; I'll definitely give Whitespark a go. I've already read the post by Rand, but it seems out of date compared to the Google results I get when I search. I don't get any reviews, so I can't see where they're coming from. All I've done so far is make sure all of my address and contact details are the same throughout, and then I typed my competitors' brand names into Google and registered with the same directories they did, as long as they weren't dodgy, spammy directories. Most of them appeared to be relevant to the website I'm working on.
-
Local SEO is all about citations and matching info.
Make sure the address on your contact page matches your Google+ page and your Bing Local page. If you have other "local" listings - Yelp, etc, match those exactly, too. Phone numbers, contact email addresses, physical addresses.
Then you should work on citations.
https://www.whitespark.ca/local-citation-finder/
Read Rand's thoughts here:
http://www.seomoz.org/blog/one-dead-simple-tactic-for-better-rankings-in-google-local
And check out this list of local citation sources:
http://www.poweredbysearch.com/local-seo-citation-sources-us/
That should help local search and if you do it correctly, you'll find yourself in the 7-pack, 4-pack and whatever-pack Local displays on your related searches.
-
Just to add to this: my client's website carries much better weight than its competitors', yet it still ranks on page 3 for local search. What am I doing wrong? The search term I'm using is 'garages Sunderland'.