Local Pages Help
-
Hi All,
I have a client who is looking heavily at Google+ Local.
He has a main business, with a number of locational franchises.
He has created a local listing for each of these franchise pages.
The question he has asked is 'How do I improve my rankings for these local listings?'
Some of them seem to rank well without any work done to improve them, but others do not.
My question is: what can we do to improve the rankings of Google+ Local listings?
This has changed greatly since I last looked into it, so anyone who can say 'right, this is what you need to do to improve Google+ Local listings' would be greatly appreciated!
Many thanks, guys!
-
Hi,
I think you need to check this -
http://www.davidmihm.com/local-search-ranking-factors.shtml
This is the most comprehensive list of local search ranking factors that I have come across. Hope you will find it helpful too.
Related Questions
-
Any SEO disadvantages with creating pages under a directory page which doesn't exist?
Hi, let's say we are going to create pages under the URL path www.website.com/directory/sub-pages/. If the page www.website.com/directory/ doesn't exist or is redirected, will the pages created under this path have any issues in terms of SEO? We will link to these pages from somewhere on the website and are planning to redirect /directory/ to the homepage. Suggestions please.
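If the /directory/ redirect were handled in Apache, an exact-match rule would avoid accidentally redirecting the sub-pages as well. A minimal sketch, using the hypothetical paths from the question:

```apache
# Redirect only /directory/ itself to the homepage.
# A plain "Redirect 301 /directory/ /" matches by prefix and would
# also redirect /directory/sub-pages/, which is not wanted here.
RedirectMatch 301 ^/directory/$ https://www.website.com/
```

RedirectMatch takes a regular expression, so the `$` anchor keeps the sub-pages untouched while the parent path alone is redirected.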
Algorithm Updates | vtmoz -
Any suggestions why I would rank #1 on Google but be on the 3rd page for Bing/Yahoo?
Currently the site I'm working on ranks very well on Google, but when we cross-reference with Yahoo and Bing we are basically in the graveyard of keywords (bottom of the 3rd page). Why would that be? Any suggestions or things I can do to fix or troubleshoot it? Here are some things I can think of that might affect this, but I'm not sure:
1. Our sitemap hasn't been updated in months and URL changes have been made.
2. On-site optimization for Yahoo and Bing differs from Google?
3. Bing is just terrible in general?
4. Inbound links? This one doesn't make sense, though, unless the search engines weigh links in different ways.
All jokes aside, I would really appreciate any help, as the few top-ranked keywords we currently have account for about 30% of our organic traffic, and it would have a huge effect on the company if we were able to rank as we should across all platforms. Thanks!
Algorithm Updates | JemJemCertified -
Lost 75% of my traffic on Oct 25, help appreciated
So I've been running coolquotescollection.com since 1997 (!) as a hobby project. I lost about 75% of my organic search traffic on the 25th of October, literally overnight. I've done a lot of research but I still don't know why I was penalized (image attached). I naturally thought this was because of Penguin (Penguin was Oct 17; my drop was Oct 25). However, after checking backlinks I only discovered 11 domains with about 100-400 links each; the major ones were forum signatures and blog sidebars, and 6 domains were spam sites/directories. They almost exclusively used the same anchor text (domain name or similar), so this doesn't seem like a black-hat attack. Some of the directories did use keywords in their URLs, however (like "funny quotes").
Algorithm Updates | Sire
1. Is this really enough for such a heavy penalty?
I added these domains to a disavow file today; I'm aware this might take weeks or months to have an effect. I've also automated things so that pictures get uploaded to my Facebook page with a link back to my site. This started in early 2014.
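For reference, Google's disavow file is a plain-text list of full URLs and `domain:` lines, with `#` comments. A sketch with placeholder domains, not the poster's actual list:

```text
# Spam directories found in the backlink audit (hypothetical examples)
domain:spam-directory-example.com
domain:another-spam-example.net
# A single URL can also be disavowed instead of a whole domain
http://forum-example.com/profile/spammy-signature
```

Using `domain:` disavows every link from that host, which suits the forum-signature and sidebar cases where hundreds of links come from one domain.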
2. Can Facebook links be considered link spam?
They don't even show up in Webmaster Tools.
Example: https://www.facebook.com/CoolQuotesCollection/photos/a.510328825689624.1073741825.326096120779563/615403025182203/?type=1&theater
I analyzed keywords, and the major ones dropped between 2 and 6 positions. Notable exception: I seem to still rank #1 for "cool movie quotes" even though the page is not optimized for that keyword. Moz warned about over 5,000 pages with duplicate content; it was a single page that used a query-string URL parameter, which I have excluded in Webmaster Tools. I have now added a canonical link on these pages. Examples:
http://coolquotescollection.com/Home/TShirts?url=http-url-example...
http://coolquotescollection.com/Home/TShirts?url=http-another-url.......
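The canonical fix described here amounts to a single tag in the head of each parameterized URL, pointing at the clean version of the page. A sketch, with the target URL inferred from the examples above:

```html
<!-- On every /Home/TShirts?url=... variant, point search engines at the
     clean page so the variants are not treated as duplicate content -->
<link rel="canonical" href="http://coolquotescollection.com/Home/TShirts" />
```

With this in place, the thousands of query-string variants consolidate onto one indexable URL, regardless of whether the URL-parameter exclusion in Webmaster Tools is honored.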
3. Could the Google algorithm penalize this even though I have excluded the "url" parameter?
I have a lot of internal links in the page navigation. Can this cause problems? See the very bottom of this page, where I have 94 links, for example: http://coolquotescollection.com/laughs
4. Could a lot of internal links (navigation to page numbers) be the problem?
Some more facts: the site is http://coolquotescollection.com/ and the domain is 14 years old; the site launched in Sep 1997, a year before Google (not relevant, but you might understand why this is important to me). I haven't done any SEO work for at least 12 months, probably closer to two years, and the only SEO work I've ever done is on-page optimization: no link building at all, no black-hat stuff. I automatically build a sitemap that contains all pages, referenced from http://coolquotescollection.com/robots.txt. I've used Webmaster Tools for years and haven't gotten any warnings. I checked backlinks there, as well as here on Moz and on Ahrefs. I'm annoyed that a quality content site can be penalized so hard (a 75% drop) when there are no, or only minor, issues. I'm just lucky this is not my business site; if it were, I would have gone out of business. Any help in this matter would be greatly appreciated!
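For reference, a sitemap is usually exposed to crawlers via a Sitemap line in robots.txt, along these lines (the sitemap filename below is an assumption, not taken from the actual site):

```text
User-agent: *
Disallow:

Sitemap: http://coolquotescollection.com/sitemap.xml
```

The empty Disallow line allows full crawling, while the Sitemap directive lets any crawler discover the complete page list without guessing URLs.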
Keyword stuffing in URLs? Eek. Help please.
Okay, so I work as a content manager in the travel industry and we're redoing our site pretty much from scratch, including the SEO, anchor text/route URLs, etc. I am struggling with one particular thing: if all my URLs have similar keywords, i.e. example.com/atlanta-trip and example.com/boston-trip and so on for every destination, will using "trip" in the URL be seen by Google as keyword stuffing? Should I make my URLs more diverse? My gut feeling is no, based on all the Moz, Google, and other SEO research I've done, because it's all relevant to the content and the user experience, but I'd like to be sure, since we really can't afford to get penalized by Google... again.
Algorithm Updates | hpeisach -
Dropped from Universal Result: Local
For quite some time our Google Places listing has been in the Universal Results (for this keyword there is a 7-pack result). Which was great: we had a PPC ad at the top of the page, we were 3rd in the Universal Results (there were 3 Places listings before the natural results), and we were 6th in the natural results, meaning we were on the first page 3 times. Which means a happy boss... and lots of traffic.
The old Places listing was linked to our new Google+ page pending the eventual demise of Places and the merge. The merge has happened, all information from the Places listing has migrated (apart from reviews and photos??), and the Places listing has been deleted (the URL returns a 404 error). The problem is that now my Google+ page is not even within the first 2 or 3 pages of Places results, never mind in the Universal Results. So it would appear the rank/authority that the Places listing had hasn't been transferred to the Google+ page.
My competitors, who were 1 and 2 in the Universal Results above the natural results and who have Google+ pages with NOTHING on them bar their names, are still there! Why would I be dropped when my Google+ page has more info, more followers, more photos, and more relevant content than my 2 competitors (they don't have any content)?
It seems I've been penalised. Somebody suggested it could be that I had the keyword twice in my "About" and twice in my "Introduction" info. I thought the loss of the reviews might be it too, but neither of the businesses now occupying the first 3 spots has any reviews at all. Has anybody else suffered from this? Does anybody have any other suggestions as to why I might have been dropped so dramatically in the Places listings? (My SERP listing is unaffected for this keyword.) A keyword being mentioned twice hardly seems like "stuffing"!
I'm actually not too concerned about the Places ranking, which is not a great driver of traffic, but appearing in the Universal Results obviously did drive traffic, and to appear in the Universal Results I've now got about 30 positions to climb.
The whole Google+ Local / Google Places thing has been a nightmare from start to finish. Thanks in advance for any help or advice!
Algorithm Updates | MarbellaSurferDude -
Local search ranking tips needed
Hi there, I've been working on my client's website for a while now. About a month ago I created a local business listing in Google for him. I was wondering if there are any new tips to get his business up the rankings in local search? I've researched, but only really found information relevant to the old way Google displayed local search results.
Algorithm Updates | SeoSheikh -
Organic ranking vs Local ranking
After reading this blog entry from Dr Pete on Mirametrix, my question would be:
Algorithm Updates | echo1
What's more important for a local company: being in the 7-pack or in the top 10 organic results? Which one attracts more clicks? Has optimization for local rankings just become more important than traditional SEO? -
Google said that low-quality pages on your site may affect rankings of other parts
One of my sites got hit pretty hard during the latest Google update. It lost about 30-40% of its US traffic, and the future does not look bright considering that Google plans a worldwide roll-out. The problem is, my site is a six-year-old, heavily linked, popular WordPress blog, and I do not know why the algorithm considers it low quality. The only reason I came up with is the statement that low-quality pages on a site may affect other pages (I think it was in the Wired article). If that is so, would you recommend blocking and de-indexing WordPress tag, archive, and category pages from the Google index? Or would you suggest waiting a bit before doing something that drastic? Or do you have another idea of what I could do? I invite you to take a look at the site: www.ghacks.net
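If one did decide to de-index the tag, archive, and category pages, the usual mechanism is a robots meta tag in the head of just those pages. A sketch only; whether this is advisable is exactly the question being asked:

```html
<!-- Emitted only on tag, archive, and category pages: keep them out of
     the index while still letting crawlers follow the links they contain -->
<meta name="robots" content="noindex,follow">
```

Unlike a robots.txt Disallow, this lets Google crawl the page and drop it from the index while the articles linked from it continue to be discovered.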
Algorithm Updates | badabing