Local SEO Best Practices
-
Hello Everyone,
I'm new to SEOmoz and looking to use it as a tool to really help me, and eventually I can help others. I am a web developer with some online marketing experience. I did Local SEO a few years ago, and things have really changed since then. I know the Panda and Penguin updates are really putting a hurting on directory submissions, Google no longer shows "Citations" on its Places pages, and there have been many other changes.
With that being said, what are some best practices for Local SEO? I am a propeller head by nature, but am also very creative when I need to be. I have potential sites to market, anywhere from holistic medical doctors and plastic surgeons to community blogs, auto repair shops, and law firms (to give you some perspective).
I also read Danny Dover's book to learn some more about SEO; the one thing that is still unclear to me is how to acquire quality links.
I would really appreciate any perspective on this; every little bit helps.
Zach Russell
-
Hi Zach,
Welcome back to the Local jungle! The most significant change in Local over the past year is how big a part the website itself now plays in a business's overall ability to rank. When you were doing Local SEO a few years back, the SERPs were filled with businesses ranking highly (in the old pack-style results) that didn't even have websites. That's rare now. Instead, you've got to have a great website, a clean Places record, review acquisition, citation acquisition, and, in competitive verticals or populous regions, good links.
Local business index listings are the first place to start with link building. I recommend that you read David Mihm's Local Search Ranking Factors report from 2011 to catch up on some of the top local business indexes. Also read Myles Anderson's Top 50 Citation Sources for the USA and UK at Search Engine Land. While not all citation sources are link sources, many of them are.
Beyond this, what Matt Williamson is describing is pretty much what you'll be doing: building quality content that earns links. And definitely take the time to memorize the current Google Places Quality Guidelines, which have changed rather significantly over the past few years. Check back often, because they keep changing them!
I hope this gets you off to a good start. Good luck and have fun!
Miriam
-
Local is the one area where I always recommend using local directories (Google Places, Yahoo Local, YellowPages, Hot Frog, etc.). The directory listings may not have much link value, but they definitely work to drive traffic, and they usually rank well on their own.
-
Hi Zach,
Whether it is local or international, the process for gaining quality links is the same: you need to create great content and then make sure it gains the right exposure. This will bring you the quality links you need to increase your site's rankings. There are lots of methods for creating decent content, such as product/service reviews, blogs (including inviting guest posts from authorities in your niche), examples of work you have carried out (case studies), content curation (see the SEOmoz blog), and more.
To gain exposure, it is important to interact with as many potential customers, and others involved in your niche, as possible; this is where social networking comes in. Building up your social following and then interacting and building relationships with authority figures related to your niche will let you expose them to your content by passing on links through tweets and the like. If you have built good relationships, your followers will share your content with their own networks, and that increased exposure will lead to likes, +1s, tweets, and links from other sites. Make sure you include social bookmarking buttons on your website to help facilitate this. If you are blogging and have built relationships with authority figures, you can then ask them to write a guest post on your blog (you might need to offer an incentive). They will bring the attention of their followers with them, which will lead to more exposure and links, improving the authority of your site.
For local SEO it is especially important to network online and identify local influencers, as this will bring you business. I have found that social networking lets online word of mouth about a decent local business spread at a much quicker rate; this channel is flourishing!
I hope this helps. One suggestion: look into the link building discussions on here for more ideas.
Related Questions
-
Audit my SEO Project
Hey professionals, I work on "MyInfo Community" as an SEO worker. Can anyone help me audit this project? I am a newbie in this field. Thanks!
Intermediate & Advanced SEO | smartpoedgr
-
Webjaguar SEO shortcomings
Hey All. I have a client whose ecommerce site is built in Webjaguar. Does anyone have experience with this platform? It appears to be loaded with technical SEO challenges (duplicate content, weird URLs, etc.). Interestingly, when I Google "webjaguar SEO challenges" and things like that... nothing comes up. Suspicious, methinks. I appreciate any thoughts from SEO folks. Thanks!
Intermediate & Advanced SEO | JBMediaGroup
-
Robots.txt Blocking - Best Practices
Hi All, We have a web provider who's not willing to remove the wildcard line of code blocking all agents from crawling our client's site (User-agent: *, Disallow: /). They have other lines allowing certain bots to crawl the site, but we're wondering if they're missing out on organic traffic by having this main blocking line. It's also a pain because we're unable to set up Moz Pro, potentially because of this first line. We've researched and haven't found a ton of best practices regarding blocking all bots, then allowing certain ones. What do you think is a best practice for these files? Thanks!

User-agent: *
Disallow: /

User-agent: Googlebot
Disallow:
Crawl-delay: 5

User-agent: Yahoo-slurp
Disallow:

User-agent: bingbot
Disallow:

User-agent: rogerbot
Disallow:

User-agent: *
Crawl-delay: 5
Disallow: /new_vehicle_detail.asp
Disallow: /new_vehicle_compare.asp
Disallow: /news_article.asp
Disallow: /new_model_detail_print.asp
Disallow: /used_bikes/
Disallow: /default.asp?page=xCompareModels
Disallow: /fiche_section_detail.asp
Intermediate & Advanced SEO | ReunionMarketing
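Worth noting for context: robots.txt isn't read top to bottom as an allow/deny chain; each crawler obeys only the group that matches its own user-agent, so named bots like Googlebot skip the User-agent: * groups entirely. The usual pattern for "block everyone except a few named bots" is therefore a much smaller file. A minimal sketch, reusing the bot names from the question (note that the original file's two conflicting User-agent: * groups are ambiguous, and some parsers will honor only one of them):

User-agent: *
Disallow: /

User-agent: Googlebot
User-agent: bingbot
User-agent: Yahoo-slurp
User-agent: rogerbot
Disallow:

An empty Disallow line means "nothing is blocked" for those named bots, while every other crawler falls through to the * group and stays out.
-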
Negative SEO + Disavow
My site is very new (~1 year old), but due to good PR we have gotten some decent links and are already ranking for a key term. This may be why someone decided to start a negative SEO attack on us. We had fewer than 200 linking domains up until 2 weeks ago, but since then we have been getting 100+ new domains/day with anchor texts that are either targeted to that key term or are from porn websites. I've gone through the links to get ready to submit a disavow... but should I do it? My rankings and site traffic have not been affected yet. Reasons for my hesitation: 1. Google always warns against using the disavow tool and says "you shouldn't have to use it if you are a normal website" (sensing 'guilty-until-proven'). 2. Some say Google is only trying to get the data to see if there are any patterns within the linking sites. I don't want the site owners to get hurt, since the villain is someone else using XRumer to put spammy comments on their sites. What would you do?
Intermediate & Advanced SEO | ALLee
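For reference, should a disavow ever be submitted, the file Google expects is plain text with one entry per line: a domain: prefix disavows an entire linking domain, bare URLs disavow individual pages, and lines starting with # are comments. A minimal sketch with made-up example domains:

# spammy comment links placed by a third party via XRumer
domain:spam-network.example
domain:adult-links.example
# disavow a single page rather than the whole (innocent) domain
http://hacked-blog.example/forum/thread?id=123

Because the worry here is about innocent site owners, disavowing individual URLs instead of whole domains keeps the file as narrow as possible.
-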
Local and Organic Listings
Hi, My client has a number of stores across the country (UK), and ideally I would like them to appear in both the local and organic listings. At the moment I appear, more often than not, on page one for one or the other, but I have noticed that some pages appear in both. I understand that Google will not place a listing for the same page in both local and organic, so I need to optimise a page on the site for organic and point my local listing to a different page (the home page?). On some results, though, I am seeing my local result appear with the home page URL listed, but the actual link points to the internal store page, which is the same page that is appearing in the organic listing (both on page one). Other local listings of mine appear with the store page URL showing in the result. I haven't set anything up differently for these stores. Can anyone explain why this is happening? Thanks, Dan
Intermediate & Advanced SEO | SEOBirmingham81
-
ECommerce product listed in multiple places, best SEO practice?
We have an eCommerce site we have built for a customer, and products are allowed to appear in more than one product category within the site. Now, I know this is a bad idea from a duplicate content point of view, but we are going to allow the customer to select which of the multiple categories a product appears in will be the default category. This gives us a way of defining the default URL for a product. So am I correct in thinking that on all the other URLs where the product appears we should add a rel=canonical pointing to the default URL to stop duplicate content? Is this the best way?
Intermediate & Advanced SEO | spiralsites
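To illustrate the approach described above: each non-default category URL for a product would carry a canonical link element in its head pointing at the default URL. A minimal sketch with hypothetical URLs:

<!-- served at /gifts/blue-widget and any other non-default category path -->
<link rel="canonical" href="http://www.example.com/tools/blue-widget" />

The default page can carry a self-referencing canonical, and listing only the default URLs in the XML sitemap reinforces the same signal.
-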
SEO from links in frames?
A site was considering linking to us. Their web page is delivered entirely via frames. Humans can see the links on the page, but they're not visible in the source. I'm guessing this means Google can't detect the links and there is no SEO effect, but I wanted to confirm. Here's the site: http://www.uofc-ulsa.tk/ Example links are the Princeton Review and Kaplan in the right sidebar. Here's the source code: view-source:http://www.uofc-ulsa.tk/ Do those links have any SEO impact?
Intermediate & Advanced SEO | lighttable
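To illustrate why the links aren't visible in the source: a framed page's outer HTML contains only frame references, and the actual links live in separate child documents. A minimal sketch of the structure (file names hypothetical):

<frameset cols="75%,25%">
  <frame src="content.html">
  <!-- the sidebar links actually live inside this separate document -->
  <frame src="sidebar.html">
  <noframes>
    <!-- only links placed here would appear in the outer page's source -->
    <a href="http://www.example.com/">Example link</a>
  </noframes>
</frameset>

Search engines have long been able to follow frame src URLs, so the links may still be crawled, but they would be associated with the child document rather than the page a visitor sees; how much value they pass is the open question.
-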
Best practices for handling https content?
Hi Mozzers - I'm having an issue with https content on my site that I need help with. Basically, we have some pages that are meant to be secure (cart pages, auth pages, etc.), and then the rest of the site isn't secured. I need those pages to load correctly and independently of one another so that we are using both protocols correctly. The problem is that when a secure page is rendered, the resources behind it (scripts, etc.) won't load using the unsecured paths that are currently in our master page files. One solution would be to serve the entire site over https only; however, this really scares me from an SEO standpoint, and I don't know if I want to put my eggs in that basket. Another solution is to structure the site so that secure pages are built differently from unsecured pages, but that requires a bit of restructuring and new SOPs to be put in place. I guess my question is really about best practices when using https. How can I avoid duplication issues? When do I need to use rel=canonical? What is the best way to handle this and avoid heavy maintenance moving forward?
Intermediate & Advanced SEO | CodyWheeler
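One common fix for the specific symptom described (assets hard-coded with http:// paths failing on secure pages) is to reference shared resources with protocol-relative or explicit https URLs in the master page files, so the same markup works under both schemes. A minimal sketch with hypothetical paths:

<!-- triggers mixed-content problems when the page is served over https -->
<script src="http://www.example.com/js/site.js"></script>

<!-- protocol-relative: inherits whichever scheme the page was requested over -->
<script src="//www.example.com/js/site.js"></script>

For the duplication worry, the usual pattern is a rel=canonical on any page reachable over both schemes, pointing at whichever version should rank.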