Best local listings submission service
-
I'm building several local websites and looking for the fastest, most cost-effective service out there to list them in all the major local directories like Yelp, etc.
I've been using UBL, but three months later I still don't see my listings in many of the directories they claim to submit to...
Thanks!
-
I just wanted to get other SEOs' opinions about Localeze.com (a local directory submission tool), recommended at http://getlisted.org/enhanced-business-listings.aspx . We paid $3,500 to use their system and entered a few companies to test. We were very careful to enter the proper, recommended NAP and categories. After five months, we checked the 100+ directories to which they claim the listing data is sent. However, only 5% of those places showed our listings. Has anyone used Localeze's services? What are your experiences? We feel like we wasted our time and money using Localeze.
-
I think of UBL as more of a national listings service. I used Localeze. Check out getlisted.org.
-
Please have a look at this question about local listing services: manual submission trumps auto-submission in many ways. I know you're looking for a quick, down-and-dirty service, but as that post states, you can end up creating more work for yourself by using one. I've used UBL and frankly was unimpressed.
The quickest and most effective way to get quality listings is to submit to the Acxiom, infoUSA, and Localeze databases directly, as they feed many of the directories you should be looking at. This does two things: it lets you control the quality, and it gives you a short list of places to update when your details change.
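Since both answers stress that consistent NAP (Name, Address, Phone) data is the crux of a successful submission, here's a minimal sketch of how you might normalize and compare the NAP records you've entered across databases before submitting. The function names and sample records are my own, not from any of the tools mentioned:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a NAP (Name, Address, Phone) record so listings
    entered in different directories can be compared for consistency."""
    digits = re.sub(r"\D", "", phone)[-10:]  # keep the last 10 phone digits
    return (
        name.strip().lower(),
        re.sub(r"\s+", " ", address.strip().lower()),  # collapse whitespace
        digits,
    )

def nap_matches(record_a, record_b):
    """True if two (name, address, phone) tuples normalize identically."""
    return normalize_nap(*record_a) == normalize_nap(*record_b)

a = ("Acme Plumbing", "123 Main St, Springfield, MA", "(413) 555-0199")
b = ("acme plumbing", "123  Main St,  Springfield, MA", "413-555-0199")
print(nap_matches(a, b))  # True: same listing despite formatting differences
```

Running a check like this over each database's copy of your listing is a cheap way to catch the formatting drift that causes directories to treat one business as two.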
Related Questions
-
Local Google vs. default Google search
Hello Moz community, I have a question: what is the difference between a local version of Google and the default Google in terms of search results? I have a Mexican site that I'm trying to rank on www.google.com.mx, but my rankings are actually better when I check my keywords on www.google.com. The domain is a .mx site, so wouldn't it make more sense for this page to rank higher on google.com.mx rather than on the default Google site, which in theory has a "broader" scope? Also, what determines whether a user gets automatically directed to a local Google version vs. staying on the default one? Thanks for your valuable input!
-
Best Way to Break Down Paginated Content?
(Sorry for my English.) I have lots of user reviews on my website, and in some cases there are more than a thousand reviews for a single product/service. I am looking for the best way to break these reviews down into several sub-pages. Here are the options I thought of:
1. Break reviews down into multiple pages/URLs: http://www.mysite.com/blue-widget-review-page1 , http://www.mysite.com/blue-widget-review-page2 , etc. Each page would be indexed by search engines. Pros: all the reviews get indexed. Cons: it will be harder to rank for "blue widget review" as there will be many similar pages.
2. Break reviews down into multiple pages/URLs with noindex + a canonical tag: same URLs as above, but each sub-page would be set to noindex and its canonical tag would point to the first review page. Pros: only one URL can potentially rank for "blue widget review". Cons: sub-pages are not indexed.
3. Load all the reviews onto one page and handle pagination with JavaScript: each "page" of reviews would be loaded into a different div, shown or hidden with JavaScript as the user browses through the pages. Could that be considered cloaking?!? Pros: all the reviews get indexed. Cons: large page size (KB) - maybe too large for search engines?
4. Load only the first page and load sub-pages dynamically via AJAX: display only the first review page on initial load, then use AJAX to load additional reviews into the div. It would be similar to blog commenting systems where you have to click "Load more comments" to see all the comments. Pros: fast initial loading time + faster loading of sub-pages = better user experience. Cons: only the first review page is indexed by search engines.
My main competitor, who's achieving great rankings (no black hat, of course), is using technique #3. What's your opinion?
-
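Option 2 in the pagination question above (noindex the sub-pages, canonical to page 1) can be sketched as a small helper that emits the head tags for each review page. The function name and URL scheme here are illustrative, not from the original post:

```python
def review_page_meta(base_url, page):
    """Return the <head> tags for a paginated review page under option 2:
    page 1 stays indexable; every sub-page is noindexed and its canonical
    points back at page 1."""
    tags = []
    if page > 1:
        tags.append('<meta name="robots" content="noindex, follow">')
    # Every page in the series canonicalizes to the first page.
    tags.append(f'<link rel="canonical" href="{base_url}-page1">')
    return "\n".join(tags)

print(review_page_meta("http://www.mysite.com/blue-widget-review", 2))
```

Note that `follow` is kept in the robots directive so link equity from reviews on sub-pages can still flow, which is usually the point of this setup.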
When to re-submit for reconsideration?
Hi! We received a manual penalty notice. A couple of years ago an SEO company built some links for us on blogs. Currently we have only about 95 of these links left, which are pretty easily identifiable by the anchor text used and the blogs or directories they originate from. So far we have seen about 35 of them removed, and we have made two contact attempts to each site via removeem.com. So, how many contact attempts do you think need to be made before submitting a reconsideration request? Are two enough? Also, should we use the disavow tool on the remaining 65 links? Every one of them is from either a Filipino blog page or a random article directory. Finally, do you think we are still getting juice from these links? I.e., if we do remove or disavow these anchor-text links, are we actually going to see a negative impact? Thanks for your help and answers!! Craig
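For the disavow route mentioned in this question: the disavow file is plain text, with `#` comment lines and one `domain:` (or full URL) entry per line. A minimal sketch for building one from a list of bad domains - the helper name and sample domains are hypothetical:

```python
def build_disavow_file(domains, comment="Paid blog/directory links; removal requested twice"):
    """Build the contents of a disavow file: '#' comment lines,
    then one 'domain:example.com' entry per bad domain."""
    lines = [f"# {comment}"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]  # dedupe, stable order
    return "\n".join(lines) + "\n"

print(build_disavow_file(["spammyblog.example.ph", "articledump.example.com"]))
```

Using `domain:` entries rather than individual URLs is the safer choice when, as here, everything from those blogs and directories is unwanted.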
-
Google local listings
I'm working with a gutter installation company, and we're ranking for all the top keywords in Google. The only thing we're not ranking for is the map results for the keyword "gutter ma". We're located in Springfield, MA, and I think Google favors certain areas closer to Boston, since that's treated as more the center of Massachusetts. What can I do to improve my rankings in Maps for this keyword? I know it won't work with a PO box, since I need to confirm an address. Thanks
-
Give your top 3 best-optimized websites
Hey gents & ladies, what is your top 3 of websites that you consider well optimized? Tell me why you think each website is that good, and note the keywords.
-
Multi-language URL best practices
We have different content per language (FR, EN); it is not duplicated, and the two versions are completely different. Which is better for the URL: a language subdomain or a folder, i.e. fr.mycompany.com or mycompany.com/fr/?
-
Best way to condense content on a page?
We want to add a video transcript to the same page as the video, but it doesn't really fit the design of the page. Is it fine to use CSS/divs for either a "click to read full transcript" link or a scroll box?
-
How to Submit XML Site Map with more than 300 Subdomains?
Hi,
I am creating sitemaps for a site that has more than 500 subdomains. The page count varies from 20 to 500 per subdomain, and more pages will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain and reference it in a separate robots.txt file, e.g. http://windows7.iyogi.com/robots.txt with the corresponding sitemap at http://windows7.iyogi.com/sitemap.xml.gz . Currently we have only one robots.txt file covering the main domain and all subdomains. Please tell me: should I create a separate robots.txt and XML sitemap for each subdomain, or keep a single file? Creating a separate XML sitemap for each subdomain doesn't seem feasible, as we would have to verify each one in GWT separately. Is there an automatic way to do this, and do I have to ping separately when I add new pages to a subdomain? Please advise.
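For the per-subdomain pattern described in this question, each subdomain serves its own robots.txt that points at its own sitemap, and both can be generated from the same page data. A rough sketch in Python - the hostnames and helper names are illustrative, not from the original post:

```python
from xml.sax.saxutils import escape

def subdomain_sitemap(subdomain_host, paths):
    """Build a sitemap.xml body for one subdomain from its page paths."""
    urls = "".join(
        f"<url><loc>http://{subdomain_host}{escape(p)}</loc></url>"
        for p in paths
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{urls}</urlset>")

def subdomain_robots(subdomain_host):
    """Build a robots.txt for one subdomain that advertises its sitemap,
    so crawlers can discover it without a separate ping."""
    return (f"User-agent: *\nAllow: /\n"
            f"Sitemap: http://{subdomain_host}/sitemap.xml\n")

print(subdomain_robots("windows7.example.com"))
```

Regenerating and redeploying these two files per subdomain whenever pages are added is the usual way to automate this; the `Sitemap:` line in robots.txt lets crawlers pick up the new sitemap on their own.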