Best local listing submission service
-
I'm building several local websites and am looking for the fastest, most cost-effective service out there to list them in all the major local directories (Yelp, etc.).
I've been using UBL, but three months later I still don't see my listings in many of the directories they claim to submit to...
Thanks!
-
I just wanted to get other SEOs' opinions about Localeze.com (a local directory submission tool) recommended at http://getlisted.org/enhanced-business-listings.aspx . We paid $3,500 to use their system and entered a few companies to test it. We were very careful to enter the proper, recommended NAP and categories. After five months, we checked the 100+ directories they claim the listing data would be sent to. However, only 5% of those places showed our listings. Has anyone used Localeze's services? What are your experiences? We feel like we wasted our time and money using Localeze.
-
When I think of UBL, it's more for national listings. I used Localeze. Check out getlisted.org.
-
Please have a look at this question about local listing services: manual submission trumps auto-submission in many ways. I know you're looking for a quick, down-and-dirty service, but as that post states, you can end up creating more work for yourself by using one. I've used UBL and frankly was unimpressed.
The quickest and most effective way to get quality listings is to submit directly to the Acxiom, infoUSA, and Localeze databases, as they feed many of the directories you should be looking at. This does two things: it lets you control the quality, and it gives you a short list of places to update when anything changes in the future.
Related Questions
-
Two websites, one company, one physical address - how to make the best of it in terms of local visibility?
Hello! I have one company which will be operating in two markets: printing, and website design/development. I'm planning on building two websites, one for each market. But I'm a bit confused about how to optimize these websites locally. My thought is to use my physical address for one website (build citations, get listed in directories, etc.) and a PO Box for the other. Do you think there is a better approach?
Technical SEO | VELV1
-
Google Not Indexing Submitted Images
Hi Guys! My question isn't too dissimilar to one asked a couple of years ago regarding Google and image indexing, but when I put my web address into a Google image search, I get a return of just 15 images, so something isn't right. Five months ago I submitted our 'new' site to Google Webmaster Tools; we have just moved it onto the Shopify platform. Shopify is good at providing places to add titles and alt tags, and likewise we fill them in (so that box is ticked!). However, I have noticed over the last couple of months that despite 161 images being submitted, only 51 have been indexed. Furthermore, as I said earlier, when you put our site, site:http://www.hartnackandco.com, into Google Images, it returns a total of only 15 images. Any suggestions and help would be wonderful! Cheers, Nick
Technical SEO | nick_HandCo0
-
Dynamic URL best approach
Hi. We are currently making changes to our travel site whereby, if someone does a search, that search can be stored, and the user can also paste the URL into their browser to find the same search again. The URL will be dynamic for every search, so in order to stop duplicate content I wanted to ask what the best approach to creating the URLs would be. An example of the URL is: package-search/holidays/hotelFilters/?depart=LGW&arrival=BJV&sdate=20150812&edate=20150819&adult=2&child=0&infant=0&fsearch=first&directf=false&nights=7&tsdate=&rooms=1&r1a=2&r1c=0&r1i=0&&dest=3&desid=1&rating=&htype=all&btype=all&filter=no&page=1 I wanted to know if anyone has previous experience with something like this and what the best option for SEO would be. Would we need to create the URL with a # (as I've read Google stops crawling after the #)? Should we block the folder in robots.txt? Are there any other areas I should be aware of in order to stop duplicate content and 404 pages once the URL/holiday search is no longer valid? Thanks, E
Technical SEO | Direct_Ram0
-
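On the robots.txt-vs.-# question above: a common alternative (sketched here as a hypothetical helper, not something any particular platform provides) is to normalise every search URL into one stable form and emit that in a rel="canonical" link, so the endless parameter permutations all consolidate to a single indexable address:

```python
# Hypothetical helper: collapse equivalent search URLs into one canonical
# form by dropping empty parameters and sorting the rest alphabetically.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Return a stable canonical form of a dynamic search URL."""
    parts = urlsplit(url)
    # Keep only parameters that have a value, then sort for a stable order.
    params = sorted((k, v) for k, v in
                    parse_qsl(parts.query, keep_blank_values=True) if v)
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(params), ""))
```

The resulting URL would go into a `<link rel="canonical" href="...">` tag on every variant of the same search, which avoids both the crawling loss of a # fragment and the bluntness of a robots.txt block (a blocked page can still end up indexed from links, just without its content).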
Internal links best practices
In looking at the inbound links to a client's home page, I see that the link from each page of the website back to the home page is an image, and its alt text is "Home." I have a few questions about this and would appreciate help understanding best practices:
-- Does it matter that the link back to the home page is an image (presumably the client's logo)?
-- If we keep the image link, wouldn't it be better to use the client's company name as alt text rather than "Home"?
-- Should I recommend using an HTML text link back to the home page, with the company name as anchor text?
(I don't think it's relevant, but the site is built in Drupal.) Thanks!
Technical SEO | jrae0
-
Where Are My Listings??
Can someone please explain this to me? I'm not getting a ton of impressions organically on Google, but according to WMT many of my keywords are ranking number 1. This doesn't seem right, and when I search, they aren't there. This site is pretty new, so I don't expect it to rank well yet for these competitive keywords. I am running AdWords, but according to the help section, AdWords numbers are not included here. I attached a picture to this post. Does anyone know what's going on? (Attachment: wmt_zps6a3f39a1.png)
Technical SEO | atstickel120
-
How to Submit XML Sitemaps with More than 300 Subdomains?
Hi,
I am creating sitemaps for a site which has more than 500 subdomains. Pages vary from 20 to 500 across the subdomains, and more will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain, which they reference in a separate robots.txt file, e.g. http://windows7.iyogi.com/robots.txt with the subdomain's sitemap at http://windows7.iyogi.com/sitemap.xml.gz. Currently our website has only one robots.txt file for the main domain and subdomains. Please tell me: should I create a separate robots.txt and XML sitemap file for each subdomain, or one file? Creating a separate XML sitemap for each subdomain seems infeasible, as we would have to verify each one in GWT separately. Is there any automatic way, and do I have to ping separately if I add new pages to a subdomain? Please advise me.
Technical SEO | vaibhav45
-
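To illustrate the per-subdomain pattern described above (hypothetical hostnames; this is a sketch of the general approach, not a ready-made tool): each subdomain gets its own sitemap.xml plus a robots.txt whose Sitemap: line points at it, and generating both can be scripted so new subdomains don't mean manual work:

```python
# Sketch: generate, for each subdomain, its own sitemap.xml and a
# robots.txt that advertises it. Hostnames here are hypothetical.
from xml.etree.ElementTree import Element, SubElement, tostring

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML (bytes) listing the given page URLs."""
    urlset = Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        SubElement(SubElement(urlset, "url"), "loc").text = url
    return tostring(urlset, encoding="utf-8", xml_declaration=True)

def build_robots(subdomain):
    """robots.txt that points crawlers at this subdomain's own sitemap."""
    return ("User-agent: *\nAllow: /\n"
            f"Sitemap: http://{subdomain}/sitemap.xml\n")
```

Each generated pair would be served from that subdomain's root; the robots.txt Sitemap: directive is what lets crawlers discover the sitemap without a separate ping per subdomain.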
Double-byte characters in the URL - best avoided?
We are doing some optimisation on sites in the APAC region, namely China, Hong Kong, Taiwan and Japan. We have set the URL generator to automatically use the heading of the page in the URL, which works fine for countries using Latin characters but is causing problems, particularly in IE, when it comes to the double-byte countries. For some reason, IE struggles with double-byte characters and displays the URLs in their rather ugly, encoded form. Does anybody have any suggestions on whether we should persist with the keyword URLs or revert to non-descriptive URLs for the double-byte countries? The reason I ask is that it's a balance of SEO benefit versus not scaring IE users off with ugly URLs that look dreadful and spammy.
Technical SEO | Red_Mud_Rookie0
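For reference, the "ugly, coded form" mentioned above is the URL's percent-encoding: non-Latin characters in a path are sent as their UTF-8 bytes escaped as %XX, and browsers that don't decode them for display (as older IE versions often didn't) show the raw escapes. A minimal sketch (with a hypothetical two-character slug) of what that expansion looks like:

```python
# Show how a double-byte (CJK) heading expands when percent-encoded for a URL.
from urllib.parse import quote, unquote

slug = "北京"                        # hypothetical page heading ("Beijing")
encoded = quote(slug)                # each char becomes three %XX UTF-8 escapes
assert encoded == "%E5%8C%97%E4%BA%AC"
assert unquote(encoded) == slug      # decoding recovers the readable form
```

So a two-character heading becomes an 18-character escape string, which is exactly the "dreadful and spammy" look the question describes.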