Has anybody used Yext or Universal Business Listings as an automated approach to getting clients into all of the many directories? If so, does it work? Or does Google penalize you for using these automated services?
-
I'm trying to figure out whether using either Yext or Universal Business Listings is worth it. Both have reseller programs for SEO agencies, and I'm considering using one of them to automate citation building and save time for clients. You can see the offerings at Yext.com and universalbusinesslistings.org. I'm curious what other SEO folks think of these services. Thanks
-
Another wrinkle with using Yext: it appears that the links on many of the citations created by Yext do not point directly to the business. Instead, they pass through Yext first before being redirected to the business.
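If you want to check that for any given citation, a rough Python sketch like the one below (using the requests library; the URL is just a placeholder) will print every redirect hop, so you can see whether the link resolves straight to the business site or bounces through an intermediary first:

```python
# Rough sketch: follow a citation link and print each redirect hop so you can
# see whether it passes through an intermediary before reaching the business.
import requests

def show_redirect_chain(url: str) -> None:
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:  # each intermediate redirect, in order
        print(hop.status_code, hop.url)
    print(response.status_code, response.url, "(final destination)")

# Placeholder URL; swap in the actual citation link you want to test.
show_redirect_chain("http://example.com/some-citation-link")
```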
-
Hi Wendy,
I agree with William: claim, edit, and verify the listings manually.
Listing a business on these sites requires getting to know your client and their services, and the process can be very detail-oriented. I think Yext lacks this attention to detail.
Has the business moved? Changed its name? Does it use tracking numbers? Is it already listed on some of these sites, but with information that doesn't exactly match what you've provided? These are questions that "powerlisting" service providers don't seem to ask. All they need is your current business information and off they go. BUT some time is saved... for now.
Here is another point of view on Yext from Nyagoslav at NGS Marketing:
http://www.ngsmarketing.com/why-yext-might-not-be-the-best-fit-for-your-business/
-
Hi Wendy,
Will is correct on all points. Manual profile creation is still considered best by the majority of top Local SEOs, but Google will certainly not penalize you for using an automated service. That being said, if you do need to go with an automated service, be advised that you may run into issues with the service not identifying possible duplicate listings and correcting them. For that, you would be back to manual work again. It's important to understand this, because duplicate listings can sap the strength of a profile.
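To illustrate the kind of near-match check that has to happen somewhere, whether the service does it or you do it by hand, here is a rough standard-library Python sketch with made-up business data (it is not anything Yext or UBL actually runs):

```python
# Rough sketch with hypothetical data: flag listings whose name, address, and
# phone look close enough to the profile you manage to be possible duplicates.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Simple string similarity in [0, 1] after normalizing case and whitespace.
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def looks_like_duplicate(listing: dict, master: dict, threshold: float = 0.85) -> bool:
    name_score = similarity(listing["name"], master["name"])
    addr_score = similarity(listing["address"], master["address"])
    phone_match = listing["phone"].replace("-", "") == master["phone"].replace("-", "")
    # Near-identical name and address, or a matching phone number with a
    # similar name, is worth a manual review.
    return (name_score > threshold and addr_score > threshold) or \
           (phone_match and name_score > 0.6)

master = {"name": "Acme Plumbing", "address": "123 Main St, Springfield", "phone": "555-123-4567"}
candidate = {"name": "Acme Plumbing LLC", "address": "123 Main Street, Springfield", "phone": "5551234567"}
print(looks_like_duplicate(candidate, master))  # True, so review it manually
```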
-
Google does not penalize websites for simply having their business listed.
Yext is quite expensive. I have used UBL in the past, and it works well, but the best way is to claim all of your listings manually.
I know that's probably not what you wanted to hear, but if you're looking for a referral for someone to build your business listings, PM me.
Related Questions
-
I have a new domain, how should I best use it?
Hi all, Recently signed up to Moz and loving what I am seeing at the moment. I do a lot of microsites and push both AdWords and affiliate programmes and products, and earn enough to give me some fun money. For a long time I have wanted to get into the mobile phone market, as I have spoken to a lot of people who get anything up to £100 a time from affiliate programmes. I have just purchased a new domain after doing some digging around, and thought that before I dive into doing things how I normally would, I would ask advice on how to best utilise the domain I have bought. I've just purchased newiphone.co.uk and will put up a holding page once all of the registration has gone through and I have access to everything. According to Google Keyword Planner, the term 'new iphone' gets around 368k searches a month. Moz says this is an extremely difficult keyword to rank for, with an 80% difficulty rating, so I appreciate I won't be at number 1 in a couple of weeks. I'd appreciate your ideas and suggestions as a community to see how I go forward. Many thanks in advance, Paul
Industry News | Wilkesy
-
Google Trusted Stores
Hello, so we sell millions of dollars a month in merchandise - most of that comes from eBay transactions. We do have a script that posts to eBay, and we download our transactions from eBay and process the orders from our admin. Now I feel we will do a lot better in the SERPs if we have the Trusted Stores quality signal. However, it comes down to this: the conversion pixel. Since they don't pay through the site, do you think we can get away with sending an email to a second conversion page for eBay transactions? Have any of you noticed a boost in the SERPs once you were approved with Trusted Stores? Any advice?
Industry News | joseph.chambers
-
Bing beats Google to disavow links
You can now disavow bad links in Bing WMT. Google has stated they will be doing the same, which should shake up the rankings when many sites get penalties lifted. http://www.bing.com/community/site_blogs/b/webmaster/archive/2012/06/27/disavow-links-you-don-t-trust.aspx
Industry News | AlanMosley
-
Chrome blocked sites used by Google's Panda update
Google said its Panda update used Chrome users' blocked-sites lists as a benchmark for what they now term poor-quality content, and that the update effectively took about 85% of them out of the search results. This got me thinking: it would be very nice to discover exactly which sites they don't like. Does anyone know if there is an archive of what these sites might be? Or, if none exists, maybe if people could share their Chrome blocked sites on here we might get an idea?
Industry News | SpecialCase
-
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.

Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the pages. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.

Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following. The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why in the following sections). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request for an "ugly URL" tells the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment. A diagram summarizing the agreement is in the Getting Started guide.

Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355

Here are a few example pages that are mostly JavaScript/AJAX:
http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
https://www.pivotaltracker.com/public_projects
This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab

These are the best resources I have found regarding Google and JavaScript:
http://code.google.com/web/ajaxcrawling/ (step-by-step instructions)
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content

Some additional resources:
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
http://www.google.com/support/webmasters/bin/answer.py?answer=35769

Industry News | webbroi
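As a rough illustration of the Option 2 handshake described above, here is a minimal Python/Flask sketch of a server that returns an HTML snapshot when the crawler requests the "ugly URL" form of a page (the ?_escaped_fragment_= query parameter); the route and the render_snapshot helper are hypothetical stand-ins, not something from the original post:

```python
# Minimal sketch, hypothetical site: under Google's AJAX crawling scheme a
# pretty URL like /index.html#!key=value is requested by the crawler as
# /index.html?_escaped_fragment_=key=value, and the server should answer that
# form with a pre-rendered HTML snapshot instead of the normal AJAX shell.
from flask import Flask, request

app = Flask(__name__)

def render_snapshot(fragment: str) -> str:
    # Stand-in for whatever pre-rendering you use (headless browser,
    # server-side templates, etc.).
    return f"<html><body><h1>Snapshot for state: {fragment}</h1></body></html>"

@app.route("/index.html")
def index():
    fragment = request.args.get("_escaped_fragment_")
    if fragment is not None:
        # Crawler request: serve the post-JavaScript HTML snapshot.
        return render_snapshot(fragment)
    # Normal browser request: serve the regular AJAX-driven page.
    return ("<html><body><div id='app'></div>"
            "<script src='/app.js'></script></body></html>")
```
-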
Google Directory no longer available?
Now we will forever not know what is in the Google Directory. I just clicked on the link, and everything is dead and points you to DMOZ. What does this mean for us? Is DMOZ going to get more editor juice, so submissions are actually reviewed for once? The Yahoo! directory has also been glitching - new submissions have been disabled for over a week now. Any comments?
Industry News | antidanis
-
How long after making changes will position on Google be altered?
I'm curious as to how long Google updates take these days. I'm just getting back into SEO after 9 years, and I recall back in the day there was a monthly "dance" during which page results were updated. Is it more frequent now? Thanks
Industry News | celife