Best free way to make our NAPs consistent - online software maybe?
-
Hello,
What's the best free tool or method for making our local SEO citations consistent? We have more than one name and phone number out there, and there are a lot of citations already.
-
Thanks for adding your comment on this, SirMax. Very creative and not something I'd thought of!
-
Pro members have a perk with Yext: 50% off a resale account.
After the discount, a resale account is $75 per year and includes a PowerListing, so for $75 you can update 20-30 sites, which is not that expensive. The problem with Yext is that once you stop paying them, your listings disappear. So the best strategy is to use Yext to get the updates in place fast, and then still go and update the listings manually. Make sure to keep a spreadsheet with all the sites and passwords so you can track your progress.
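As a rough illustration of that tracking spreadsheet (the column names, directories, logins, and example values here are placeholders, not anything prescribed by Yext or this thread), a small script could generate a starter CSV to import into Sheets or Excel:

```typescript
import { writeFileSync } from "fs";

// Hypothetical columns for tracking each citation as it gets cleaned up.
const header = [
  "Directory", "Listing URL", "Login email", "Name shown",
  "Phone shown", "Matches canonical NAP?", "Status",
];

// Example rows -- replace with the citations you actually find.
const rows = [
  ["Yelp", "https://example.com/listing-1", "owner@example.com", "Acme Plumbing", "555-0100", "yes", "claimed"],
  ["YellowPages", "https://example.com/listing-2", "owner@example.com", "Acme Plumbing Inc", "555-0199", "no", "edit submitted"],
];

// Quote every cell so commas and quotes inside values stay intact.
const csv = [header, ...rows]
  .map(row => row.map(cell => `"${cell.replace(/"/g, '""')}"`).join(","))
  .join("\n");

writeFileSync("citation-tracker.csv", csv);
console.log("Wrote citation-tracker.csv");
```

Run it with ts-node (or just build the sheet by hand); from there it's one row per citation as you work through the claims and edits.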
-
OK, will do it via a spreadsheet and searches. Thanks, Miriam.
-
Hi Bob, I'm not aware of a reliable free tool for managing citation accuracy. Yext's paid tool would probably do what you want, but it is costly. Really, the best way to do this is to search for your business names/phone numbers, make a spreadsheet of everything you find indexed, and then go through the process of claiming and editing any citations you can. It's a hard slog, but worth it to achieve better consistency of data across the web.
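To build that spreadsheet, it can help to generate the searches for every name/phone variant up front so nothing gets missed. A minimal sketch, with placeholder business names and phone numbers:

```typescript
// Build Google search URLs for every name/phone variant so each indexed
// citation can be found and logged in the tracking spreadsheet.
// The name and phone variants below are placeholders.
const nameVariants = ["Acme Plumbing", "Acme Plumbing Inc"];
const phoneVariants = ["(555) 010-0100", "555-0199"];

const queries: string[] = [];
for (const name of nameVariants) {
  queries.push(`"${name}"`);                // name-only search
  for (const phone of phoneVariants) {
    queries.push(`"${name}" "${phone}"`);   // name + phone search
  }
}
for (const phone of phoneVariants) {
  queries.push(`"${phone}"`);               // phone-only search
}

for (const q of queries) {
  console.log(`https://www.google.com/search?q=${encodeURIComponent(q)}`);
}
```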
Related Questions
-
At-scale way to check content in Google?
Are there any tools people know about where I can verify that Google is seeing all of our content at scale? I know I can take snippets and plug them into Google to see if we are showing up, but this is very time-consuming and I want to check across a bulk of pages.
Intermediate & Advanced SEO | HashtagHustler
-
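One low-tech way to speed up the manual check described in the question above is to generate exact-match site: queries from a distinctive snippet of each page and work through them in bulk. A rough sketch, assuming a placeholder domain and snippets:

```typescript
// Turn a list of page snippets into exact-match site: queries so a bulk
// "is this indexed?" check can be worked through quickly.
// The domain and snippets below are placeholders.
const domain = "example.com";
const snippets = [
  "a distinctive sentence copied from the first page",
  "another unique sentence copied from the second page",
];

for (const snippet of snippets) {
  const query = `site:${domain} "${snippet}"`;
  console.log(`https://www.google.com/search?q=${encodeURIComponent(query)}`);
}
```
-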
What is the best way to add semantic linked data to WordPress?
As a recent Moz subscriber, I'm trying to up my game in terms of inbound marketing. One of the most pressing tasks is to add JSON-LD across all of my WordPress sites. What is the best way to do this? Should I use the technique set out here:
https://moz.com/blog/using-google-tag-manager-to-dynamically-generate-schema-org-json-ld-tags
Or should I use one of these plugins?
https://en-gb.wordpress.org/plugins/schema/
https://en-gb.wordpress.org/plugins/wp-structuring-markup/
I want to get this right, so any guidance would be gratefully received.
Intermediate & Advanced SEO | treb0r
-
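For context, the GTM technique in the Moz post linked above amounts to injecting a script element whose body is the JSON-LD object, which a schema plugin would otherwise output for you. A minimal sketch of that pattern, with placeholder organization details:

```typescript
// Build a JSON-LD object and inject it into the page head.
// In Google Tag Manager this logic would sit inside a Custom HTML tag;
// the organization details below are placeholders.
const jsonLd = {
  "@context": "https://schema.org",
  "@type": "Organization",
  name: "Example Company",
  url: "https://www.example.com",
  logo: "https://www.example.com/logo.png",
};

const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(jsonLd);
document.head.appendChild(script);
```
-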
Best practice for disallowing URLs with robots.txt
Hi Everybody, We are currently trying to tidy up the crawling errors that appear when we crawl the site. On first viewing, we were very worried to say the least: 17,000+. But after looking closer at the report, we found the majority of these errors were being caused by bad URLs featuring:
Currency - for example: "directory/currency/switch/currency/GBP/uenc/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL3dvcmt3ZWFyP3ByaWNlPTUwLSZzdGFuZGFyZHM9NzEx/"
Color - for example: "?color=91"
Price - for example: "?price=650-700"
Order - for example: "?dir=desc&order=most_popular"
Page - for example: "?p=1&standards=704"
Login - for example: "customer/account/login/referer/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL2NhdGFsb2cvcHJvZHVjdC92aWV3L2lkLzQ1ODczLyNyZXZpZXctZm9ybQ,,/"
My question, as a novice at working with robots.txt: what would be best practice for disallowing URLs featuring these from being crawled? Any advice would be appreciated!
Intermediate & Advanced SEO | centurysafety
-
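For what it's worth, parameter URLs like the ones listed in the question above are often handled with wildcard Disallow rules. A hedged sketch of what those rules might look like, written out as a file so the patterns are easy to read; the patterns are illustrative only and would need testing against real URLs (e.g. in Search Console's robots.txt tester) before going live:

```typescript
import { writeFileSync } from "fs";

// Illustrative wildcard rules for the parameter URLs described in the
// question above. These are a sketch only -- verify each pattern against
// live URLs before deploying, and remember that blocking crawling does
// not remove URLs that are already indexed.
const params = ["color", "price", "dir", "order", "p"];

const paramRules = params
  .flatMap(p => [`Disallow: /*?${p}=`, `Disallow: /*&${p}=`])
  .join("\n");

const robotsTxt = `User-agent: *
${paramRules}
Disallow: /*/currency/switch/
Disallow: /*customer/account/login/
`;

writeFileSync("robots.txt", robotsTxt);
console.log(robotsTxt);
```
-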
Best way to remove low-quality paginated search pages
I have a website with around 90k pages indexed, but after doing the math I realized that only around 20-30k of those pages are actually high quality; the rest are paginated pages from search results within my website. Every time someone searches a term on my site, that term gets its own page, which includes all of the relevant posts associated with that search term/tag. My site has around 20k different search terms, all being indexed. I have paused new search terms from being indexed, but what I want to know is whether the best route would be to 404 all of the useless paginated pages from the search term pages. And if so, how many should I remove at one time? There must be 40-50k paginated pages, and I am curious to know what would be the best bet from an SEO standpoint. All feedback is greatly appreciated. Thanks.
Intermediate & Advanced SEO | WebServiceConsulting.com
-
Best practice for site maps?
Is it necessary or good practice to list "static" site routes in the sitemap, i.e. /about, /faq, etc.? Some large sites (e.g. Vimeo) only list the 'dynamic' URLs (in their case, the actual videos). If there are URLs NOT listed in a sitemap, will these continue to be indexed? What is good practice for a sitemap index? When submitting a sitemap to e.g. Webmaster Tools, can you just submit the index file (which links to secondary sitemaps)? Does it matter in which order the individual sitemaps are listed in the index?
Intermediate & Advanced SEO | shawn81
-
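On the sitemap index part of the question above, the index file is just XML that lists the child sitemaps; Google lets you submit the index on its own and discovers the children from it, and as far as the protocol goes the order of entries carries no particular meaning. A small sketch that generates an index from placeholder child-sitemap URLs:

```typescript
// Generate a sitemap index file that points at secondary sitemaps.
// The child sitemap URLs below are placeholders.
const childSitemaps = [
  "https://www.example.com/sitemap-static.xml",   // /about, /faq, etc.
  "https://www.example.com/sitemap-videos.xml",
];

const entries = childSitemaps
  .map(loc => `  <sitemap>\n    <loc>${loc}</loc>\n  </sitemap>`)
  .join("\n");

const sitemapIndex = `<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</sitemapindex>
`;

console.log(sitemapIndex);
```
-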
New URL: Which is best?
Which is best:
www.domainname.com/category-subcategory
or www.domainname.com/subcategory-category
or www.domainname.com/category/subcategory
or www.domain.com/subcategory/category
I am going to have 12 different subcategories under the category.
Intermediate & Advanced SEO | Boodreaux
-
What is the best way to allow content to be used on other sites for syndication without taking the chance of duplicate content filters
Cookstr appears to be syndicating content to shape.com and mensfitness.com:
a) They integrate their data into partner sites with an attribution back to their site, skinned with the partner's look.
b) They link the image back to their image hosted on Cookstr.
c) The partner page does not have microformats or as much data as their own page does, so their own page is better for SEO.
Is this the best strategy, or is there something better they could be doing to safely allow others to use our content? We don't want to share the content if we're going to get hit by a duplicate content filter or have another site outrank us with our own data. Thanks for your help in advance!
Their original content page: http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta
Their syndicated content pages:
http://www.shape.com/healthy-eating/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
http://www.mensfitness.com/nutrition/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
Intermediate & Advanced SEO | irvingw
-
Has anyone found a way to get site links in the SERPs?
I want to get some site links in the SERPs to increase the size of my "space". Has anyone found a way of getting them? I know Google says that it's automatic and only generated if they feel it would benefit browsers, but there must be a rule of thumb to follow. I was thinking along the lines of a tight categorical system, implemented throughout the site, that is clearly related to the content (how it should be, I guess)... Any comments or suggestions welcome.
Intermediate & Advanced SEO | CraigAddyman