Best free way to make our NAPs consistent - online software maybe?
-
Hello,
What's the best free tool or method for making our local SEO citations consistent? We have more than one business name and phone number out there, and there are already a lot of citations.
-
Thanks for adding your comment on this, SirMax. Very creative and not something I'd thought of!
-
Pro members have a perk with Yext: 50% off a resale account.
The resale account after the discount is $75 per year and includes a PowerListing, so for $75 you can update 20-30 sites, which is not that expensive. The problem with Yext is that once you stop paying them, your listings can disappear. So the best strategy is to use Yext to get the updates in there fast and then still go and update the listings manually. Make sure to keep a spreadsheet with all the sites and passwords so you can track your progress.
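If it helps, here is a minimal sketch of how that tracking spreadsheet could be started in Python rather than by hand. The site names and columns are only placeholders (not tied to Yext or any particular tool), and the real passwords are better kept somewhere more secure than the sheet itself:

```python
import csv

# Hypothetical citation sources -- swap in the directories you actually
# find listings on for your business.
CITATION_SITES = ["Google Places", "Yelp", "Yellow Pages", "Superpages", "Citysearch"]

# One row per listing: where it lives, the login used to claim it,
# whether the NAP data has been fixed yet, and when you last checked.
FIELDNAMES = ["site", "listing_url", "login_email", "status", "last_checked", "notes"]

with open("citation_tracker.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
    writer.writeheader()
    for site in CITATION_SITES:
        writer.writerow({"site": site, "status": "not started"})

print(f"Created citation_tracker.csv with {len(CITATION_SITES)} sites to work through.")
```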
-
OK. Will do it via spreadsheet and searches. Thanks Miriam.
-
Hi Bob, I'm not aware of a reliable free tool for managing citation accuracy. Yext's paid tool would probably do what you want, but it is costly. Really, the best way to do this is to search for your business names/phone numbers, make a spreadsheet of everything you find indexed, and then go through the process of claiming and editing any citations you can. It's a hard slog, but worth it to achieve better consistency of data across the web.
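If you want to speed up the searching part of that process, here is a rough Python sketch that just generates the queries to run by hand and a blank audit sheet to fill in as you go. The business name, phone formats, and sites below are made-up examples, not anything specific to Bob's business:

```python
import csv
from itertools import product

# Made-up examples of the name and phone variants floating around out there.
NAME_VARIANTS = ['"Acme Plumbing"', '"Acme Plumbing & Heating"']
PHONE_VARIANTS = ['"555-123-4567"', '"(555) 123-4567"']

# Optional site: restrictions; the empty string means a plain web search.
SITES = ["yelp.com", "yellowpages.com", "citysearch.com", ""]

FIELDS = ["query", "citation_url_found", "nap_shown", "correct", "claimed"]

rows = []
for term, site in product(NAME_VARIANTS + PHONE_VARIANTS, SITES):
    query = f"{term} site:{site}" if site else term
    rows.append({"query": query})

with open("citation_audit.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)

print(f"Wrote {len(rows)} queries to citation_audit.csv -- run each one and log what you find.")
```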
Related Questions
-
What is the best way to find related forums in your industry?
Hi Guys, Just wondering what is the best way to find forums in your industry?
Intermediate & Advanced SEO | | edward-may2 -
Is there a way to increase domain authority?
Dear all, when I look at Moz Analytics for my blog irctcloginindia.co.in, it is lagging behind my competitors only in terms of domain authority, and because of this it is ranking low. Is there any shortcut or fast method I can use to increase the authority of my domain?
Intermediate & Advanced SEO | | irctclogin0 -
What is the best way to get an anchor text cloud back in line?
So I am working on a website, and it has been doing SEO with keyword links for a few years. The first branded term comes in at 7%, in 10th place on the list in Ahrefs. The keyword terms are upwards of 14%. What is the best way to get this back in line? It would take several months of building branded-term links to make any difference, but it is doable. I could try link removal, but less than 10% seem to actually get removed, which won't make a difference. The disavow file doesn't really seem to do anything either. What are your suggestions?
Intermediate & Advanced SEO | | netviper0 -
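A rough way to quantify the anchor-text split described in the question above, before deciding between link building and the disavow route, is to tally anchors from an exported anchors CSV. In this Python sketch the column names ("Anchor", "Referring domains") and the branded terms are assumptions; adjust them to whatever your export actually contains:

```python
import csv
from collections import Counter

# Terms you consider "branded" -- placeholders, adjust to your actual brand.
BRANDED_TERMS = {"acme", "acme.com", "www.acme.com"}

anchor_counts = Counter()

# Assumes an anchors export with "Anchor" and "Referring domains" columns;
# real column names vary by tool, so adjust to match your file.
with open("anchors_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchor = row["Anchor"].strip().lower()
        anchor_counts[anchor] += int(row.get("Referring domains") or 1)

total = sum(anchor_counts.values())
branded = sum(n for a, n in anchor_counts.items()
              if any(term in a for term in BRANDED_TERMS))

print(f"Branded anchors: {branded / total:.1%}")
print(f"Keyword/other:   {(total - branded) / total:.1%}")
print("Top anchors:")
for anchor, n in anchor_counts.most_common(10):
    print(f"  {n:>5}  {anchor}")
```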
How best to structure a WordPress site?
I need help on how to structure my WordPress site to avoid duplicate content issues. Basically, I have a main category page for each of my targeted keywords (about 12). From each of those, though, I want to create a category for each county in the UK and then about 15 towns within each county. This means I'm creating a LOT of categories, e.g.:
/plumbers/lincolnshire/lincoln x 15 other counties and towns
/local-plumbers/cambridgeshire/cambridge x 15 other counties and towns
(I have about 12 main keywords I'm going after.) I'm basically creating a category for every town in the UK, going after long-tail keywords. What is the best way to manage this in WordPress? Advice from another question I posted on here is to write a unique category description for each one, as the posts in each category are almost identical. The other problem is that I'm ending up with hundreds of links on a page (they can't all be seen by the user, as I'm using a drop-down menu plugin). Any advice appreciated.
Intermediate & Advanced SEO | | SamCUK0 -
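To get a feel for how many category pages the plan in the question above really commits to (and how much unique description writing that implies), here is a quick Python sketch that enumerates the keyword/county/town combinations into a planning CSV. The keywords, counties, and towns are placeholders, and importing the result into WordPress would still need a plugin or WP-CLI, which is outside this sketch:

```python
import csv
from itertools import product

# Placeholder data -- swap in the real keyword slugs, counties, and towns.
KEYWORDS = ["plumbers", "local-plumbers", "emergency-plumbers"]
COUNTIES = {
    "lincolnshire": ["lincoln", "grantham", "boston"],
    "cambridgeshire": ["cambridge", "peterborough", "ely"],
}

rows = []
for keyword, (county, towns) in product(KEYWORDS, COUNTIES.items()):
    for town in towns:
        rows.append({
            "path": f"/{keyword}/{county}/{town}/",
            "description": "",  # write a genuinely unique description for each one
        })

with open("category_plan.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["path", "description"])
    writer.writeheader()
    writer.writerows(rows)

# With ~12 keywords, dozens of counties, and ~15 towns each, this quickly
# runs into the thousands -- seeing the full list makes the duplicate-content
# and thin-content risk much easier to judge.
print(f"{len(rows)} category pages planned.")
```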
Making sense of MLB.com domain structure
Although the subject of subdomains has been discussed quite often on these boards, I never found a clear answer to something I am pondering. I am about to launch a network of 8 to 10 related sites, all sharing the same concept, layout, etc., but each site possessing unique content. My concept will be somewhat similar to how MLB.com (Major League Baseball) is set up. Each of the 30 teams in the league has its unique content as a subdomain. My goal in the initial research was to try to find the answer to this question: **"Do the subdomains of a network contribute any increased value to the Root Domain?"** As I was trying to find the answer and analyzing how MLB.com did it, I began to notice some structure that made very little sense to me, and I am hoping an expert can explain why they are doing it the way they are. Let me try to illustrate:
Root Domain = http://mlb.com (actually redirects to: http://mlb.mlb.com/index.jsp). This root domain serves universal content that appeals to all fans of the league and also acts as a portal to the other subdomains from the main navigation.
Subdomain Example = http://tampabay.rays.mlb.com/index.jsp
Already there are a couple of questions:
1. Why does MLB.com redirect to http://mlb.mlb.com/index.jsp? Why the mlb subdomain?
2. Why two subdomains for tampabay.rays.mlb.com/index.jsp? Why not just make the subdomain "tampabayrays", "newyorkmets", "newyorkyankees", etc.?
Here is where things get a little more complicated and confusing for me. From the home page, if I click on an article about the San Francisco Giants, I was half expecting to be led to content hosted on the http://sanfrancisco.giants.mlb subdomain, but instead the URL was: http://mlb.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_mlb&c_id=mlb
I can understand the breakdown of this URL:
ymd = year, month, date
content_id = identifies the content
vkey = news_mlb (clicked from the "news" section found on the mlb subdomain)
c_id = mlb (?)
Now, if I go to the San Francisco Giants page, I see a link to the same exact article, but the URL is this: http://sanfrancisco.giants.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_sf&c_id=sf
It gets even stranger: when I went to the Chicago Cubs subdomain, the URL to the same exact article does not even link to the general mlb.mlb.com content; instead the URL looks like this: http://chicago.cubs.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_mlb&c_id=mlb
When I looked at the header from the http://chicago.cubs.mlb.com URL, I could see the og:url as http://sanfrancisco.giants.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_sf&c_id=sf, but I did not see anything relating to rel=canonical. I am sure there is a logical answer to this, as the content management for a site like MLB.com must be a real challenge. However, it seems that they would have some major issues with duplicate content.
So aside from MLB's complex structure, I am also still searching for the answer to my initial question, which is: **"Do the subdomains of a network contribute any increased value to the Root Domain?"** For example, does http://tampabay.rays.mlb.com/index.jsp bring value to http://mlb.com? And what if the subdomain is marketed as http://raysbaseball.com and then redirected to the subdomain? Thanks in advance.
Intermediate & Advanced SEO | | bluelynxmarketing0 -
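For what it's worth, pulling those query strings apart side by side makes the pattern easier to see: the content_id is identical across the three hosts, while vkey and c_id only reflect the section you clicked from, which is exactly the duplicate-content worry if no rel=canonical consolidates them. A small parsing sketch using nothing beyond the URLs quoted in the question:

```python
from urllib.parse import urlparse, parse_qs

# The three versions of the same article referenced in the question above.
urls = [
    "http://mlb.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_mlb&c_id=mlb",
    "http://sanfrancisco.giants.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_sf&c_id=sf",
    "http://chicago.cubs.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_mlb&c_id=mlb",
]

for url in urls:
    parts = urlparse(url)
    params = {k: v[0] for k, v in parse_qs(parts.query).items()}
    print(parts.hostname)
    print(f"  content_id = {params['content_id']}  (same article everywhere)")
    print(f"  vkey = {params['vkey']}, c_id = {params['c_id']}  (varies by the section clicked from)")
```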
What is the best way to hide duplicate, image-embedded links from search engines?
Hello! Hoping to get the community's advice on a technical SEO challenge we are currently facing. [My apologies in advance for the long-ish post. I tried my best to condense the issue, but it is complicated and I wanted to make sure I also provided enough detail.]
Context: I manage a human anatomy educational website that helps students learn about the various parts of the human body. We have been around for a while now, and recently launched a completely new version of our site using 3D CAD images. While we tried our best to design the new site with SEO best practices in mind, our daily visitors dropped by ~15% soon after we flipped the switch, despite drastic improvements in our user interaction metrics. SEOMoz's Website Crawler helped us uncover that we may now have too many links on our pages and that this could be at least part of the reason behind the lower traffic, i.e. we are not making optimal use of links and are potentially 'leaking' link juice.
Since students learn about human anatomy in different ways, most of our anatomy pages contain two sets of links:
1. Clickable links embedded via JavaScript in our images. This allows users to explore parts of the body by clicking on whatever object interests them. For example, if you are viewing a page on muscles of the arm and hand and you want to zoom in on the biceps, you can click on the biceps and go to our detailed biceps page.
2. Anatomy terms lists (to the left of the image) that list all the different parts of the body shown in the image. This is for users who might not know where on the arm the biceps actually are, but such a user can simply click on the term "Biceps" and reach our biceps page that way.
Since many sections of the body have hundreds of smaller parts, many of our pages end up with 150 links or more each. And to make matters worse, in most cases the links in the images and in the terms lists go to the exact same page.
My question: Is there any way we could hide one set of links (preferably the anchor-text-less, image-based links) from search engines, such that only one set of links would be visible? I have read conflicting accounts of different methods, from using JavaScript to embedding links in HTML5 tags. And we definitely do not want to do anything that could be considered black hat.
Thanks in advance for your thoughts! Eric
Intermediate & Advanced SEO | | Eric_R0 -
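Not an answer to which set of links to hide, but before changing anything it can help to measure what a plain HTML fetch of one of these pages actually exposes, since script-injected links may not appear in that raw source at all. A stdlib-only Python sketch; the URL is a placeholder, not the real anatomy site:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    """Counts <a href> links present in the raw HTML, i.e. roughly what a
    non-JavaScript crawler sees before any script-injected links exist."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

# Placeholder URL -- point this at one of your own anatomy pages.
url = "https://www.example.com/muscles-of-the-arm"
html = urlopen(url).read().decode("utf-8", errors="ignore")

parser = LinkCounter()
parser.feed(html)

print(f"{len(parser.hrefs)} links in the raw HTML")
print(f"{len(set(parser.hrefs))} unique destinations")
```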
What is the best process to move a WordPress website?
Hello SEOmoz community, simple question: I am looking to move a WordPress website from the blog.domain.com subdomain to domain.com/blog to increase the number of indexed links on the root domain. The blog I want to move already has high PR (6), and I of course want to avoid breaking links that are already indexed in the search engines. What would be the best way to prepare this move from an SEO perspective? Many thanks in advance. Yan Desjardins
Intermediate & Advanced SEO | | SherWeb0 -
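Whatever method is used for the move itself, the 301 redirects are what protect the already-indexed URLs, so it is worth spot-checking them after the switch. A small Python sketch along those lines; blog.domain.com and the sample paths are placeholders from the question, and this simple version does not distinguish a 301 from a 302:

```python
from urllib.parse import urlparse
from urllib.request import Request, urlopen

# A handful of representative old URLs -- in practice, pull the full list
# from your sitemap or analytics before flipping the switch.
OLD_URLS = [
    "http://blog.domain.com/",
    "http://blog.domain.com/some-popular-post/",
]

def final_destination(url):
    """Follow redirects and return the URL we end up on."""
    req = Request(url, headers={"User-Agent": "redirect-check"})
    with urlopen(req) as resp:
        return resp.geturl()

for old in OLD_URLS:
    expected = "http://domain.com/blog" + urlparse(old).path.rstrip("/")
    landed = final_destination(old)
    status = "OK" if landed.rstrip("/") == expected.rstrip("/") else "CHECK"
    print(f"[{status}] {old} -> {landed}")
```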
Best way to de-index content from Google and not Bing?
We have a large quantity of URLs that we would like to de-index from Google (we are affected by Panda), but not Bing. What is the best way to go about doing this?
Intermediate & Advanced SEO | | nicole.healthline0
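One approach that gets suggested for this is a crawler-scoped directive: Google documents user-agent-scoped values for both the robots meta tag (meta name="googlebot") and the X-Robots-Tag header, and a directive addressed specifically to googlebot should be ignored by Bing, though that is worth verifying on your own pages before rolling it out broadly. Below is a hypothetical Flask sketch with placeholder paths, not a description of the actual site's stack:

```python
from flask import Flask, request

app = Flask(__name__)

# Placeholder path prefixes for the sections to keep out of Google only.
GOOGLE_NOINDEX_PREFIXES = ("/tag/", "/archive/")

@app.after_request
def google_only_noindex(response):
    # A user-agent-scoped X-Robots-Tag: Google documents the "googlebot:"
    # prefix; a directive addressed to a different bot should be ignored
    # by other engines, but verify Bing's behavior on your own pages.
    if request.path.startswith(GOOGLE_NOINDEX_PREFIXES):
        response.headers["X-Robots-Tag"] = "googlebot: noindex"
    return response

@app.route("/<path:page>")
def page(page):
    return f"content for /{page}"
```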