What is the best free 'contact finder' tool?
-
By contact finder I mean a tool that can search multiple websites and display the contact details of each site.
Why does Moz not provide such a tool?
Thanks
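For context, the core of such a "contact finder" is simple: fetch each site's contact page and pull out anything that looks like an email address. A minimal sketch of the extraction step (the regex and sample HTML are illustrative, not taken from any particular tool):

```python
import re

# Loose pattern for things that look like email addresses in page HTML.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(html: str) -> list[str]:
    """Return unique email addresses found in a page, in order of appearance."""
    seen = []
    for match in EMAIL_RE.findall(html):
        if match not in seen:
            seen.append(match)
    return seen

sample = '<a href="mailto:editor@example.com">Contact</a> or press@example.com'
print(extract_emails(sample))  # → ['editor@example.com', 'press@example.com']
```

A real tool would wrap this in a crawler that fetches each site's home and contact pages; the hard part is finding the right page, not parsing it.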
-
mtthompsons
Thanks for the comment. I have had a look and this tool looks useful.
Thanks.
-
Here's an excellent tool to collect contact info:
http://tools.citationlabs.com/tools/contacts-finder/index.html
It's free for a few queries, after which it's cheap: $10 for 100 MB of queries. I tried it and collected contact info for 1,000 sites for $10.
Try it before you buy it.
-
In BuzzStream, you can upload all of the links from the past few months via CSV straight away - the only problem would be if you picked up a ton more over the next 14 days.
Good luck all the same.
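If the links are sitting in a backlink export, reshaping them into a CSV for an outreach tool to ingest is a few lines of Python. A rough sketch (the column names here are illustrative, not BuzzStream's actual import schema):

```python
import csv
import os
import tempfile
from urllib.parse import urlparse

def links_to_outreach_csv(links, out_path):
    """Write linking URLs to a two-column CSV: the linking domain and the full URL."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["domain", "linking_url"])
        for url in links:
            writer.writerow([urlparse(url).netloc, url])

# Demo with a throwaway file and made-up linking URLs:
links = ["https://blog.example.com/great-post", "https://news.example.org/roundup"]
path = os.path.join(tempfile.gettempdir(), "outreach_demo.csv")
links_to_outreach_csv(links, path)
with open(path, newline="") as f:
    rows = list(csv.reader(f))
```

Grouping by domain first is usually worth it, since you only want to thank each site once however many pages link to you.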
-
PremioOscar,
Thanks for the comment.
As simple as it sounds, you could well be right. If I cannot find an intern to do it manually, I will settle for social media.
Thanks again for all the comments.
-
Yes, it's a one-shot thing, but a lot of links have accumulated over the last few months, so it will just take some time.
Thanks for your help anyway.
-
You could just write a post thanking the people that have linked to you on your social media.
I am sure they will appreciate it more than any email you could send them.
-
I don't know of a free one, to be honest. But if you're just thanking a load of people that have linked to you, it's a one-shot thing, right? Or are you scraping for spam purposes after all?!
-
Edlondon,
Thanks, I have had a look at BuzzStream and although it does look good, we would not get enough use out of it to make it worth the price, which is why we were really looking for something free.
-
Or if you're dealing with a website that requires link-removal.
BuzzStream is good; it's not free, but you do get a free 14-day trial: http://www.buzzstream.com/link-building/plans-pricing
-
To simply speed up the process of sending thank-you emails to a number of blogs that have linked to our site.
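For what it's worth, batching templated thank-you notes is straightforward to script. A minimal sketch; the template, addresses, and sender here are all placeholders, and nothing in this snippet actually sends mail (you would open an `smtplib.SMTP` connection and call `send_message` per blog):

```python
from email.message import EmailMessage

# Illustrative template - personalise per blog before sending.
TEMPLATE = "Hi {name},\n\nThanks for linking to our site from {url} - much appreciated!\n"

def build_thank_you(name, url, to_addr, from_addr="outreach@example.com"):
    """Build one templated thank-you message for a blog that linked to us."""
    msg = EmailMessage()
    msg["Subject"] = "Thanks for the link!"
    msg["From"] = from_addr
    msg["To"] = to_addr
    msg.set_content(TEMPLATE.format(name=name, url=url))
    return msg

msg = build_thank_you("Sam", "https://blog.example.com/post", "sam@example.com")
```

Even a light personal touch per message (mentioning the specific post) keeps this from reading like the spam people are worried about above.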
-
To send a bunch of spam?