If I subscribe to the PRO version, do I have access to Followerwonk?
-
I'm trying to access Followerwonk and it keeps telling me to subscribe, although I have just paid for the PRO version.
-
Hey Damon,
Sorry you're running into trouble with Followerwonk. If you're having trouble organizing your reports, could you email us at help@moz.com with more information and some full screenshots of the issue? We'd love to get this taken care of. The reason for emailing us is to protect your privacy.
Thanks!
Joel. -
When I try to analyze followers and arrange them in descending or ascending order, it doesn't seem to let me do that...
-
Yes, you do, and it is awesome!
Related Questions
-
When using Moz Pro do I want Google-UK or -GB as default and why?
Hi. When using Moz Pro, I always select Google UK as the default search engine, with GB and US next. Is there a reason to choose one over the other? If so, why? I am UK-based and so are my clients. Thanks
Moz Pro | | YNWA0 -
Followerwonk API (401 Unauthorized)
I am trying to use the Followerwonk API, and I have adapted the C# code from the MozscapeAPI project here, since it uses the same encryption etc.: https://github.com/QueryClick/MozscapeAPI The problem I am having is that I generate what looks like the correct link, but I still get back 401 Unauthorized from Followerwonk, while the same code works fine with the Mozscape API. https://api.followerwonk.com/social-authority/?screen_name=maximillion195;AccessID=member-NWQyZTI1M2EtMWUyZC01NDJhLThmZW;Timestamp=1368030369;Signature=PxK24AP7cgXMMVhzrU8NSRtSI%3D (I have changed the member ID and Signature.) The link looks right, it definitely hashes correctly, and the timestamp is set 20 minutes in the future. Can anyone explain why this is not working? Thanks
Moz Pro | | intSchools0 -
Does SEOmoz have a Keyword Research tool similar to, say, the Google AdWords tool or the WebCEO Keyword Research Tool? And where might that be? (Sorry, I'm very new to SEOmoz Pro.)
I'm looking for an SEOmoz version of the classic WebCEO Keyword Research that would give you effective suggestions based on a keyword inquiry. I've made the switch from WebCEO, but I'm trying to find something similar to that Keyword Research tool. Am I going to just need to use the Google AdWords tool for this function or does SEOmoz have it's own version?
Moz Pro | | SmokewagonKen0 -
.htaccess 301 redirect rules regarding pagination and stripped category base (wp)
I am the admin of a WordPress.org blog and I used to use the "Yoast All in One SEO" plugin. While I was using this plugin, it stripped the category base from my blog post URLs. With Yoast All in One SEO: Site.com/topic/subtopic/page/#
Moz Pro | | notgwenevere
Without Yoast All in One SEO: Site.com/category/topic/subtopic/page/# Now that I have switched to another plugin, I am trying to manage the page crawl errors, which are tremendous, somewhere around 1,800, mostly due to pagination. Rather than redirecting each URL individually, I would like to write .htaccess 301 redirect rules. However, all the instructions on how to create these rules deal with the suffix rather than the category base. So my question is: can .htaccess 301 redirect rules fix this problem, including pagination? And if so, what would this particular .htaccess 301 redirect look like, especially regarding pagination? And do I really have to write a 301 redirect for each pagination page? -
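For the redirect itself, a single pattern rule per category can usually replace hundreds of individual 301s, pagination included. A sketch only (it assumes the new plugin restored the literal `category` base and the old Yoast-era URLs lacked it; `topic` is a placeholder for a real category slug):

```apache
# .htaccess — redirect old base-stripped URLs to the new category-base form,
# e.g. /topic/subtopic/page/2/ -> /category/topic/subtopic/page/2/
RewriteEngine On
RewriteRule ^topic(/.*)?$ /category/topic$1 [R=301,L]
```

Because the capture group carries everything after the category slug, paginated `/page/N/` paths are covered by the same rule, so no per-page redirects are needed.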
False Pro reporting of duplicate titles
I am testing Pro. There are about 250 pages of content on my website. Pro says ALL of my pages have duplicate titles, but when I click on details, they display as unique titles. I.e., the first page of Pro results is as follows. While the content of my website is on one major topic, the title meta tags are NOT identical. Is this an issue with Pro, or is Pro looking at something other than the title meta tags? Please advise.
Fiance Visa Help
What is Adjustment Of Status from K1 Visa
Adjustment of Status support
Taiwan US Consulate Visa Interview
Adjustment of Status Order Form
How to Choose between K1 Fiancee or CR1 Marriage Visa
Removal of Conditions on Residence support
US Embassies + Consulates that process Fiancee and Spousal Visas
Moz Pro | | microonae0 -
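As a sanity check outside the tool, exact-duplicate titles are easy to count yourself once you have the list of title tags (a minimal sketch; gathering the titles from your pages is left out):

```python
from collections import Counter

def find_duplicate_titles(titles):
    """Return only the titles that appear more than once, with counts.
    Leading/trailing whitespace is ignored when comparing."""
    counts = Counter(t.strip() for t in titles)
    return {title: n for title, n in counts.items() if n > 1}
```

If this reports no duplicates on your real title tags but Pro still flags every page, that supports the theory that the tool is comparing something other than the title element.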
Can SEOmoz Pro suggest the best keywords to work on?
Hi, how can SEOmoz PRO suggest the best keyword to rank easily in the first position of Google's SERP? Is there a tool to find good keywords and get some ideas? Regards
Moz Pro | | jadlib0 -
Is it possible to override the 10k pages crawl limit on PRO?
Hi there! I just signed up for PRO and I love it! We have a particularly large website (tons of content), and the 10,000-page limit is holding us back from getting a really exhaustive analysis. Is there any way to raise the limit for a single crawl? Thanks!
Moz Pro | | Richline_Digital0 -
Critical factor: Accessible to Engines
Hello, I don't understand the "Accessible to Engines" critical factor, which reports the following:
Moz Pro | | lbecarelli
Crawl status:
Status Code: 200
meta-robots: None
meta-refresh: 0; URL=/shop/searchresult.seam
X-Robots: None
Explanation: Pages that can't be crawled or indexed have no opportunity to rank in the results. Before tweaking keyword targeting or leveraging other optimization techniques, it's essential to make sure this page is accessible.
Recommendation: Ensure the URL returns the HTTP code 200 and is not blocked with robots.txt, meta robots, or the x-robots protocol (and does not meta-refresh to another URL).
My data: This is the content of my index and home page:
And this is the content of my robots.txt file:
User-agent: *
Disallow: /shop/debug.seam
Disallow: /bhimg/
Disallow: /shop/cart/
Disallow: /shop/G10/
Disallow: /shop/help/
Disallow: /shop/img/
Disallow: /shop/jQueryUI/
Disallow: /shop/js/
Disallow: /shop/layout/
Disallow: /shop/myShop/
Disallow: /shop/newUser/
Disallow: /shop/shop/
Disallow: /shop/staticPages/
Disallow: /shop/stylesheet/
Disallow: /shop/error.seam
Disallow: /shop/login.seam
Disallow: /shop/login.seam
Disallow: /shop/test/
Disallow: /shop/utility/
Disallow: /shop/zoomifyer/
Thanks for any reply.
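The report above shows the page returning 200 but meta-refreshing (delay 0) to /shop/searchresult.seam, and search engines treat an instant meta refresh much like a redirect, so the refresh tag rather than robots.txt is the likely culprit here. A quick way to scan any page's HTML for these two blocking signals (a minimal stdlib sketch, not Moz's own tooling):

```python
from html.parser import HTMLParser

class CrawlabilityChecker(HTMLParser):
    """Scan HTML for the two meta tags that can block indexing:
    a robots directive (e.g. noindex) and a meta refresh."""

    def __init__(self):
        super().__init__()
        self.meta_robots = None
        self.meta_refresh = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        http_equiv = (a.get("http-equiv") or "").lower()
        if name == "robots":
            self.meta_robots = a.get("content", "")
        elif http_equiv == "refresh":
            self.meta_refresh = a.get("content", "")

def check_html(html):
    """Return the meta-robots and meta-refresh values found, or None."""
    parser = CrawlabilityChecker()
    parser.feed(html)
    return {"meta_robots": parser.meta_robots,
            "meta_refresh": parser.meta_refresh}
```

Run against the home page's HTML, this should report the `0; URL=/shop/searchresult.seam` refresh the tool is complaining about; removing that tag (or serving the search-result content at the page's own URL) should clear the critical factor.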