Hi Alan!
We submit your listing data to all listing partners at the moment a purchase completes. It can take 8-12 weeks (roughly two to three months) for each one to process our submissions.
Hope this helps!
Hi There!
I'm afraid we do not offer an API for keyword metrics.
Our only API offers access to our backlink data, which powers Open Site Explorer
https://moz.com/help/guides/moz-api/mozscape/overview
Hope this helps!
Anytime!
This appears to be a result of Facebook's new Locations pages. Each location will have a unique ID we verify against. The original ID is still connected to the app, so I have manually changed it to point to the correct one, and your listing is now unpaused.
Hello!
Sorry to see your listing was paused. Moz Local does not directly link to Google Maps or Facebook listing URLs. The tool continuously monitors for the complete NAP (Name, Address, Phone) displayed on these pages, and if any part of the NAP is removed, the listing is paused.
Your listing is verified against Facebook: https://www.facebook.com/WoodstockFurnitureOutlet/
The pause is the result of the street address being removed from the page. To unpause, you will need to restore the complete physical address and make sure the data also matches what you have on Google Maps.
After restoring the address, our tool will automatically check within a week to unpause the listing.
Hope this helps and let me know if you have any questions.
Hi Mark!
You can add more photos (up to 5) by editing your listing from the main dashboard, under Additional Information.
For the business name and phone number only, you will need to update them on Google Maps and Facebook (if both exist), and our tool will automatically import the change within a week.
Hope this helps and let me know if you have any questions!
It does work for a single location, and it is the cheaper option: building your listing manually on the same directories we submit data to would cost more. Unless you are using Yext, automated submission tools rely on standard data propagation for each directory to process the data, and that is not a quick process, as each directory ingests data on its own schedule.
If you don't have the time or budget to manage your listings manually, Moz Local is a quicker and cheaper option for data submission.
If you do have the time and budget for manual management, it takes longer to claim/verify each listing and then submit the data, but updates will be faster than waiting for data propagation.
Our tool is intended for bulk submissions, for agencies and marketers managing hundreds of locations. If you only have a few listings and the time and budget for manual management, we do recommend that route, as it is quicker than waiting for data propagation.
The links should be more diverse, and the external pages linking to you should themselves be linked to from a diverse set of sources, and so on. There is no exact number of links I can give that would guarantee our crawler crawls them, as it depends on many factors.
For http://www.eskewdumezripple.com, we are indexing links to multiple category/tag URLs that are contributing to the link count for example:
http://www.eskewdumezripple.com/studio-life/archives/2014/05
http://www.eskewdumezripple.com/studio-life/blog/by_tag/tag/community+engagement
You can use pagination to see what type of links are being picked up here: https://moz.com/researchtools/ose/pages?site=http%3A%2F%2Fwww.eskewdumezripple.com%2F&no_redirects=0&sort=page_authority&filter=all&page=1
HTTP status will indicate whether or not we crawled the page (200 status or No Data).
Hope this helps
This is correct. It will also be important that the bar is listed completely independently from the hotel name, just as if it were in its own building across the street. The directories we submit to do not support business names containing location identifiers. How it is currently listed on Google has no impact, as long as the business name and local phone number are unique.
Hope this helps!
Hi Will
Sorry to hear we have not crawled your site.
There are currently no new links pointing to any page on the site that would send us back for another crawl. Also, many of the pages indexed are returning 404s.
To stay in an index and potentially have metrics calculated, you want continuous, fresh links going to active pages on the site. Most likely the external links were captured over a span of time within a 160-day window, but not close enough together to deem your site important enough to crawl. These are the two most common reasons we may index links but not crawl a site.
Hope this helps!
Hi Ruchy!
An address change in Moz Local will be a separate purchase as we need to claim the new location with our partners.
Each physical location is its own listing subscription.
Do you have an existing location in Moz Local?
If so, please send us an email at help@moz.com so we can review your listing and walk you through the process, as there is a very specific way to submit a new location and close out the status of the old one.
Hi Paul
After connecting your accounts, you will need to click the "Claim" button next to the listings, then choose one of the connected accounts in the list.
Let me know if this helps
Hello!
Your site does not have enough links or authority for us to crawl.
Keep building links to all pages on your site from frequently visited, high-authority sites, which can help increase your score.
Hi Chloe
It's possible that your site's authority is too low for us to crawl your internal pages. A Crawl-delay directive in your robots.txt could also be preventing us from crawling.
You will want to keep building links, as currently there are no new links being discovered, which means we won't revisit your site until we find quality links from other sites: https://moz.com/researchtools/ose/just-discovered?site=ccsadoption.org%2F&filter=&source=&target=domain&page=1&sort=crawled
Hope this helps
Can you email the full URL call your app is generating to help@moz.com?
Hello!
Anchor text data, like Top Pages, requires a paid API plan. Free API access only allows URL Metrics and Link Metrics calls.
Hi Zoran
I'm afraid this request is only available through paid API plans: https://moz.com/products/api/pricing
Hi Brett!
Google's Places API only makes brick-and-mortar locations available to search in their database. If you are a service-area business, the listing will not be available to us or to others, as we only support listings with physical locations.
Hi MJ
Our new crawler, which supports SNI, was released last week.
Check out our announcement here: https://moz.com/blog/new-site-crawl
Hello!
Can you provide a screenshot of what you have in link_service.php on line 80? This appears to be where the undefined error is being reported. You may want to consult with a developer to troubleshoot, as we are limited in the dev support we can provide.
Hi Steven!
We have always had separate API plans. One change for free access is that we no longer offer a higher rate limit for Pro subscriptions, so free access is limited to 1 request every 10 seconds.
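If you are scripting against the free tier, here is a minimal Python sketch of spacing calls to respect that limit; fetch_url_metrics and the target list are just hypothetical stand-ins for whatever signed request your app already makes:

import time

def fetch_url_metrics(target):
    # Hypothetical placeholder for your existing signed url-metrics request
    print("requesting metrics for", target)

targets = ["moz.com", "example.com", "example.org"]
for i, target in enumerate(targets):
    fetch_url_metrics(target)
    if i < len(targets) - 1:
        time.sleep(10)  # free access: no more than 1 request every 10 seconds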
Hi PSLab!
I'm afraid the best option would be to work with YP directly to find a solution. You could ask whether they will accept a copy of the electricity bill for the location, or another form of verification they are willing to use.
Hi Lisa!
We have been adding campaigns to our SNI beta, which has been active in recent weeks. You can sign up here: https://moz.com/community/q/moz-pro-our-web-crawler-and-sites-that-use-sni
Hi arb3ux!
It appears your site/server is blocking our user-agent when we start from http://; we receive no response when we attempt to crawl as rogerbot or dotbot.
I can reach your site directly by going to https://.
So there is a configuration preventing us from reaching http:// and being redirected to https://.
I recommend reaching out to your hosting provider to help isolate the issue. Please let me know if they make any specific changes and I can re-run the test to see if future crawls will be able to index your links.
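If it helps your hosting provider reproduce the behaviour, here is a rough Python sketch that requests the http:// URL with a crawler-style User-Agent; the "rogerbot" string is only an approximation of our UA, and the URL is a placeholder:

import urllib.error
import urllib.request

url = "http://www.example.com/"  # placeholder -- replace with your http:// homepage
req = urllib.request.Request(url, headers={"User-Agent": "rogerbot"})  # approximate UA string

try:
    with urllib.request.urlopen(req, timeout=30) as resp:
        # a successful fetch also shows whether the http -> https redirect happened
        print("status:", resp.status, "final URL:", resp.geturl())
except urllib.error.URLError as exc:
    print("no response:", exc.reason)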
Hi Kristin
I can confirm your issue is not related to SNI or the new beta crawler as your site does not use SNI.
Outside of SNI and intermediate-certificate issues, 803/804 responses are mostly temporary failures to reach a page at the time of the request, which is very difficult to isolate unless you see the same URLs appear in each new report.
I wouldn't worry about the spike until the next crawl occurs; at that point, look for re-appearing URLs, which will help you isolate possible issues with your SSL configuration.
Export the CSV for the current URLs to compare with next week's crawl.
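As a rough example of that comparison, here is a small Python sketch; the file names and the "URL" column name are assumptions, so adjust them to match the actual export headers:

import csv

def urls_from_export(path, column="URL"):
    # Collect the set of URLs listed in one weekly crawl export
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column] for row in csv.DictReader(f)}

this_week = urls_from_export("crawl-week1.csv")
next_week = urls_from_export("crawl-week2.csv")

# URLs present in both reports are the ones worth investigating
recurring = sorted(this_week & next_week)
print(len(recurring), "URLs appear in both reports:")
for url in recurring:
    print(url)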
Hope this helps
Do you have any Cols parameters specified? I am not seeing any issues with your credentials, but the call you provided does not include any Cols to return specific data.
Hello!
Would you be able to send us the full API call URL and your IP address to help@moz.com so we can take a look? Please keep your member ID intact in the message so we can verify.
Talk to you soon!
Hi Etienne
There is a public third-party tool, hosted on AWS, which also fails to connect:
If you run a GET request to this URL you will see a timeout. Your admins can use this tool to quickly isolate any issues, as this timeout is the same one we are running into.
The SSL issues reported may not block normal browser connections, but they can block non-browser user-agents, such as another bot or server. We do not have any TLS requirements; 1.0 and up is supported.
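For reference, here is a quick Python sketch your admins could use to reproduce the timeout; the URL and the 30-second limit are placeholders to adjust:

import socket
import urllib.error
import urllib.request

test_url = "https://www.example.com/"  # placeholder -- replace with the affected page

try:
    with urllib.request.urlopen(test_url, timeout=30) as resp:
        print("responded with status", resp.status)
except socket.timeout:
    print("request timed out")
except urllib.error.URLError as exc:
    print("connection failed:", exc.reason)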
Hope this helps!
Hi Ryan
I don't believe HSTS is the cause; if it were, this would apply to all https pages on every crawl report. For an 804 response on any single page, you want to monitor for the same URL appearing in the crawl report every week. Outside of SNI, 804s are commonly temporary issues.
If the same URL appears week to week with an 804, you will need to involve your web dev/host to investigate further.
Hi Etienne
I checked the campaign you have with the 902 response and the issue is with your host blocking AWS IPs. I could not find any SSL or SNI errors for the domain.
You will want to reach out to your hosting provider to make sure they are not blocking any AWS IP ranges. Our IPs are dynamic so we will not be able to provide a range for you.
There are also other issues with the SSL configuration that will need to be looked at, as they could result in different errors once the IPs are unblocked.
http://www.screencast.com/t/b8qEdNah
You can run a test here for any SSL domain: https://globalsign.ssllabs.com/analyze.html
Hope this helps!
I'm afraid we don't have examples for bash/curl. We do share example code for calling our API in other languages at https://github.com/seomoz/SEOmozAPISamples
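As a rough Python equivalent (not an official sample): the endpoint, the AccessID/Expires/Signature parameters, and the Cols value below follow my recollection of the signed-request pattern used in that repository, so please verify them against the current API documentation before relying on this sketch.

import base64
import hashlib
import hmac
import time
import urllib.parse
import urllib.request

ACCESS_ID = "member-xxxxxxxx"   # your Mozscape Access ID
SECRET_KEY = "your-secret-key"  # your Mozscape Secret Key

def signed_params(access_id, secret_key, valid_for=300):
    # Signature = base64(HMAC-SHA1("accessID\nexpires", secret key))
    expires = int(time.time()) + valid_for
    to_sign = "{}\n{}".format(access_id, expires).encode("utf-8")
    digest = hmac.new(secret_key.encode("utf-8"), to_sign, hashlib.sha1).digest()
    return {
        "AccessID": access_id,
        "Expires": expires,
        "Signature": base64.b64encode(digest).decode("utf-8"),
    }

def url_metrics(target, cols=103079215108):  # Cols value is an illustrative bit-flag sum only
    params = signed_params(ACCESS_ID, SECRET_KEY)
    params["Cols"] = cols
    endpoint = ("http://lsapi.seomoz.com/linkscape/url-metrics/"
                + urllib.parse.quote_plus(target))
    with urllib.request.urlopen(endpoint + "?" + urllib.parse.urlencode(params)) as resp:
        return resp.read().decode("utf-8")

print(url_metrics("moz.com"))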
That URL is not an API call; it is a direct link to our online web app via a web browser.
This means we have not discovered links from external sites to pages at those domains.
You would only need to enter the domain in the search field on OSE to return metrics you can compare against API results.
OSE is our web app which calls our API.
Hi Carl
The call is malformed with an additional scope parameter: http://www.screencast.com/t/zTqfBb0NuAO
If you remove the extra &Scope= parameter, the call should work.
Can you paste the full URL call your app generated minus credentials so I can take a look?
Also to test your results, you can cross-reference by filtering page/sub-domain/root-domain directly on OSE https://moz.com/researchtools/ose/links?site=10.billing.callmydoc.com&filter=&source=external&target=domain&group=0&page=1&sort=page_authority&anchor_id=&anchor_type=&anchor_text=&from_site=
You would only need to adjust the scope to &Scope=page_to_subdomain, as the above example only gives you results to any page at callmydoc.com.
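To illustrate, here is a small Python sketch of how that links call could be assembled with a single Scope parameter; the credential values are placeholders, and the Filter/Limit choices are examples to adjust for your own call:

import urllib.parse

target = "10.billing.callmydoc.com"
params = {
    "Scope": "page_to_subdomain",  # exactly one Scope -- remove any duplicate
    "Filter": "external",          # example filter
    "Limit": 25,
    "AccessID": "member-xxxxxxxx",         # placeholder credentials
    "Expires": "1467000000",               # placeholder expiry timestamp
    "Signature": "base64-hmac-goes-here",  # placeholder signature
}
url = ("http://lsapi.seomoz.com/linkscape/links/"
       + urllib.parse.quote_plus(target)
       + "?" + urllib.parse.urlencode(params))
print(url)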
Hey Carl!
We do provide examples in the documentation, and the results can be filtered in many ways.
Here is a very basic call that will return the first 25 links to all pages on moz.com, giving the linking page and the page being linked to.
The first result will look like this:
[0] => Array
(
[lrid] => 460538186303
[lsrc] => 131443324666
[lt] => "Google Algorithm Change History"
[ltgt] => 68767261786
[luuu] => moz.com/google-algorithm-change
[uu] => en.wikipedia.org/wiki/Google_Panda
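If you are working in Python rather than PHP, the same result could be read from the JSON response along these lines; the field meanings are inferred from the example above (uu = linking page, luuu = page being linked to, lt = the link's text):

import json

# Sample shaped like the first result shown above
sample = json.loads("""[
  {"lrid": 460538186303,
   "lsrc": 131443324666,
   "lt": "Google Algorithm Change History",
   "ltgt": 68767261786,
   "luuu": "moz.com/google-algorithm-change",
   "uu": "en.wikipedia.org/wiki/Google_Panda"}
]""")

for link in sample:
    print('{} -> {} ("{}")'.format(link["uu"], link["luuu"], link["lt"]))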
Hope this helps!
The link metrics documentation can be found here: https://moz.com/help/guides/moz-api/mozscape/api-reference/link-metrics
Great example for url-metrics!
Hi Carl
We do not have documentation but here is an old post that should work:
https://www.distilled.net/blog/seo/rapid-protoyping-with-the-seomoz-api/
Hope this helps!
Yes the sign up link is in the very first post by Jon
The link is http://goo.gl/forms/LCvL9Ix8JDHfbAvr1
Hello!
I can confirm your site does use SNI which can be checked at https://globalsign.ssllabs.com/analyze.html?d=www.honesthouse.co.uk&latest
If anyone else is unsure about their own site, you can use the above link, replace the URL, and click on any server result, which will confirm SNI status as shown here: http://www.screencast.com/t/GLzjD7gjzjR
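For a rough local approximation of that check (the SSL Labs report above is the authoritative test), a Python sketch like this compares the certificate returned with and without SNI; the hostname is a placeholder, and a mismatch or a failed no-SNI handshake only suggests the site depends on SNI:

import socket
import ssl

host = "www.example.com"  # placeholder -- replace with your own hostname

def certificate(send_sni):
    # Fetch the server certificate, optionally omitting the SNI hostname
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host if send_sni else None) as tls:
            return tls.getpeercert(binary_form=True)

try:
    same = certificate(True) == certificate(False)
    print("same certificate with and without SNI:", same)
except OSError as exc:
    print("handshake without SNI failed, so SNI is likely required:", exc)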
As for any workaround, other than disabling SNI completely, the quickest option to get crawled would be to sign up for the beta.
Jon will keep everyone posted on any new status, and I can assure you this is the top priority feature we are working on.
Hello!
You will want to create an official Facebook page so Facebook can return the correct information. Data from unofficial pages does not remain consistent and can spread inaccuracies. Our tool returns data directly from Facebook's API database, so what you see on the front end could take a while to update. But definitely create an official page and remove or merge any unofficial ones.
Hello!
I'm afraid there is a limitation with this specific section of our tool: it is not compatible with the PDF generation service we currently use, since part of the data here is generated live. The best workaround would be to print from a screenshot. We are working to improve all PDF exports in the app, which will take some time.
Hope this helps!