Moz crawl error? Significant drop in DA & linking domains.
-
Hi guys!
Our site www.carwow.co.uk appears to have been punched in the face by the latest Moz update. It's claiming our number of linking root domains has dropped from 225 to 135, which has subsequently knocked our DA from 38 to 35.
We haven't disavowed any links, and our off-site strategy has been going well for the past two months. Search performance has increased by around 15% (around 5k sessions) and rankings have improved week on week.
Any idea if this is a Moz error? That's a 40% drop in linking root domains.
Thanks,
James
-
Great - thanks Keri & mathamatix. I'll ping over an email to help@ now.
JP
-
My domain authority dropped by 2 points and page authority dropped by 1. From what I gather, DA/PA are meaningless metrics, but I suppose it feels good for both to be high.
-
The good news is that the search engines don't use the Moz metrics in their updates, if that makes you feel any better. It could be that we just didn't crawl some of the domains that we crawled previously. I haven't seen any other reports of lots of linking root domains going missing. Could you send an email to help@moz.com so we can pass it to the proper team and make sure there wasn't an error in the latest update?
Thanks!
Keri
Related Questions
-
Location Data Batch Updates via the MOZ API
According to the Moz API documentation, I should be able to update multiple locations in a batch in order to create or update their location data (currently 130 locations). I have successfully created a batch and the API returned the $id, as I expected. However, the documentation doesn't make clear how the multiple locations I want to update are supposed to be sent back to the API. I was expecting to upload a CSV file or JSON data, not a query parameter as noted in the docs. When I include a properly formatted JSON file as a binary upload, the response still expects a locations parameter. The error is:

    { "status": "MISSING_PARAMETER", "message": "locations missing.", "response": { "locations": "MISSING_PARAMETER" } }

https://moz.com/developers/docs/local/api#_api_batch__id-PATCH
API | yuca.pro1
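If it helps while waiting on support, below is a rough Python sketch of sending the locations as a JSON-encoded "locations" parameter, which is the field the MISSING_PARAMETER error asks for, rather than as a binary file upload. The endpoint URL, auth header, and record fields are illustrative assumptions only; the parameter name is the one thing taken from the error message itself.

    import json
    import requests

    # Hypothetical values: the batch id is the $id returned when the batch was created,
    # and the token/header scheme depends on your Moz Local account setup.
    BATCH_ID = "your-batch-id"
    API_TOKEN = "your-api-token"

    # The location records to create or update (structure shown here is illustrative).
    locations = [
        {"id": "location-1", "name": "Example Store", "postal_code": "SW1A 1AA"},
        {"id": "location-2", "name": "Example Store 2", "postal_code": "EC1A 1BB"},
    ]

    # Send the records as a JSON-encoded "locations" parameter instead of a raw upload.
    response = requests.patch(
        f"https://moz.com/local/api/batch/{BATCH_ID}",  # illustrative URL - follow the docs
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        data={"locations": json.dumps(locations)},
    )
    print(response.status_code, response.text)

-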
API url_metrics Error deserializing POST body
I'm getting an error from the https://lsapi.seomoz.com/v2/url_metrics API. I'm using Basic Auth for authentication. The response is:

    { "name": "Bad Request", "message": "Error deserializing POST body" }
API | verdet32321
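That error suggests the POST body could not be parsed as JSON. Below is a minimal sketch of a request that sends a well-formed JSON object in the body; the "targets" field is assumed from the v2 Links API docs and the credentials are placeholders, so verify both against your own setup.

    import requests

    # Placeholder credentials: use the access id and secret key from your Moz account.
    ACCESS_ID = "your-access-id"
    SECRET_KEY = "your-secret-key"

    # Passing json= lets requests serialize the body and set the Content-Type header,
    # so the API receives a proper JSON object rather than a malformed string.
    resp = requests.post(
        "https://lsapi.seomoz.com/v2/url_metrics",
        auth=(ACCESS_ID, SECRET_KEY),        # HTTP Basic Auth
        json={"targets": ["moz.com"]},       # "targets" is assumed from the v2 docs
    )
    resp.raise_for_status()
    print(resp.json())

-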
Get discovered and lost linking domains separately
Hello, we want to get "discovered and lost links" using the API. Discovered and lost metrics are not available via the Links API, so we call the API for backlinks and cross-reference the results with the links found the previous time. That only gives us the NET change in links, but we want "just discovered" and "lost" links separately.
For example: inbound links on 27 July = 600, inbound links on 30 July = 750, so NET links from 27 July to 30 July = 750 - 600 = 150. In this case the "discovered and lost links" might actually be 400 and -250, 300 and -150, 150 and 0, etc. Maybe I am misunderstanding something; please give an example. Thanks!
API | OlegKireyenka
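Since the Links API only returns the current link set, one workaround is to store each crawl's source URLs yourself and diff consecutive snapshots; discovered and lost then fall out as two separate sets. A rough Python sketch, with made-up URLs standing in for two stored snapshots:

    def diff_links(previous_snapshot, current_snapshot):
        """Return (discovered, lost) given two iterables of linking source URLs."""
        previous = set(previous_snapshot)
        current = set(current_snapshot)
        discovered = current - previous   # present now, absent before
        lost = previous - current         # present before, gone now
        return discovered, lost

    # Made-up URLs standing in for the 27 July and 30 July snapshots.
    links_27_july = {"https://a.example/post", "https://b.example/page", "https://c.example/"}
    links_30_july = {"https://b.example/page", "https://c.example/", "https://d.example/new"}

    discovered, lost = diff_links(links_27_july, links_30_july)
    print(len(discovered), "discovered:", sorted(discovered))
    print(len(lost), "lost:", sorted(lost))

-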
Moz Update
Hi, in the past we could see the date of the next Moz update at https://moz.com/products/api/updates, but I can't see it there anymore. Has it been moved somewhere else?
API | Agenciaseomadrid0
-
API for crawl reports?
Hello, I'd like to know whether the Moz API is able to provide the crawl reports. We're currently developing a simple crawl report analysis/parsing tool, and after analysing the API we weren't able to find a function that could return either the Excel file or the report results. Is there any way to do this? Thank you very much.
API | PLOT_PT0
-
Can the API Filter Links with Certain Anchor Text?
I am trying to get all links that have certain strings in their anchor text. I am using the Python library: https://github.com/seomoz/SEOmozAPISamples/blob/master/python/lsapi.py
Looking at the documentation, it says I can get the normalized anchor text by using the bit flag 8 for the LinkCols value: https://moz.com/help/guides/moz-api/mozscape/api-reference/link-metrics
So I tried this:

    links = l.links('example.com', scope='page_to_domain', sort='domain_authority', filters=['external'], sourceCols=lsapi.UMCols.url, linkCols=8)

But it doesn't return the expected 'lnt' response field or anything similar to the anchor text. How do I get the anchor text on the source URLs? I also tried 10 for the linkCols value, to get all the bit flags in the lf field as well as the anchor text. In both instances (and even with different variations of targetCols & sourceCols), these are the only fields returned: 'lrid', 'lsrc', 'luuu', 'uu', 'luupa', 'ltgt'
API | nbyloff0
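One fallback, assuming the response rows can be made to include an anchor-text field at all, is to filter on the client side after the call returns. The sketch below uses the 'lnt' key only because that is the field the question expects; treat the key name and the sample rows as assumptions.

    TARGET_PHRASES = ("best car deals", "price comparison")  # example strings to match

    def filter_by_anchor(rows, phrases=TARGET_PHRASES, anchor_key="lnt"):
        """Keep only link rows whose anchor text contains one of the phrases."""
        matches = []
        for row in rows:
            anchor = (row.get(anchor_key) or "").lower()
            if any(phrase in anchor for phrase in phrases):
                matches.append(row)
        return matches

    # Fake response rows for illustration only.
    rows = [
        {"luuu": "https://a.example/review", "lnt": "Best car deals this month"},
        {"luuu": "https://b.example/blog", "lnt": "read more"},
    ]
    print(filter_by_anchor(rows))

-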
New Moz reports
The new Moz Pro reports do not give me the same level of data as my previous reports on the old platform, including detailed keyword reporting. I recently discovered that even if I schedule a monthly report, I still only get weekly data. Is it just me, or is anyone else struggling with these new reports?
API | Discovery_SA0
-
Are EC2 servers blocked from using the Moz API?
Hi, I've created a script that I'd like to use to check a list of domains using the Moz API. It works totally fine on my local machine. However, when I run it from my EC2 instance, it fails every time. To be specific, the response is always empty (an empty JSON array) when the request is sent from EC2. Is EC2 blocked by the Moz API for some reason? Many thanks for your help, Andrew
API | csandrew0