Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Domain Authority dropped 10 points and I don't know why
-
In my latest site crawl, my Domain Authority dropped 10 points for no apparent reason. There have been no changes to the site; the only change I have made this month is to block referral spam. My competitors' DAs have stayed the same, too.
Website: https://knowledgefront.co.uk/
Any ideas?
-
Check your incoming links and see if there's any pattern. Check your referral traffic and see whether any bot traffic is landing on your website. Try the following steps:
- Create a filter in Analytics
- Block spam IPs (see the sketch after this list for a quick way to spot them in your logs)
- Block bot traffic from landing on your website
- Create quality backlinks
- Find the pages that are performing lower in the SERPs (i.e. positions 5 to 9), then optimize them and push them for better CTR.
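For the spam-IP and bot-traffic steps, here's a rough Python sketch of the kind of check I mean: it counts hits per referrer in a standard combined-format access log so the spammy referrers (and the handful of IPs behind them) stand out. The log path and the "top 20" cutoff are placeholders, so adjust them to your setup.

```python
# Rough sketch: count hits per referrer in a combined-format access log
# so referral spam stands out and you can decide which IPs/domains to block.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder -- point this at your server's access log
# Combined log format: IP ident user [date] "request" status bytes "referrer" "user-agent"
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "([^"]*)" "([^"]*)"')

referrer_hits = Counter()
ips_by_referrer = {}

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.match(line)
        if not match:
            continue
        ip, referrer, _user_agent = match.groups()
        if referrer and referrer != "-":
            referrer_hits[referrer] += 1
            ips_by_referrer.setdefault(referrer, set()).add(ip)

# Referrers with lots of hits coming from only a handful of IPs are usually spam.
for referrer, hits in referrer_hits.most_common(20):
    print(f"{hits:>6} hits  {len(ips_by_referrer[referrer]):>4} IPs  {referrer}")
```

Anything showing hundreds of hits from only one or two IPs is a good candidate for your block list.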
Once you are done, resubmit your sitemap and give it some time. I hope this helps you recover the lost DA; it has worked for me.
Regards,
Ravi Kumar Rana
TheSEOGuy
-
@lisababblebird said in Domain Authority dropped 10 points and I don't know why:
From a quick look, you've got 53-ish links, and many of them are questionable at best, or directory links.
So link quality may be an issue. Also, did you manually disavow links? If so, you probably just brought unwanted attention to your site.
We never tell clients to disavow on their own... only if they receive a manual penalty do we take that step.
Instead, focus on building genuinely valuable links; Google and other search engines have gotten pretty good at recognizing and ignoring spam links on their own.
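If you want to go through those links systematically rather than eyeballing them, here's a rough Python sketch. It assumes you've exported your inbound links to a CSV; the column names ("source_url", "spam_score"), the threshold, and the keyword hints are placeholders, so adjust them to whatever your link tool actually exports.

```python
# Rough sketch: flag potentially low-quality inbound links from a CSV export.
# Assumes columns named "source_url" and "spam_score" -- adjust both to match
# the file your link tool actually produces.
import csv
from urllib.parse import urlparse

SUSPECT_HINTS = ("directory", "links", "listing", "bookmark")  # placeholder patterns

def review_links(path):
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get("source_url", "")
            host = urlparse(url).netloc.lower()
            spam_score = float(row.get("spam_score") or 0)
            if spam_score >= 30 or any(hint in host for hint in SUSPECT_HINTS):
                flagged.append((spam_score, host, url))
    return sorted(flagged, reverse=True)

if __name__ == "__main__":
    for score, host, url in review_links("inbound_links.csv"):
        print(f"{score:>5.1f}  {host}  {url}")
```

Anything this flags is only a candidate for a manual review; the point is to see the pattern, not to auto-disavow.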
Related Questions
-
Why can't Google's mobile-friendly test access my website?
I'm getting the following error when trying to use Google's mobile-friendly tool: "Page cannot be reached. This could be because the page is unavailable or blocked by robots.txt" (user agent: Googlebot smartphone). I don't have anything blocked by robots.txt or a robots tag, and I can render my pages with Google Search Console's Fetch and Render, so what could be the reason the tool can't access my website? Also, the mobile usability report in Search Console works but reports very little, and the Google speed test doesn't work either. Any ideas as to what the reason is and how to fix it?
Technical SEO | Nadav_W
-
Why isn't my homepage #1 when searching for my brand name?
Hi! We recently (a month ago) launched a new website. We have great content that updates every day, we're active on social platforms, and we've done all that's currently possible when it comes to on-site optimization (a web developer will join our team this month and help us fix the rest). When I search for our brand name, all our social profiles come up first, followed by a few inner pages from our different news sections, but our homepage is somewhere on the second results page. What might be the reason for that? Is it just a matter of time, or is there a problem with our homepage that I'm unable to find? Thanks!
Technical SEO | Orly-PP
-
What is the best method to block a sub-domain, e.g. staging.domain.com, from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing a robots.txt on staging.domain.com with something like:

User-agent: *
Disallow: /

for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9
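One note on the robots.txt worry above: robots.txt is fetched per host, so a Disallow rule served only at staging.domain.com/robots.txt shouldn't affect www.domain.com. If you want to confirm exactly what each host is serving before relying on that, here's a rough Python sketch (the hostnames are the placeholders from the question, not real sites).

```python
# Rough sketch: fetch robots.txt from each host to confirm that the staging
# rules are not also being served on the main www host.
import urllib.request

HOSTS = ["staging.domain.com", "www.domain.com"]  # placeholder hostnames

for host in HOSTS:
    url = f"https://{host}/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            body = response.read().decode("utf-8", errors="replace")
        print(f"--- {url} ---\n{body}")
    except Exception as exc:  # DNS failure, 404, TLS error, etc.
        print(f"--- {url} ---\ncould not fetch: {exc}")
```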