Why is my DA going down despite doing everything right?
-
I have been doing everything I possibly could on my blog: tracking down and deleting broken links, writing for endless other publications to build my link profile, and keeping an easy-to-read site with regularly updated content. Despite this, my DA has gone down by two. It's infuriating and I don't know why this is happening - does anyone have any ideas?
-
A decrease in your Domain Authority (DA) despite your efforts to improve it can be frustrating. Several factors could contribute to this situation:
Algorithm updates: Search engines like Google regularly update their algorithms, which can affect how they evaluate websites and determine DA.
Competitor activity: If your competitors are actively improving their own websites, it could impact your relative position and DA.
Technical issues: Issues with your website's technical performance, such as slow loading times or broken links, can negatively impact your DA.
Content quality: Despite your efforts, if your content quality doesn't meet the standards expected by search engines or your audience, it could affect your DA.
Backlink quality: Low-quality or spammy backlinks pointing to your website can harm your DA. It's essential to focus on acquiring high-quality backlinks from reputable sources.
Changes in search trends: Shifts in user behavior or search trends can impact your website's visibility and, consequently, your DA.
To address this issue, continue focusing on best practices for SEO, regularly monitor your website's performance, and adapt your strategies as needed. Conduct a thorough analysis of your website and its performance metrics to identify any specific areas that may need improvement. Additionally, consider seeking guidance from SEO professionals who can provide personalized advice based on your website's unique situation.
-
This happens when your website's backlinks have dropped. You need to create new backlinks to boost your Domain Authority back up. Here's the complete guide for you.
-
First mistake: thinking you're doing everything right.
If your DA is going down, it's because something isn't right. Assuming Google made a mistake rating your page is not the right way to think about it. The other answer says "SEO is a black box and sometimes weird things happen without explanation." That's bullsh***. Maybe you can't see it, but the explanation is there.
That said, what I can tell you is that you may have been using practices considered black hat without knowing it. Pay special attention to external links. Used well, they can bring you authority, but used wrong they can hurt. Learn when to use dofollow or nofollow links and how to use anchor text.
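As a quick illustration (the URLs below are just placeholders), the only difference between the two is the rel attribute on the anchor:

```html
<!-- dofollow: a plain link, passes authority to the target by default -->
<a href="https://example.com/useful-resource">descriptive anchor text</a>

<!-- nofollow: asks search engines not to pass authority through this link -->
<a href="https://example.com/untrusted-page" rel="nofollow">descriptive anchor text</a>
```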
Check the user experience: your site's loading speed, average bounce rate, average engagement time. All of those signals tell Google whether users are enjoying your stuff, so check that they're at healthy levels.
Sometimes there are things outside your control that can affect your DA. For example, if you're on shared hosting, maybe one of your "host neighbours" is doing black hat SEO, or runs a porn or violent-content site. In that case you all share the same IP, so you all get penalized. You can check who you're sharing a host with; there are free online tools that will tell you in seconds. Another possibility is someone from the competition trying to hurt you on purpose. They can link to you from bad-reputation sites using dofollow links, and that can hurt. Check your incoming links in Search Console; you may find something suspicious.
Hope it helped
-
Of course, the domain is wandering-everywhere.com. I haven't used any black hat techniques to my knowledge, and I don't believe I have any spammy links. It's so low as it is, and I'm really not sure what more I could be doing.
-
Have any black hat techniques been carried out on your domain? Have you checked for spammy links and used the disavow tool?
Can you share the domain?
DA is a long-term game; I've seen websites go down and then spike back up. SEO is a black box and sometimes weird things happen without explanation.
Related Questions
-
How to check why our DA is gradually dropping
Greetings fellow Moz Community, first time asking a question here so I'm kind of excited to be part of the community. I hired an SEO firm about three months ago. They have arranged guest posts for us on high-ranking, high-DA sites, as well as many other small, low-DA links. None of these links have any kind of spam score. However, for the two months they have been helping, our website's DA has actually dropped by one point each month. Is this cause for concern? Any thoughts on how I can trace the drop? On a side note, out of the 700+ keywords that were ranking in the top 50, about 200 of them are now gone compared to last month. I don't know if this is related or not, but something looks awry.
Technical SEO | PhillySEO
-
Despite proper hreflang and lang attribute implementation using xml sitemaps, I'm seeing sitelinks from different countries. Any help please?
When someone searches for our brand in the US, instead of US-only links, users are served Canadian or Iranian sitelinks. This happens despite the fact that we have properly implemented XML sitemaps with hreflang, and we have even implemented the lang attribute in the head section of the source code for every country. I'd be thankful for any advice.
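For reference, hreflang annotations in an XML sitemap generally look like the sketch below, with one block per URL inside a <urlset> that also declares the xhtml namespace; the URLs and locales here are placeholders, not the site's real ones:

```xml
<url>
  <loc>https://www.example.com/us/</loc>
  <!-- every language/country version lists all of its alternates, including itself -->
  <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/us/"/>
  <xhtml:link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/"/>
  <xhtml:link rel="alternate" hreflang="fa-ir" href="https://www.example.com/ir/"/>
</url>
```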
Technical SEO | eset
-
DA/PA has reverted to 1?
Hi, looking for some advice. I have a local business website that was built and managed by a web developer. The site was/is very basic and really was only there as a place for potential customers to visit after finding out about us via more traditional local marketing. I decided to make the website work harder for us and improve the SEO etc. to get it ranking better and finding us customers, rather than us sending customers to the website. Long story short, I wanted to change from an HTML site to a WordPress site to give me more control over updates/blogging etc. The web developer said he only works with HTML, so I decided to go it alone. As things stand the website hasn't been changed and still remains hosted by the developer, but on the 12th of February he transferred the domain to me. Now I'm not sure exactly what my DA was in February, but it was at least double figures, and now it is 1. As I said, the only thing that has changed, as far as I'm aware, is the transfer of the domain to me. I'm at the point where I'm close to doing the transfer over to WordPress. I've been working on keywords, content etc. to make things better, but then noticed my issue. Anybody have any ideas why this would have happened? Or the process I can go through to find the root of the problem before I continue with the changeover? Thanks
Technical SEO | icandoit
-
Do I need to specify the country in the lang tag? Is this going to affect my ranking?
Currently my website just has the language on its own in the lang attribute. I am wondering if it is worth changing it to include the country as well. Will it change anything?
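For context, the choice being asked about is between a language-only value and a language-plus-country value on the html element; the values below are illustrative, since the original codes aren't shown in the question:

```html
<!-- language only -->
<html lang="en">

<!-- language plus country/region subtag -->
<html lang="en-GB">
```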
Technical SEO | Sally94
-
High-DA URL rewrite to your URL... would it increase the ranking of a website?
Hi, my client uses a recruiting management tool called njoyn.com. The URL of his site looks like www.example.njoyn.com. Would it increase his ranking if I used that URL, which sits on the njoyn domain with its high DA, and rewrote it to his site www.example.com? If yes, how? Thanks
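If the goal really is to point the njoyn-hosted address at the main site, and assuming you have access to that subdomain's server configuration (a big assumption, since njoyn.com controls it), a minimal Apache sketch of a 301 would look like this; note that a redirect only passes the equity of the redirected URLs, not the whole njoyn domain's authority:

```apache
# hypothetical .htaccess on www.example.njoyn.com; assumes you control that host
Redirect 301 / https://www.example.com/
```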
Technical SEO | bigrat95
-
Disavow everything or manually remove bad links?
Our site is likely suffering an algorithmic penalty from a high concentration of non-branded anchor text, which I am painstakingly cleaning up at the moment. Incremental clean-ups don't seem to be doing much. Google recommends I 'take a machete to them' and basically remove or disavow as much as possible, which I am now seriously considering as an option. What do you guys recommend: should I torch the earth (disavow all links with that anchor text) or keep it on life support (slowly and manually identify each bad link)?
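For reference, the disavow tool takes a plain-text file with one entry per line, either a full URL or a whole domain, plus optional comments; the domains below are placeholders:

```text
# links tied to the non-branded anchor-text cleanup (example entries only)
domain:spammy-directory.example
domain:low-quality-blog.example
https://another-site.example/page-with-bad-link.html
```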
Technical SEO | Syed_Raza
-
Duplicate pages in Google index despite canonical tag and URL Parameter in GWMT
Good morning Moz... This is a weird one. It seems to be a "bug" with Google, honest... We migrated our site www.three-clearance.co.uk to a Drupal platform over the new year. The old site used URL-based tracking for heat map purposes, so for instance www.three-clearance.co.uk/apple-phones.html could be reached via www.three-clearance.co.uk/apple-phones.html?ref=menu or www.three-clearance.co.uk/apple-phones.html?ref=sidebar and so on. GWMT was told about the ref parameter, and the canonical meta tag was used to indicate our preference. As expected we encountered no duplicate content issues and everything was good.
This is the chain of events:
Site migrated to the new platform following best practice, as far as I can attest to. The only known issue was that the verification for both Google Analytics (meta tag) and GWMT (HTML file) didn't transfer as expected, so between the relaunch on the 22nd Dec and the fix on 2nd Jan we have no GA data, and presumably there was a period where GWMT became unverified.
URL structure and URIs were maintained 100% (which may be a problem, now).
Yesterday I discovered 200-ish 'duplicate meta titles' and 'duplicate meta descriptions' in GWMT. Uh oh, thought I. Expand the report out and the duplicates are in fact ?ref= versions of the same root URL. Double uh oh, thought I.
Run, not walk, to Google and do some Fu: http://is.gd/yJ3U24 (9 versions of the same page in the index, the only variation being the ?ref= URI). Checked Bing and it has indexed each root URL once, as it should.
Situation now:
The site no longer uses the ?ref= parameter, although of course there still exist some external backlinks that use it. This was intentional and happened when we migrated.
I 'reset' the URL parameter in GWMT yesterday, given that there's no "delete" option. The "URLs monitored" count went from 900 to 0, but today it is at over 1,000 (another wtf moment).
I also resubmitted the XML sitemap and fetched 5 'hub' pages as Google, including the homepage and the HTML site-map page.
The ?ref= URLs in the index have the disadvantage of actually working, given that we transferred the URL structure and the webserver just ignores the nonsense arguments and serves the page. So I assume Google assumes the pages still exist and won't drop them from the index, but will instead apply a dupe content penalty. Or maybe call us a spam farm. Who knows.
Options that occurred to me (other than maybe making our canonical tags bold or locating a Google bug submission form 😄) include:
A) robots.txt-ing *?ref=*, but to me this says "you can't see these pages", not "these pages don't exist", so it isn't correct
B) Hand-removing the URLs from the index through a page removal request per indexed URL
C) Applying a 301 to each indexed URL (hello Bing dirty sitemap penalty)
D) Posting on SEOMoz because I genuinely can't understand this.
Even if the gap in verification caused GWMT to forget that we had set ?ref= as a URL parameter, the parameter was no longer in use, because the verification only went missing when we relaunched the site without this tracking. Google is seemingly 100% ignoring our canonical tags as well as the GWMT URL setting. I have no idea why and can't think of the best way to correct the situation. Do you? 🙂
Edited To Add: As of this morning the "edit/reset" buttons have disappeared from the GWMT URL Parameters page, along with the option to add a new one. There are no messages explaining why, and of course the Google help page doesn't mention disappearing buttons (it doesn't even explain what 'reset' does, or why there's no 'remove' option).
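For anyone following along, the canonical element being ignored here is the standard form shown below, served on every ?ref= variant and pointing at the clean URL (using one of the URLs from the question, with the scheme assumed); option A would be a robots.txt rule along the lines of Disallow: /*?ref=, which, as the poster notes, only blocks crawling rather than removing already-indexed pages:

```html
<!-- on www.three-clearance.co.uk/apple-phones.html?ref=menu (and every other ?ref= variant) -->
<link rel="canonical" href="http://www.three-clearance.co.uk/apple-phones.html" />
```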
Technical SEO | Tinhat
-
Are they going to deindex everyone?
Looking at the over-optimisation list, it pretty much seems to me like something everyone is doing, so are we suddenly going to find the best results actually de-indexed? Maybe Google will slowly shut off indexing stage by stage so everyone changes. What are your thoughts?
Technical SEO | photogaz