Why is my DA going down despite doing everything right?
-
I have been doing everything I possibly can on my blog: tracking down and deleting broken links, writing for endless other publications to build my link profile, and keeping an easy-to-read site with regularly updated content. Despite this, my DA has gone down by two. It's infuriating and I don't know why it's happening - does anyone have any ideas?
-
A decrease in your Domain Authority (DA) despite your efforts to improve it can be frustrating. Several factors could contribute to this situation:
Algorithm updates: Search engines like Google regularly update their algorithms, which can affect how they evaluate websites and determine DA.
Competitor activity: If your competitors are actively improving their own websites, it could impact your relative position and DA.
Technical issues: Issues with your website's technical performance, such as slow loading times or broken links, can negatively impact your DA.
Content quality: Despite your efforts, if your content quality doesn't meet the standards expected by search engines or your audience, it could affect your DA.
Backlink quality: Low-quality or spammy backlinks pointing to your website can harm your DA. It's essential to focus on acquiring high-quality backlinks from reputable sources.
Changes in search trends: Shifts in user behavior or search trends can impact your website's visibility and, consequently, your DA.
To address this issue, continue focusing on best practices for SEO, regularly monitor your website's performance, and adapt your strategies as needed. Conduct a thorough analysis of your website and its performance metrics to identify any specific areas that may need improvement. Additionally, consider seeking guidance from SEO professionals who can provide personalized advice based on your website's unique situation.
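On the backlink-quality point: if an audit turns up spammy links that you cannot get removed at the source, Google Search Console accepts a plain-text disavow file. A minimal sketch of the format (the domains and URL here are invented examples):

```text
# Lines starting with "#" are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow links from one specific page only:
https://link-farm.example/widgets-page.html
```

Use it cautiously - disavowing good links can do more damage than the spammy links themselves.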
-
This happens when your website's backlinks have dropped. You need to build new backlinks to boost your Domain Authority. Here's the complete guide for you.
-
First mistake: thinking you're doing everything right.
If your DA is going down, it's because something isn't right. Thinking that Google made a mistake rating your page is the wrong mindset. The other answer says: "SEO is a black box and sometimes weird things without explanation happen". That's bullsh***. Maybe you can't see it, but the explanation is there.
That said, what I can tell you is that you may have been using practices considered black hat without knowing it. Pay special attention to your external links. Used well, they can bring you authority, but used wrongly, they can hurt you. Learn when to use dofollow or nofollow and how to use anchor text.
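To make the dofollow/nofollow point concrete, here is a minimal sketch, using only Python's standard-library html.parser, that lists a page's outbound links and whether each one carries rel="nofollow". The sample HTML and URLs are invented for illustration:

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect each <a> link and whether it is marked nofollow."""

    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        # rel can hold several space-separated tokens, e.g. "nofollow sponsored"
        rel_tokens = (attrs.get("rel") or "").lower().split()
        self.links.append((href, "nofollow" in rel_tokens))

sample = """
<p><a href="https://trusted.example/guide">an editorial link</a>
<a href="https://ads.example/offer" rel="nofollow sponsored">a paid link</a></p>
"""

auditor = LinkAuditor()
auditor.feed(sample)
for href, nofollow in auditor.links:
    print(href, "nofollow" if nofollow else "dofollow")
```

In practice you would feed it the fetched HTML of your own pages and review any external link that comes back "dofollow".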
Check the user experience: your site's loading speed, average bounce rate, and average engagement time. All of those parameters tell Google whether users are enjoying your content, so check that they are at healthy levels.
Sometimes there are things outside your control that can affect your DA. For example, if you're on shared hosting, one of your "host neighbours" may be doing black hat SEO, or may be a porn or violent-content site. In that case you all share the same IP, so you all get penalized. You can check who you're sharing a host with; there are free online tools that tell you that in seconds. Another possibility is someone from the competition trying to hurt you on purpose: they can link to you from bad-reputation sites using dofollow links, and that can hurt. Check your incoming links in your Search Console - you may find something suspicious.
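For the shared-hosting check, the first step - finding the IP address your site resolves to - needs nothing beyond Python's standard library; listing the other domains parked on that IP then requires one of the free reverse-IP lookup services mentioned above. A small sketch (the domain here is a placeholder so it runs anywhere):

```python
import socket

def server_ip(domain: str) -> str:
    """Resolve a domain to the IPv4 address serving it.

    On shared hosting, many unrelated sites can sit behind this
    one address, so any penalty risk can be shared too.
    """
    return socket.gethostbyname(domain)

# "localhost" is used so the sketch works offline;
# substitute your own domain in practice.
print(server_ip("localhost"))
```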
Hope this helps.
-
Of course - the domain is wandering-everywhere.com. I haven't used any black hat techniques to my knowledge, and I don't believe I have any spammy links. It's already so low, and I'm really not sure what more I could be doing.
-
Have any black hat techniques been carried out on your domain? Have you checked for spammy links and used the disavow tool?
Can you share the domain?
DA is a long term game, I've seen websites go down and then spike up. SEO is a black box and sometimes weird things without explanation happen.
Related Questions
-
Please help (going bananas) trying to troubleshoot a sitemap submitted to Bing
We need help to figure out what seems to be an error in our sitemap.
Technical SEO | IMSvintagephotos
We have submitted the sitemap to Bing, and it includes 1.2 million pages that should be crawled. After the initial submission, Bing's dashboard says 1.2 million pages have been submitted. Then, always after 2-4 days, the number drops to either 500,000 pages or, like now, 250,000 pages. Why is that? Is there an error in our sitemap, such that Bing is excluding pages and lowering the submitted number after going through them and discovering the error? We need to figure this out and fix it so that Bing can crawl and index all 1.2 million of our pages. See the screenshot showing the Bing dashboard.
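One thing worth ruling out, since it fits the symptom of a large submitted count shrinking: the sitemap protocol caps a single sitemap file at 50,000 URLs, so 1.2 million pages must be split across many files tied together by a sitemap index. A minimal sketch of such an index (the child-sitemap paths are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://imsvintagephotos.com/sitemaps/pages-00001.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://imsvintagephotos.com/sitemaps/pages-00002.xml</loc>
  </sitemap>
</sitemapindex>
```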
We are also having issues with Google, but we can't figure out what is going on. Here are the sitemaps: https://imsvintagephotos.com/google_sitemap/sitemap.xml and https://imsvintagephotos.com/sitemap.xml. The website is www.imsvintagephotos.com.
-
Questions About The Right Hosting
Hi all, I have a few questions about the right type of hosting that I should be using. I understand that many people say we should use the best hosting we can afford. However, when I have a website with just 650 pages/posts, is it really worth worrying too much about where I am hosting? I am UK-based, so at the moment I am using a UK host along with a CDN. I have a unique IP address and am on a server with a limited number of websites on it. The main question is whether there is really any need to be looking at anything else. The truth is I have used cloud hosting before, and the website loaded more slowly around the world with that than it does with my current setup. Thanks
Technical SEO | TTGUK
-
Have done everything for my site but not yet ranking in top 10 of critical keywords
I have carried out some highly technical SEO: link building, social media promotion, on-page technical SEO, high-quality content, solid site architecture, a responsive theme for good user experience, keyword analysis, title tags, description tags, etc. But the site is not yet ranking even in the top 5 for my critical keywords "Bulk sms kenya" or "bulk sms in kenya". My site is at http://goo.gl/X9vaLT and I am based in Nairobi, Kenya. Any advice or tips? Especially since the keyword is not that highly competitive, with a Difficulty Score of only 24%, what might I be missing?
Technical SEO | ConnectMedia
-
Is The Sun's online newspaper right with their home page?
Hi, I made my home page shorter here, www.in2town.co.uk, because a couple of people on here said I should shorten it. But I have noticed that The Sun and other online newspapers have now made their home pages longer. Is this a good idea or a bad idea? Here is The Sun: http://www.thesun.co.uk/sol/homepage/ The reason I did it in the first place was to get as much info on the home page as possible, showing all the different sections on my site and letting Google, as well as our visitors, see how often the site gets updated. I would love to hear your thoughts on whether we should make our home page longer again, like The Sun, or whether you think that is a bad idea.
Technical SEO | ClaireH-184886
-
Duplicate pages in Google index despite canonical tag and URL Parameter in GWMT
Good morning Moz... This is a weird one. It seems to be a "bug" with Google, honest... We migrated our site www.three-clearance.co.uk to a Drupal platform over the new year. The old site used URL-based tracking for heat map purposes, so for instance www.three-clearance.co.uk/apple-phones.html ..could be reached via www.three-clearance.co.uk/apple-phones.html?ref=menu or www.three-clearance.co.uk/apple-phones.html?ref=sidebar and so on. GWMT was told of the ref parameter and the canonical meta tag used to indicate our preference. As expected we encountered no duplicate content issues and everything was good. This is the chain of events: Site migrated to new platform following best practice, as far as I can attest to. Only known issue was that the verification for both google analytics (meta tag) and GWMT (HTML file) didn't transfer as expected so between relaunch on the 22nd Dec and the fix on 2nd Jan we have no GA data, and presumably there was a period where GWMT became unverified. URL structure and URIs were maintained 100% (which may be a problem, now) Yesterday I discovered 200-ish 'duplicate meta titles' and 'duplicate meta descriptions' in GWMT. Uh oh, thought I. Expand the report out and the duplicates are in fact ?ref= versions of the same root URL. Double uh oh, thought I. Run, not walk, to google and do some Fu: http://is.gd/yJ3U24 (9 versions of the same page, in the index, the only variation being the ?ref= URI) Checked BING and it has indexed each root URL once, as it should. Situation now: Site no longer uses ?ref= parameter, although of course there still exists some external backlinks that use it. This was intentional and happened when we migrated. I 'reset' the URL parameter in GWMT yesterday, given that there's no "delete" option. The "URLs monitored" count went from 900 to 0, but today is at over 1,000 (another wtf moment) I also resubmitted the XML sitemap and fetched 5 'hub' pages as Google, including the homepage and HTML site-map page. 
The ?ref= URls in the index have the disadvantage of actually working, given that we transferred the URL structure and of course the webserver just ignores the nonsense arguments and serves the page. So I assume Google assumes the pages still exist, and won't drop them from the index but will instead apply a dupe content penalty. Or maybe call us a spam farm. Who knows. Options that occurred to me (other than maybe making our canonical tags bold or locating a Google bug submission form 😄 ) include A) robots.txt-ing .?ref=. but to me this says "you can't see these pages", not "these pages don't exist", so isn't correct B) Hand-removing the URLs from the index through a page removal request per indexed URL C) Apply 301 to each indexed URL (hello BING dirty sitemap penalty) D) Post on SEOMoz because I genuinely can't understand this. Even if the gap in verification caused GWMT to forget that we had set ?ref= as a URL parameter, the parameter was no longer in use because the verification only went missing when we relaunched the site without this tracking. Google is seemingly 100% ignoring our canonical tags as well as the GWMT URL setting - I have no idea why and can't think of the best way to correct the situation. Do you? 🙂 Edited To Add: As of this morning the "edit/reset" buttons have disappeared from GWMT URL Parameters page, along with the option to add a new one. There's no messages explaining why and of course the Google help page doesn't mention disappearing buttons (it doesn't even explain what 'reset' does, or why there's no 'remove' option).
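For anyone landing on this thread with the same problem, this is the shape of the canonical tag being discussed - placed in the <head> of every URL variant (including the ?ref= ones), pointing at the single preferred URL, shown here with the poster's example page:

```html
<head>
  <link rel="canonical" href="http://www.three-clearance.co.uk/apple-phones.html">
</head>
```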
Technical SEO | Tinhat
-
How do I get Google to index the right pages with the right keywords?
Hello, I notice that even though I have a sitemap, Google is indexing the wrong pages under the wrong keywords. As a result, the site is not as relevant and is not ranking properly.
Technical SEO | ursalesguru
-
Redirect everything from a certain URL
I have a new domain (www.newdomain.com) and an old domain (www.olddomain.com). Currently both domains are pointing (via DNS nameservers) at the new site. I want to 301 everything that comes from www.oldsite.com to www.newsite.com. I've used this .htaccess code:

RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.newsite\.com$
RewriteRule (.*) http://www.newsite.com/$1 [R=301,L]

This works fine and redirects if someone visits www.olddomain.com, but I want it to cover everything from the old domain, such as www.olddomain.com/archives/article1/, so that if any subpages etc. are visited on the old domain, the visitor is redirected to the new domain. Could someone point me in the right direction? Thanks
Technical SEO | EclipseLegal
-
Webmaster Tools lists a large number (hundreds) of different domains linking to my website, but only a few are reported in SEOMoz. Please explain what's going on?
Google's Webmaster Tools lists hundreds of links to my site, but SEOMoz only reports a few of them. I don't understand why that would be. Can anybody explain it to me? Is there someplace I can go to alert SEOMoz to this issue?
Technical SEO | dnfealkoff