When "pruning" old content, is it normal to see an drop in Domain Authority on Moz crawl report?
-
After reading several posts about the benefits of pruning old, irrelevant content, I went through a content audit exercise to kick off the year. The biggest category of changes so far has been to noindex and remove from the sitemap a number of blog posts from 2015/2016 that were very time-specific (i.e. software release details).
I assigned many of the old posts a new canonical URL pointing to the parent category. I realize it'd be ideal to point to a more relevant/current blog post, but could this be where I've gone wrong?
Another big change was to hide the old posts from the archive pages on the blog.
Any advice or experience from anyone who has done something similar would be much appreciated! It would be good to be reassured that I'm on the right track and that a slight drop is nothing to worry about.
If anyone is interested in having a look:
- https://vivaldi.com
- https://vivaldi.com/blog/snapshots [this is the category where changes have been made, primarily]
- https://vivaldi.com/blog/snapshots/keyboard-shortcut-editing/ [example of a pruned post]
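If it helps anyone doing a similar audit, here's a minimal sketch for spot-checking a pruned post (Python, assuming the third-party requests and beautifulsoup4 packages are installed; the URL is the example pruned post above). It fetches the page and reports its robots directives and canonical target:

```python
# Fetch a pruned post and report whether it carries a noindex (header or meta)
# and where its canonical points.
import requests
from bs4 import BeautifulSoup

def check_pruned_page(url: str) -> None:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    # noindex can be sent in the X-Robots-Tag header as well as the meta tag.
    header_robots = response.headers.get("X-Robots-Tag", "")
    meta_robots = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", attrs={"rel": "canonical"})

    print(f"URL: {url}")
    print(f"  HTTP status:  {response.status_code}")
    print(f"  X-Robots-Tag: {header_robots or '(none)'}")
    print(f"  meta robots:  {meta_robots.get('content', '') if meta_robots else '(none)'}")
    print(f"  canonical:    {canonical.get('href', '') if canonical else '(none)'}")

check_pruned_page("https://vivaldi.com/blog/snapshots/keyboard-shortcut-editing/")
```

A check like this makes it easy to see exactly which combination of directives (noindex, canonical target) each pruned post is actually serving.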
-
Someone from Moz will probably give better insight, but the DA drop may not have been caused by your changes at all; it may simply reflect an update to Moz's link index. If you read into how the DA metric works, I believe it is relative to their index and will fluctuate based on the data they hold, primarily about links and site quality as a whole.
I'd be surprised if pruning your content directly caused a negative impact on DA.
Related Questions
-
Sub Directories Domain & Page Crawl Depth
Hi, I just bought an old domain with good backlinks and authority; it formerly belonged to a technology product. I want to use this domain for my money site. The purpose of the website is to serve technology information such as WordPress tutorials and free software or drivers. I have also set up a subdirectory on this domain, like https://maindomain.com/subdirectory/, for free software downloads such as graphics drivers (NVIDIA or AMD). What do you think of this setup? Does it make sense? Also, I just added the domain to my campaign at Moz and the results show my subdirectory at a crawl depth of 6. Is that acceptable for a directory, or do I need to move the subdirectory's content closer to my main site? Thank you, I hope someone can clear up my confusion. Best regards, Matthew.
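If it helps to make "crawl depth" concrete, below is a rough sketch of how it can be measured: a breadth-first crawl from the homepage that counts how many clicks it takes to reach a URL under the subdirectory. This is Python, assuming the third-party requests and beautifulsoup4 packages are installed; https://maindomain.com/subdirectory/ is just the placeholder from the question, not a real site.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def click_depth(start_url, target_prefix, max_depth=8):
    """Breadth-first crawl from start_url; return the click depth at which a
    URL whose path starts with target_prefix is first reached, else None."""
    host = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([(start_url, 0)])
    while queue:
        url, depth = queue.popleft()
        if urlparse(url).path.startswith(target_prefix):
            return depth
        if depth >= max_depth:
            continue
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            # Stay on the same host and avoid revisiting pages.
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return None

# Placeholders from the question; swap in the real domain and path.
print(click_depth("https://maindomain.com/", "/subdirectory/"))
```

A depth of 6 simply means the crawler needed six clicks from the homepage to reach those URLs; linking to the subdirectory from pages closer to the homepage is usually enough to reduce it.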
-
Found a cache of old domain names, should I link or 301 redirect
We have found a cache of about 10 URLs, some of which are ranking above our main URL in the Google SERPs. What is the best course of action here?
a. Redirect all to the homepage?
b. Link all domains to the homepage?
c. Link all domains to select pages on the main site, being careful not to spam anchor text?
d. 301 redirect all to the main site?
Is there any disadvantage to your recommendation? Is a penalty likely to be incurred? I feel like we'd get the strongest increase in rankings with option c, but option d feels safer. Thanks in advance for your help!
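Before choosing, it can help to see what each of those old URLs currently returns. Here's a quick sketch (Python with the third-party requests package; the domain list is a placeholder for the ~10 URLs mentioned above) that fetches each one without following redirects and prints the status code and Location header:

```python
import requests

# Placeholder list; replace with the ~10 old URLs that were found.
old_domains = [
    "https://old-domain-example-1.com/",
    "https://old-domain-example-2.com/",
]

for domain in old_domains:
    try:
        resp = requests.get(domain, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{domain} -> request failed: {exc}")
        continue
    # A 301 with a Location header shows where the domain already points.
    location = resp.headers.get("Location", "(no redirect)")
    print(f"{domain} -> HTTP {resp.status_code}, Location: {location}")
```
-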
Best Way to Fix Duplicate Content Issues on a Blog If URLs Are Set to "Noindex"
Greetings Moz Community: I recently purchased a SEMrush subscription and used it to run a site audit. The audit detected 168 duplicate content issues, mostly relating to blog post tags. I suspect these issues may be due to canonical tags not being set up correctly. My developer claims that since these blog URLs are set to "noindex", the issues do not need to be corrected. My instinct would be to avoid any risk of duplicate content and to set up canonicalization correctly. In addition, even if these pages are set to "noindex", they are still passing PageRank. Furthermore, I don't know why a reputable company like SEMrush would flag these as errors if in fact they are not errors. So my question is: do we need to do anything with the flagged pages if they are already set to "noindex"? Incidentally, the site URL is www.nyc-officespace-leader.com. I am attaching a copy of the SEMrush audit. Thanks, Alan
-
"noindex, follow" or "robots.txt" for thin content pages
Does anyone have any testing evidence on which is better to use for pages with thin content that are nonetheless important to keep on a website? I am referring to content shared across multiple websites (such as e-commerce, real estate, etc.). Imagine a website with 300 high-quality pages indexed and 5,000 thin product-type pages that would not generate relevant search traffic. The question is: does the internal-linking value gained from "noindex, follow" outweigh the downside of Google having to crawl all those noindexed pages? With robots.txt, Google's crawling is focused on just the important pages that are indexed, which may give rankings a boost. Any experiments or insight into this would be great. I do get the advice about "make the pages unique", "get customer reviews and comments", etc., but the above is the important question here.
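To make the practical difference concrete, here's a small sketch (Python, using the standard-library robotparser plus the third-party requests and beautifulsoup4 packages; example.com and the URL pattern are placeholders, not a real site). A page disallowed in robots.txt is never fetched at all, so a "noindex, follow" tag on it would never even be seen and its links would not be crawled; a crawlable page with "noindex, follow" stays out of the index but its links can still be followed:

```python
from urllib import robotparser

import requests
from bs4 import BeautifulSoup

SITE = "https://www.example.com"  # placeholder domain
thin_pages = [f"{SITE}/products/item-{i}" for i in (1, 2, 3)]  # placeholder URLs

rp = robotparser.RobotFileParser(f"{SITE}/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for url in thin_pages:
    if not rp.can_fetch("Googlebot", url):
        # Blocked: the crawler never sees the page, its meta tags, or its links.
        print(f"{url}: disallowed by robots.txt")
        continue
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    directives = meta.get("content", "(none)") if meta else "(none)"
    # With "noindex, follow" the page stays out of the index, but links on it
    # can still be crawled, so internal link equity can still flow.
    print(f"{url}: crawlable, meta robots = {directives}")
```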
-
Root domain authority or page authority - which matters more
When does one matter more than the other? Any help would be greatly appreciated. Thanks! Matthew
-
I'm not sure why SEOMoz is reporting duplicate content
I have thousands of duplicate page content errors on my site, but I'm not exactly sure why. For example, the crawl reports that this page - http://www.fantasytoyland.com/2011-glee-costumes.html - is a duplicate of this page - http://www.fantasytoyland.com/2011-jersey-shore-costumes.html. All of the products on each page are unique to that page, so what is causing them to be flagged as duplicate content?
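One way to see why a crawler might pair those two pages is to compare their visible text directly. Below is a rough sketch (Python, assuming the third-party requests and beautifulsoup4 packages are installed; the two URLs are the ones from the question) that computes word-shingle overlap. If the shared template (header, navigation, footer, repeated copy) dwarfs the text that is unique to each category, the similarity score comes out high even though the product listings differ:

```python
import requests
from bs4 import BeautifulSoup

def shingles(url: str, size: int = 5) -> set:
    # Visible text of the page, chopped into overlapping word n-grams.
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    words = soup.get_text(separator=" ").split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

a = shingles("http://www.fantasytoyland.com/2011-glee-costumes.html")
b = shingles("http://www.fantasytoyland.com/2011-jersey-shore-costumes.html")
jaccard = len(a & b) / len(a | b) if a | b else 0.0
print(f"Shingle overlap (Jaccard): {jaccard:.0%}")
```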
-
Where do I redirect a domain to strengthen another domain?
I've got a UK domain that I need to redirect to a US domain. Should I point it to the root domain or to a landing page off the root, and what is the benefit of doing one over the other?
-
Redirecting One Page of Content on Domain A to Domain B
Let's say I have a nice page of content on Domain A, which is a strong domain. That page has a nice number of links from other websites and ranks on the first page of the SERPs for some good keywords. However, I would like to move that single page of content to Domain B using a 301 redirect. Domain B is a slightly weaker domain, however, it has better assets to monetize the traffic that visits this page of content. I expect that the rankings might slip down a few places but I am hoping that I will at least keep some of the credit for the inbound links from other websites. Has anyone ever done this? Did it work as you expected? Did the content hold its rankings after being moved? Any advice or philosophical opinions on this? Thank you!