Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Why do some domains and subdomains have the same DA, while others don't?
-
Hi
I noticed that some blog providers in my country, which give each blog a subdomain address, have subdomains whose authority is exactly the same as the main domain's. For other blog providers, though, every subdomain has its own, lower authority.
For example, "ffff.blog.ir" and "blog.ir" both have a domain authority of 60. It's worth mentioning that "ffff.blog.ir" doesn't even exist! Meanwhile, mihanblog.com and hfilm.mihanblog.com have different page authority scores.
-
Hey!
DA scores are specific to the root domain; we don't take the subdomain into account. So even if you look up a subdomain that doesn't exist (ffff.blog.ir), the DA score still reflects only the root domain (blog.ir), which does exist.
Page Authority, on the other hand, is specific to the exact page you are looking up, so it makes sense that mihanblog.com and hfilm.mihanblog.com would have different Page Authority scores, as they are separate pages.
Hope that helps, let me know if you have further questions.
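The root-domain behaviour described above can be sketched in a few lines. This is a hypothetical illustration, not Moz's actual code: it naively keeps the last two labels of a hostname, whereas a real implementation would need the Public Suffix List to handle multi-part TLDs like .co.uk correctly.

```python
# Hypothetical sketch: collapse any hostname to its "root domain" before a
# DA lookup. Naive two-label rule; real tools use the Public Suffix List.

def root_domain(hostname: str) -> str:
    """Strip subdomains, keeping only the last two labels."""
    labels = hostname.lower().rstrip(".").split(".")
    return ".".join(labels[-2:])

print(root_domain("ffff.blog.ir"))          # nonexistent subdomain still maps to blog.ir
print(root_domain("hfilm.mihanblog.com"))   # maps to mihanblog.com
print(root_domain("blog.ir"))               # already a root domain
```

This is why a made-up subdomain like ffff.blog.ir still "has" a DA of 60: the lookup never sees the subdomain at all.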
-
I'm pretty sure that Moz just unifies DA stats in some circumstances, when they have no 'actual' data for the subdomain in question. Very important sites which have a number of subdomains (but not hundreds or thousands) often show different DA metrics per subdomain. Some blog platforms (Blogger is a good example, as are "blogname.wordpress.com" WordPress-hosted blogs) have thousands or hundreds of thousands of subdomains, and Moz can't index all of them.
In these situations, if Moz comes across a subdomain which is well linked across the web, it will separate it out and assign it unique values. For the rest (for which Moz holds no data), it probably just unifies the DA metric with the root domain.
It's a symptom of an incomplete index of the web. That being said, no one has a complete index of the web - you have to work with what you've got.
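That fallback theory could be sketched as follows. Purely illustrative: the scores and the dict-based store are invented for the example and are not Moz's actual data model.

```python
# Illustrative sketch of the theory above: return a subdomain-specific DA when
# the index holds data for it, otherwise fall back to the root domain's score.
# All scores below are made up for the example.

KNOWN_DA = {
    "blog.ir": 60,
    "mihanblog.com": 55,
    "hfilm.mihanblog.com": 41,  # a well-linked subdomain with its own data
}

def lookup_da(hostname: str):
    """Return the DA for hostname, falling back to its root domain."""
    if hostname in KNOWN_DA:
        return KNOWN_DA[hostname]
    root = ".".join(hostname.split(".")[-2:])  # naive root-domain fallback
    return KNOWN_DA.get(root)

print(lookup_da("ffff.blog.ir"))         # no data, so it inherits blog.ir's 60
print(lookup_da("hfilm.mihanblog.com"))  # has its own entry, 41
```

Under this model, both behaviours the asker observed fall out naturally: unindexed subdomains inherit the root's DA, well-linked ones get their own.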
Related Questions
-
Can subdomains hurt your primary domain's SEO?
Our primary website https://domain.com has a subdomain https://subDomain.domain.com, and on that subdomain we have a Jive-hosted community, with a few links to and fro. In GA they are set up as different properties, but there are many SEO issues in the Jive-hosted site, in which many different people can create content, delete content, comment, etc. There are issues related to how Jive structures content, broken links, etc. My question is this: aside from the SEO issues with the subdomain, can the performance of that subdomain negatively impact the SEO performance and rank of the primary domain? I've heard and read conflicting reports about this, and it would be nice to hear from the Moz community about options to resolve such issues if they exist. Thanks.
Intermediate & Advanced SEO | BHeffernan1 -
Should I redirect a domain we control but which has been labeled 'toxic' or just shut it down?
Hi Mozzers: We recently launched a site for a client, which involved bringing in and redirecting content that had formerly been hosted on different domains. One of those domains still exists, and we have yet to bring over its content. It has also been flagged as a suspicious/toxic backlink source for our new domain. Would I be wise to redirect this old domain, or should I just shut it down? None of its pages seem to have particular equity as link sources. Part of me asks, 'Why would we redirect a domain deemed toxic? Why not just shut it down?' Thanks in advance, dave
Intermediate & Advanced SEO | Daaveey0 -
SEO implications of moving from a sub-folder to a root domain
I am considering a restructure of my site and was hoping for some input on the SEO implications, which I'm having trouble getting clarity on. (I will be using sample domains/URLs for language reasons; it's not an English site.) I'm thinking about moving a site (all content) from example.com/parenting -> parenting.com. This is to have a site fully devoted to this theme, and to more easily monitor and improve SEO performance on this content alone. Today all stats on external links, DA etc. relate to the root domain, not just this sub-section. Plus it would be a better brand experience for the content and site. Other info/issues: The domain parenting.com (used as an example) is currently redirected to example.com/parenting, so I would have to reverse that redirect, and would also redirect all articles to the new site. The current domain example.com has a high DA (67), but the new domain parenting.com has a much lower DA (24). Questions: Would parenting.com improve its DA once it is no longer redirected and the sub-folder on the high-DA domain is redirected to it instead? Would this change severely hurt SEO traffic, and if so, is there a strategy to make the move with as little loss in traffic as possible? How much value is there in having a stand-alone domain which is also one of the most important keywords for this theme? My doubt comes mostly from moving from a domain with high DA to a domain with much lower DA; I'm not sure how removing the redirect would change that, or whether placing a new redirect from the sub-folder on the current site would help improve it. Would some DA flow over with a 301 redirect? Thanks for any advice or hints to other documentation that might be of interest for this scenario 🙂
Intermediate & Advanced SEO | Magne_Vidnes0 -
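A move like the one described above is usually implemented as a per-URL 301 map from the old sub-folder to the new domain. A minimal sketch of that mapping, with hypothetical paths and assuming the path structure carries over one-to-one:

```python
# Hypothetical sketch: compute the 301 redirect target for an old sub-folder
# URL. Assumes paths under /parenting/ map one-to-one onto the new domain.

OLD_PREFIX = "https://example.com/parenting"
NEW_ROOT = "https://parenting.com"

def redirect_target(old_url: str):
    """Return the new-domain URL for an old sub-folder URL, or None if unmapped."""
    # Note: a production version should also guard against prefix collisions
    # like /parenting-other, e.g. by requiring a "/" or end-of-string next.
    if not old_url.startswith(OLD_PREFIX):
        return None
    suffix = old_url[len(OLD_PREFIX):] or "/"
    return NEW_ROOT + suffix

print(redirect_target("https://example.com/parenting/sleep-tips"))
# -> https://parenting.com/sleep-tips
```

Redirecting each old URL to its exact new counterpart (rather than everything to the new homepage) is generally how link equity is preserved in a move like this.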
Changed all external links to 'NoFollow' to fix manual action penalty. How do we get back?
I have a blog that received a Webmaster Tools message about a guidelines violation because of "unnatural outbound links" back in August. We added a plugin to make all external links 'NoFollow' links and Google removed the penalty fairly quickly. My question, how do we start changing links to 'follow' again? Or at least being able to add 'follow' links in posts going forward? I'm confused by the penalty because the blog has literally never done anything SEO-related, they have done everything via social and email. I only started working with them recently to help with their organic presence. We don't want them to hurt themselves at all, but 'follow' links are more NATURAL than having everything as 'NoFollow' links, and it helps with their own SEO by having clean external 'follow' links. Not sure if there is a perfect answer to this question because it is Google we're dealing with here, but I'm hoping someone else has some tips that I may not have thought about. Thanks!
Intermediate & Advanced SEO | HashtagJeff0 -
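Plugins of the kind mentioned above work by deciding which anchors point off-site and tagging those with rel="nofollow". A rough stdlib sketch of that core decision (illustrative only; real plugins also rewrite the markup and handle edge cases like subdomains and protocol-relative URLs):

```python
# Sketch of the core decision a "nofollow external links" plugin makes:
# which <a href> targets count as external and should get rel="nofollow".
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalLinkFinder(HTMLParser):
    def __init__(self, site_host: str):
        super().__init__()
        self.site_host = site_host
        self.external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links have no host; same-host links stay "follow".
        if host and host != self.site_host:
            self.external.append(href)

html = '<p><a href="/about">About</a> <a href="https://other.com/x">Ext</a></p>'
finder = ExternalLinkFinder("myblog.com")
finder.feed(html)
print(finder.external)  # only the off-site link
```

To move back toward selective "follow" links, the same check could be inverted: nofollow by default, with an allowlist of trusted hosts that keep plain links.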
Can't generate a sitemap with all my pages
I am trying to generate a sitemap for my site nationalcurrencyvalues.com, but none of the tools I have tried get all my 70,000 HTML pages... I have found that the one at check-domains.com crawls all my pages, but when it writes the XML file most of them are gone... seemingly at random. I have used this same site before and it worked without a problem. Can anyone help me understand why this is, or point me to a utility that will map all of the pages? Kindly, Greg
Intermediate & Advanced SEO | Banknotes0 -
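One likely culprit with a site this size: the sitemaps.org protocol caps a single sitemap file at 50,000 URLs, so a 70,000-page site needs at least two files (plus a sitemap index referencing them). If third-party tools keep dropping pages, a small script can generate the files directly from a URL list; a minimal sketch using only the standard library:

```python
# Minimal sitemap generator: splits a URL list into files of at most
# 50,000 URLs each, per the sitemaps.org protocol limit.
from xml.sax.saxutils import escape

MAX_URLS = 50_000

def build_sitemaps(urls):
    """Return a list of sitemap XML strings, each holding <= 50,000 URLs."""
    sitemaps = []
    for i in range(0, len(urls), MAX_URLS):
        chunk = urls[i:i + MAX_URLS]
        entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in chunk)
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>"
        )
    return sitemaps

# 70,000 hypothetical URLs split into 2 sitemap files
urls = [f"https://nationalcurrencyvalues.com/page{n}.html" for n in range(70_000)]
files = build_sitemaps(urls)
print(len(files))
```

The protocol also caps each file at 50MB uncompressed, so very long URLs can force a lower per-file count than 50,000.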
Can't crawl website with Screaming Frog... what is wrong?
Hello all - I've just been trying to crawl a site with Screaming Frog and can't get beyond the homepage. I have done the usual stuff (turning off JS and so on) and there are no problems there with the nav and so on - the site's other pages have indexed in Google, btw. Now I'm wondering whether there's a problem with this robots.txt file, which I think may be auto-generated by Joomla (I'm not familiar with Joomla...) - are there any issues here? [just checked... and there isn't!] If the Joomla site is installed within a folder such as at e.g. www.example.com/joomla/ the robots.txt file MUST be moved to the site root at e.g. www.example.com/robots.txt AND the joomla folder name MUST be prefixed to the disallowed path, e.g. the Disallow rule for the /administrator/ folder MUST be changed to read Disallow: /joomla/administrator/ For more information about the robots.txt standard, see: http://www.robotstxt.org/orig.html For syntax checking, see: http://tool.motoricerca.info/robots-checker.phtml
User-agent: *
Disallow: /administrator/
Disallow: /bin/
Disallow: /cache/
Disallow: /cli/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /layouts/
Disallow: /libraries/
Disallow: /logs/
Disallow: /modules/
Disallow: /plugins/
Disallow: /tmp/
Intermediate & Advanced SEO | McTaggart0 -
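One quick way to rule robots.txt in or out is to test it programmatically: Python's stdlib robotparser can check whether a given user agent may fetch a path. A sketch using a shortened version of the Joomla defaults quoted above (any crawler name falls under the `*` rule here, since no agent-specific rules exist):

```python
# Check what a robots.txt (like the Joomla default quoted above) actually
# allows, using Python's built-in parser.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The homepage and ordinary pages are allowed; only the listed folders are blocked.
print(rp.can_fetch("Screaming Frog SEO Spider", "/"))
print(rp.can_fetch("Screaming Frog SEO Spider", "/administrator/"))
print(rp.can_fetch("Screaming Frog SEO Spider", "/some-article"))
```

If the file passes a check like this, the single-page crawl is more likely caused by something else, such as JS-rendered navigation or the crawler being served a different response than a browser.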
My website hasn't been cached for over a month. Can anyone tell me why?
I have been working on an eCommerce site, www.fuchia.co.uk. I asked an earlier question about how to get it working and ranking, took on board what people said (such as optimising product pages etc...), and I think I'm getting there. The problem I have now is that Google hasn't indexed my site in over a month, and the homepage cache is 404'ing when I check it on Google. At the moment there is a problem with the site being live for both the WWW and non-WWW versions; I have told Google in Webmaster Tools which preferred domain to use, and will also be getting the developers to 301 to the preferred domain. Could this be the problem stopping Google from properly indexing me? Also, only around 30 of 137 pages were indexed from the last crawl. Can anyone tell me or suggest why my site hasn't been indexed in such a long time? Thanks
Intermediate & Advanced SEO | SEOAndy0 -
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404s). Is there anything else I need to do to remove these?
Intermediate & Advanced SEO | nicole.healthline0