Subdomain Research Tool
-
Does anybody know of a research tool that can track the number of subdomains on a root domain?
Maybe there is a way to manipulate a Google search to display the different subdomains that are indexed?
-
Cool, looking forward to the ad-hoc query tool!
-
Thanks for the update, Jon! Looking forward to the updates to come and to being able to manipulate the data in new ways. It should be exciting to figure out new ways to explore things like subdomain counts.
Thanks again!
-
Hey guys!
So I spoke to the MozScape team, and right now this is not possible. Although in theory we have the data to answer this question, the sheer size of the dataset forces the engineering teams to make data-structure and optimization decisions that favor certain use cases (e.g. 'show me all of the external followed links for a root domain'). Currently, MozScape is not optimized to answer the use case 'show me all of the subdomains on a given root domain'.
However, as you may know, we are working on index updates that will change the way we store data. This is a huge project, but once it is completed we will be able to run ad-hoc queries against our data and solve use cases like this one.
Hope this helps!
Jon
-
This would be a very handy addition. As Michael said, using search qualifiers is no quick (or absolute) solution. This would be a great tool for site audits, which is what I was looking for. It would also be useful for researching other sites (that have been crawled by Moz) to quickly surface blog subdomains or other-language subdomains.
Anyway, thanks for the response - looking forward to seeing if this becomes an available tool!
-
Thanks Michael - great idea! I could see this fitting into the MozBar, or maybe OSE. Let me do some digging into how our index is structured, and I'll post an update here on feasibility.
-
That's a good question... I've not seen a tool that magically does all of that, but Moz could certainly derive it from the data they gather when they crawl. I'll pass the idea along to Jon White.
I would use this tool myself during site audits, when I'm looking to see if the client's site has subdomains other than www that might be worth consolidating onto the www subdomain.
Today, I do it arduously with Google site: plus -inurl: queries, e.g.
site:acme.com -inurl:www -inurl:blog
and then, when I see a new subdomain appear (e.g. news.acme.com), I append -inurl:news to the site: search.
This doesn't work if the client has decided that the www-less version of their domain is their preferred one...in this case, I'm totally SOL.
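For what it's worth, the query-building part of that iterative process can be scripted. Here is a minimal sketch (the helper name `build_query` is hypothetical; it only assembles the query string for you to paste into Google, it does not scrape results):

```python
def build_query(root_domain, excluded_subdomains):
    """Build a Google search query that hides already-seen subdomains.

    Mirrors the manual process described above: start from a site:
    query and append one -inurl: exclusion per subdomain found so far.
    """
    parts = [f"site:{root_domain}"]
    parts += [f"-inurl:{sub}" for sub in excluded_subdomains]
    return " ".join(parts)

# Start by hiding www and blog, then add each new subdomain as it appears:
query = build_query("acme.com", ["www", "blog"])
# -> "site:acme.com -inurl:www -inurl:blog"
```

Each time a new subdomain shows up in the results, add it to the exclusion list and re-run the search with the new string.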
Related Questions
-
Blocking Competitive Analysis Tools in robots.txt... Worth It?
I've been considering blocking third-party crawlers for a while – specifically those crawling my website for the sake of competitive analysis, such as SEMrush and Ahrefs. I'm familiar with how to do so, but when researching the question I found practically no one asking it. The guides I've found on what to put in your robots.txt make no mention of whether to block competitive analysis crawlers, which makes me wonder whether this is a good idea after all. My chief concern is rival sites going after the same search terms we target – one of our competitors in particular has an uncanny way of targeting the same searches we do. I know blocking crawlers won't prevent competitors from watching our content, but it will make it slightly harder for them. Is there any major drawback I'm missing? Any big reason not to go ahead and block SEO analysis crawlers?
Competitive Research | davidwaring0 -
Traffic Data for Competitor Subdomain
Does anyone know of a way to get monthly visitor data for a subdomain that I do not manage? I would normally use SEMrush or Compete, but they only provide domain-level data.
Competitive Research | dsinger0 -
Which analytics tools do you use?
If you are interested in web analytics, could you tell me which tools (software) you use? Not just Google Analytics, but also other tools (e.g. for heatmaps, Facebook, internal statistics, etc.). I'm also interested in testing (A/B, etc.), so you can mention those tools too. Thanks 🙂
Competitive Research | mysho0 -
Keyword difficulty/research question
Wondering if I could get some opinions from fellow Moz users. I have a website which I wish to rank for the term 'evening dress'. As you can imagine, it is a pretty difficult term, with a score of 62% (the term gets 301,000 broad-match and 27,000 exact-match searches a month). As much as I would like to target this term, I feel that my domain is not strong enough (DA 39) to match the competition. Therefore, would a better strategy be to target long-tail keywords which also contain the primary keyword, i.e. black evening dress, evening dress hire, cheap evening dress, buy evening dress online? (Please note that these are just examples; I haven't researched a comprehensive long-tail list.) Would targeting these long-tail keywords mean that a) I should be able to rank for them faster and thus receive more traffic sooner, and b) I would build up links to the page which I ultimately want to rank for 'evening dresses', with numerous backlinks containing the keyword 'evening dress'? The trade-off is that I would need to SEO one page for all the long-tail keywords to gain the maximum benefit for the 'money' keyword. Does this sound like a sensible approach to both ranking for the big-money term and getting traffic sooner rather than later? Thanks, Carl
Competitive Research | Grumpy_Carl0 -
Tool for finding what keywords a competitor ranks for?
Does anyone know of any good tools that display what keywords a competitor ranks for? I have many competitors that I know get a lot of traffic, but I'm not entirely sure where that traffic comes from, so it would be nice to plug in their URL and get a general overview of what keywords they rank for and in what positions.
Competitive Research | shawn810 -
Subdomain Metrics
Hi all, I have been looking in the forums but could not quite find an appropriate answer to my question. I have a site that has no subdomains; in my .htaccess, non-www redirects to www. So I would think that the subdomain metrics in SEOmoz would be exactly the same for the root domain and the subdomain. However, it appears that I get almost all the ticks for my site over my competition on the root domain, but do poorly in the subdomain area compared to the competition. Also, it shows that I only have 2 links for everything in the subdomain section. Am I reading something wrong, or is this correct? Thanks in advance
Competitive Research | cchhita1 -
How accurate is the Keyword Difficulty Tool for international markets (specifically Australia)?
The difficulty percentage on several keywords is identical for Google.com and Google.com.au, and I am wondering why.
Competitive Research | davidangotti0 -
How do you perform competitive research for SEO?
What metrics tell you the most when you're looking at your competitors across the search landscape? PageRank/MozRank, inbound links, keyword rankings, Alexa/QuantCast/etc., pages indexed, or something else entirely? What numbers speak volumes to you when you want to get an idea of how you benchmark against your competitors? And how do you communicate those results?
Competitive Research | jcolman2