Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Root directory vs. subdirectories
-
Hello.
How much more important does Google consider pages in the root directory relative to pages in a subdirectory? Is it best to keep the most important pages of a site in the root directory?
Thanks!
-
Howdy nyc-seo,
This is a really good question with lots of implications. Although there's no single "right" answer, there are a few things you might want to consider:
- Subfolders are good for organizational purposes and as such can help structure your content. For example, SEOmoz puts all the blog content under seomoz.org/blog
- Subfolders can contain keywords that help with CTR and possibly with rankings. This can be useful in certain situations, like ecommerce, e.g. example.com/bird-feeders/hummingbirds
- That said, shorter URLs tend to perform better in search results, and you want to avoid keyword stuffing in your URLs.
- Also, too many nested subfolders can cause crawling issues. It's best to keep your site architecture as "flat" as possible, without too many additional layers of subdirectories (a rough way to audit this is sketched just below this list).
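If you want a quick way to check how "flat" an existing site actually is, here's a minimal Python sketch (not part of the original answer). It assumes only that you've exported your page URLs to a plain-text file, one per line; the filename urls.txt is a placeholder:

```python
# Rough sketch: counts how many folder levels deep each URL sits,
# so you can spot pages buried in too many layers of subdirectories.
from collections import Counter
from urllib.parse import urlparse

def folder_depth(url: str) -> int:
    """Path segment count, e.g. https://example.com/bird-feeders/hummingbirds -> 2."""
    path = urlparse(url).path.strip("/")
    return len(path.split("/")) if path else 0

# "urls.txt" is a placeholder: one full URL per line, e.g. exported from your sitemap.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

depth_counts = Counter(folder_depth(u) for u in urls)
for depth in sorted(depth_counts):
    print(f"{depth} level(s) deep: {depth_counts[depth]} page(s)")

# Pages four or more levels deep are worth a second look if you want a "flat" architecture.
deep_pages = [u for u in urls if folder_depth(u) >= 4]
print(f"\n{len(deep_pages)} page(s) sit at 4+ levels deep")
```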
Some additional resources that may help:
- http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
- http://www.seomoz.org/blog/site-architecture-for-seo
Hope this helps! Best of luck with your SEO.
-
Thank you.
-
As far as I know, Google does consider pages in the root directory more important than pages in subdirectories.
Related Questions
-
Personalized Content Vs. Cloaking
Hi Moz Community, I have a question about personalization of content: can we serve personalized content without being penalized for serving different content to robots vs. users? If content starts in the same initial state for all users, including crawlers, is it safe to assume there should be no impact on SEO, because personalization will not happen for anyone until there is some interaction? Thanks,
Technical SEO | znotes
-
Discrepancy in actual indexed pages vs search console
Hi support, I checked my Search Console. It said that 8344 pages from www.printcious.com/au/sitemap.xml are indexed by Google. However, if I search for site:www.printcious.com/au it only returned 79 results. See http://imgur.com/a/FUOY2 https://www.google.com/search?num=100&safe=off&biw=1366&bih=638&q=site%3Awww.printcious.com%2Fau&oq=site%3Awww.printcious.com%2Fau&gs_l=serp.3...109843.110225.0.110430.4.4.0.0.0.0.102.275.1j2.3.0....0...1c.1.64.serp..1.0.0.htlbSGrS8p8 Could you please advise why there is a discrepancy? Thanks.
Technical SEO | Printcious
-
Updating inbound links vs. 301 redirecting the page they link to
Hi everyone, I'm preparing myself for a website redesign and finding conflicting information about inbound links and 301 redirects. If I have a URL (we'll say website.com/website) that is linked to by outside sources, should I get those outside sources to update their links when I change the URL to website.com/webpage? Or is it just as effective from a link juice perspective to simply 301 redirect the old page to the new page? Are there any other implications to this choice that I may want to consider? Thanks!
Technical SEO | Liggins
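A side note on the 301 option in the question above: once the redesign is live, it's worth confirming that each old URL answers with a single 301 straight to its new counterpart rather than a chain of redirects. Below is a hedged Python sketch using only the standard library; the website.com URLs are just the placeholders from the question:

```python
# Sketch only: checks the FIRST hop of each redirect without following it,
# so a chain like old -> interim -> new would surface the interim URL here.
import http.client
from urllib.parse import urlparse

def first_hop(url: str):
    """Return (status code, Location header) for a single HEAD request."""
    parsed = urlparse(url)
    conn_cls = http.client.HTTPSConnection if parsed.scheme == "https" else http.client.HTTPConnection
    conn = conn_cls(parsed.netloc, timeout=10)
    path = parsed.path or "/"
    if parsed.query:
        path += "?" + parsed.query
    conn.request("HEAD", path)
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

# Placeholder mapping of old URLs to the new URLs they should 301 to.
redirects = {
    "https://website.com/website": "https://website.com/webpage",
}

for old, new in redirects.items():
    status, location = first_hop(old)
    ok = status == 301 and location == new  # note: some servers send a relative Location
    print(f"{old} -> {status} {location} {'OK' if ok else 'CHECK'}")
```
-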
Noindex vs. page removal - Panda recovery
I'm wondering whether there is a consensus within the SEO community as to whether noindexing pages vs. actually removing pages is different from Google Panda's perspective? Does noindexing pages have less value when removing poor-quality content than physically removing the page, i.e. either 301ing or 404ing it and removing the links to it from the site? I presume that removing pages has a positive impact on the amount of link juice that gets to some of the remaining pages deeper into the site, but I also presume this doesn't have any direct impact on the Panda algorithm? Thanks very much in advance for your thoughts, and corrections on my assumptions 🙂
Technical SEO | agencycentral
-
www vs. non-www: which is better?
Is it better to have all your pages point to the www version or the non-www version?
Technical SEO | bronxpad
-
Internal search : rel=canonical vs noindex vs robots.txt
Hi everyone, I have a website with a lot of internal search results pages indexed. I'm not asking if they should be indexed or not; I know they should not, according to Google's guidelines. And they create a bunch of duplicate pages, so I want to solve this problem. The thing is, if I noindex them, the site is going to lose a non-negligible chunk of traffic: nearly 13% according to Google Analytics!!! I thought of blocking them in robots.txt. This solution would not keep them out of the index, but the pages appearing in Google SERPs would then look empty (no title, no description), so their CTR would plummet and I would lose a bit of traffic too... The last idea I had was to use a rel=canonical tag pointing to the original search page (that is empty, without results), but it would probably have the same effect as noindexing them, wouldn't it? (Never tried, so I'm not sure of this.) Of course I did some research on the subject, but each of my findings recommended one of the 3 methods only! One even recommended noindex + robots.txt block, which makes no sense because the robots.txt block would stop crawlers from ever seeing the noindex... Is there somebody who can tell me which option is the best to keep this traffic? Thanks a million
Technical SEO | JohannCR
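For anyone weighing these three options, here's a hedged Python sketch (standard library only; the example URL is a placeholder) that reports which signal a given internal-search URL currently carries: a robots.txt block, a meta robots noindex, or a rel=canonical pointing elsewhere. It also reflects the point made above: a page blocked by robots.txt can still sit in the index looking "empty", because a blocked crawler never gets to see its noindex or canonical tags.

```python
# Sketch only: audits which indexation signal a page currently uses.
import urllib.request
import urllib.robotparser
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class IndexSignalParser(HTMLParser):
    """Collects <meta name="robots"> and <link rel="canonical"> from the page."""
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = {name: (value or "") for name, value in attrs}
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.noindex = "noindex" in attrs.get("content", "").lower()
        if tag == "link" and "canonical" in attrs.get("rel", "").lower():
            self.canonical = attrs.get("href")

def audit(url: str, user_agent: str = "Googlebot") -> None:
    root = "{0.scheme}://{0.netloc}".format(urlparse(url))

    # 1) robots.txt block
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(urljoin(root, "/robots.txt"))
    rp.read()
    blocked = not rp.can_fetch(user_agent, url)
    print("robots.txt block:", blocked)
    if blocked:
        print("(page not fetched: a blocked crawler never sees noindex or canonical)")
        return

    # 2) meta robots noindex and 3) rel=canonical
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = IndexSignalParser()
    parser.feed(html)
    print("meta noindex    :", parser.noindex)
    print("rel=canonical   :", parser.canonical or "none")

audit("https://example.com/search?q=bird+feeders")  # placeholder internal-search URL
```
-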
Tutorial For Moving Blogger Blog From Sub-Domain to Sub-Directory
Does anyone know where I can find a tutorial for moving a blogger.com (Blogspot) blog that's currently hosted on a subdomain (i.e. blog.mysite.com) to a subdirectory (i.e. mysite.com/blog) with the current version of Blogger? I'm working on transferring my Blogger blogs over to WordPress, and to do so without losing link juice or traffic, this is one of the steps I have to take. There are plenty of tutorials that address moving from blogspot.mysite.com to WordPress, and I've even found a few that address moving from blog.mysite.com (hosted on Blogger) to a root domain, mysite.com. However, I need to move from blog.mysite.com (Blogger) to mysite.com/blog/ - a subdirectory (WordPress). Anyone who knows how to do this or can point me in the right direction? Thanks.
Technical SEO | ChaseH
-
Microsite on subdomain vs. subdirectory
Based on this post from 2009, it's recommended in most situations to set up a microsite as a subdirectory as opposed to a subdomain: http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites. The primary argument seems to be that the search engines view the subdomain as a separate entity from the domain, and therefore the subdomain doesn't benefit from any of the trust rank, quality scores, etc. Rand made a comment that seemed like the subdomain could SOMETIMES inherit some of these factors, but didn't expound on those instances. What determines whether the search engine will view your subdomain-hosted microsite as part of the main domain vs. a completely separate site? I read it has to do with the interlinking between the two.
Technical SEO | ryanwats