Root directory vs. subdirectories
-
Hello.
How much more important does Google consider pages in the root directory relative to pages in a subdirectory? Is it best to keep the most important pages of a site in the root directory?
Thanks!
-
Howdy nyc-seo,
This is a really good question with lots of implications. Although there's no single "right" answer, there are a few things you might want to consider:
- Subfolders are good for organization and can help structure your content. For example, SEOmoz puts all of its blog content under seomoz.org/blog
- Subfolders can contain keywords that help with CTR and possibly with rankings. This can be useful in certain situations, such as ecommerce, e.g. example.com/bird-feeders/hummingbirds
- That said, shorter URLs tend to perform better in search results, and you want to avoid keyword stuffing in your URLs.
- Also, too many layers of subfolders can cause crawling issues. It's best to keep your site architecture as "flat" as possible, without unnecessary extra layers of sub-directories (the sketch below shows one quick way to spot overly deep URLs).
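To make the "flat architecture" point concrete, here is a minimal sketch in Python (standard library only, with made-up example URLs) that counts how many folder levels deep each URL sits, so pages buried too many levels down are easy to spot:

```python
from urllib.parse import urlparse

# Hypothetical URLs -- replace with URLs from your own sitemap or crawl export.
urls = [
    "https://example.com/bird-feeders/hummingbirds/",
    "https://example.com/blog/site-architecture-for-seo/",
    "https://example.com/shop/outdoor/garden/feeders/glass/red/",
]

def folder_depth(url: str) -> int:
    """Count how many path segments sit between the root and the page."""
    path = urlparse(url).path
    segments = [s for s in path.split("/") if s]
    return len(segments)

# Flag anything buried more than three levels deep as a candidate for flattening
# (a rough rule of thumb, not a hard limit).
for url in urls:
    depth = folder_depth(url)
    marker = "  <-- consider flattening" if depth > 3 else ""
    print(f"{depth}  {url}{marker}")
```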
Some additional resources that may help:
- http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
- http://www.seomoz.org/blog/site-architecture-for-seo
Hope this helps! Best of luck with your SEO.
-
Thank you.
-
Google does consider pages in the root directory more important than pages in subdirectories, at least as far as I can tell!
Related Questions
-
Categories vs. Tags: Duplicate Content
Hello Moz community, I have a question about categories and tags. Our customer www.elshow.pe just had a redesign of its website. We use the same categories as before; the only change was that two sub-categories were added (these sub-categories were popular tags before). So now I have two URLs covering the same content. The first is the URL of the sub-category: www.elshow.pe/realitys/combate/. The second is the URL generated by the tag "combate": www.elshow.pe/noticias/combate/. I have the same situation with the second sub-category, "Esto es guerra": www.elshow.pe/realitys/esto-es-guerra/ and www.elshow.pe/noticias/esto-es-guerra/. The problem is that when I search for the keyword "combate" in my country (Perú), the URL that ranks, on the first page, is the tag URL. But when I search for "esto es guerra", the URL that ranks, on the second page, is the sub-category. I also checked both in OSE, and the sub-categories do better than the tags. So what do you recommend: a 301 redirect? Canonicals? Any comment is welcome. Thanks a lot for your time. Italo
Technical SEO | neoconsulting
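A quick way to see which of the two competing URLs is currently being canonicalised, before deciding between a 301 and a canonical, is to fetch both and read their rel=canonical tags. A minimal sketch, assuming the third-party `requests` and `beautifulsoup4` packages are installed; the URLs are the ones from the question above:

```python
import requests
from bs4 import BeautifulSoup

# The two URL pairs from the question that cover the same content.
pairs = [
    ("http://www.elshow.pe/realitys/combate/", "http://www.elshow.pe/noticias/combate/"),
    ("http://www.elshow.pe/realitys/esto-es-guerra/", "http://www.elshow.pe/noticias/esto-es-guerra/"),
]

def canonical_of(url: str) -> str:
    """Return the href of the page's rel=canonical tag, or '' if none is declared."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link and link.has_attr("href") else ""

for category_url, tag_url in pairs:
    print(category_url, "->", canonical_of(category_url) or "(no canonical)")
    print(tag_url, "->", canonical_of(tag_url) or "(no canonical)")
    print()
```

If both URLs need to stay live for visitors, pointing the tag URL's canonical at the stronger sub-category URL (or 301-redirecting it) consolidates the signals on one page; which page to keep is the judgement call the asker is weighing.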
How are server-side redirects perceived compared to direct links (on a directory site)?
Hi, I'm creating some listings for a client on a relevant B2B directory (a good-quality one). I asked whether the links are 'followed' or 'nofollowed', and they said they are 'server-side redirects', so not direct links. Does anyone know how these are likely to be perceived by Google? All best, Dan
Technical SEO | Dan-Lawrence
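Before worrying about how Google perceives the directory's "server-side redirects", it helps to look at what the redirect actually returns. A minimal sketch, assuming the `requests` package and a hypothetical outbound-link URL (substitute a real link from one of the listings):

```python
import requests

# Hypothetical directory outbound link -- substitute a real link from the listing.
listing_link = "https://example.com/out?listing=12345"

# Don't follow redirects automatically, so the first hop's status code is visible.
response = requests.get(listing_link, allow_redirects=False, timeout=10)
print(response.status_code, response.headers.get("Location", "(no redirect)"))

# A 301/302 with a Location header is an ordinary HTTP redirect that Google can follow;
# a 200 page that swaps in the destination via JavaScript or a meta refresh is a
# different situation and passes signals less reliably.
```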
Https vs http two different domains?
If I visit mywebsite.com.au, www.mywebsite.com.au, or http://www.mywebsite.com.au, I get one website. BUT if I visit https://www.mywebsite.com.au I get a different website, and I also get an untrusted-website warning. The logo in the bottom right of the HTTPS website is the name of the web designer where the website is hosted. Is this a normal practice?
Technical SEO | GardenBeet
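This usually happens when the server has no SSL certificate or HTTPS virtual host configured for the domain, so requests over HTTPS fall through to the host's default site, which also explains the untrusted-certificate warning. A small sketch, assuming the `requests` package and using the placeholder domain from the question, that shows what each variant actually serves:

```python
import requests

# The four common variants of one domain (placeholder domain -- use your own).
variants = [
    "http://mywebsite.com.au",
    "http://www.mywebsite.com.au",
    "https://mywebsite.com.au",
    "https://www.mywebsite.com.au",
]

for url in variants:
    try:
        # Certificate verification is on by default and will raise on an
        # untrusted certificate -- the same warning the browser shows.
        r = requests.get(url, timeout=10, allow_redirects=True)
        print(url, "->", r.url, r.status_code)
    except requests.exceptions.SSLError as exc:
        print(url, "-> SSL/certificate problem:", exc)
    except requests.exceptions.RequestException as exc:
        print(url, "-> request failed:", exc)
```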
Easy question: noindex meta tag vs. robots.txt
This seems like a dumb question, but I'm not sure what the answer is. I have an ecommerce client who has a couple of subdirectories, "gallery" and "blog". Neither directory gets a lot of traffic or produces many conversions, so I want to remove the pages so they don't drain PageRank from more important pages. Does this sound like a good idea? I was thinking of either disallowing the folders via the robots.txt file, adding a "noindex" tag, 301-redirecting them, or deleting them. Can you help me determine which is best?

**DEINDEX:** As I understand it, the noindex meta tag still allows the robots to crawl the pages, but they won't be indexed. The supposed good news is that it still allows link juice to be passed through. This seems like a bad thing to me because I don't want to waste my link juice passing to these pages; the idea is to keep my PageRank from being diluted on them. A related question: if PageRank is finite, does Google still treat these pages as part of the site even if it isn't indexing them? If I do deindex these pages, I think there are quite a few internal links pointing to them. Even though the pages are deindexed, they still exist, so it's not as if the site would return a 404, right?

**ROBOTS.TXT:** As I understand it, this will keep the robots from crawling the pages, so they won't be indexed and the link juice won't pass. I don't want to waste the PageRank of the pages that link to them, so is this a bad option?

**301 REDIRECT:** What if I just 301-redirect all these pages back to the homepage? Is this an easy answer? Part of the problem with this solution is that I'm not sure if it's permanent, but more importantly, 80% of the site is currently made up of blog and gallery pages, and I think it would be strange to have the vast majority of the site 301-redirecting to the home page. What do you think?

**DELETE PAGES:** Maybe I could just delete all the pages. This will keep the pages from taking link juice and will deindex them, but again, I think there are quite a few internal links to these pages. How would you find all the internal links that point to them? There are hundreds of them.
Technical SEO | Santaur
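On the last point, finding all the internal links that point into /blog/ and /gallery/, a small crawl of your own site will list them. A minimal sketch, assuming the third-party `requests` and `beautifulsoup4` packages and a hypothetical store URL; it stays on the site, never follows links into the target sections, and simply records every page that links to them:

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.example-store.com/"   # hypothetical shop URL -- use your own
TARGET_PREFIXES = ("/blog/", "/gallery/")  # the sections being removed or de-indexed
MAX_PAGES = 200                            # keep the crawl small

seen, queue = set(), deque([START])
inlinks = []  # (page that links, target it links to)

while queue and len(seen) < MAX_PAGES:
    page = queue.popleft()
    if page in seen:
        continue
    seen.add(page)
    try:
        html = requests.get(page, timeout=10).text
    except requests.exceptions.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(page, a["href"])
        parsed = urlparse(link)
        if parsed.netloc != urlparse(START).netloc:
            continue  # ignore external links
        if parsed.path.startswith(TARGET_PREFIXES):
            inlinks.append((page, link))   # record the internal link, don't crawl it
        elif link not in seen:
            queue.append(link)

for source, target in inlinks:
    print(source, "links to", target)
```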
Multilingual Website - Sub-domain VS Sub-directory
Hi Folks - I need your advice on the pros and cons of going with a sub-domain vs. a sub-directory approach for a multilingual website. The best option would be a ccTLD, but that is not possible right now, so I'm more interested in your take on these two options. I have gone through http://www.stateofsearch.com/international-multilingual-sites-criteria-to-establish-seo-friendly-structure/ and it somewhat vouches for a sub-directory, but what would you say?
Technical SEO | RanjeetP
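Whichever structure you choose, hreflang annotations are what tell Google which language version to serve, and they work the same way for sub-domains and sub-directories. A minimal sketch (made-up URLs, sub-directory layout) that prints the link elements each language version should carry in its head:

```python
# Hypothetical language versions of one page, using the sub-directory approach.
versions = {
    "en": "https://example.com/en/pricing/",
    "de": "https://example.com/de/preise/",
    "fr": "https://example.com/fr/tarifs/",
}

# Every version should list every alternate (including itself), so each page
# carries the same full set of hreflang link elements.
for lang, url in versions.items():
    print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
print(f'<link rel="alternate" hreflang="x-default" href="{versions["en"]}" />')
```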
How to add business address in local directories for consistent NAP
Hi Mozers, I keep puzzling over this one! I work from home and really don't want to plaster my address all over the web. The GP page now allows me to hide my exact location, which is great. However, as far as I can see this is not the case with all the potential local directories and listings. I have been trying to get around this by leaving out my house number and the last digit and two characters of my postcode, and so far the local listings I have signed up with have allowed it. However, when I tried doing as recommended by the excellent Miriam and checking my business name with Getlisted, I found that I could only see these local listings if I entered the doctored address, i.e. no house number or full postcode.

My question, finally, is: if I continue in this fashion for businesses based at home addresses, am I going to confuse the search engines? I want to provide a consistent NAP, but GPP insists that I add a full postcode. The only way I can see around this is to list the street name, city, and full postcode, and omit the house name/number. Would this be a reasonable workaround to maintain client confidentiality and satisfy the NAP requirements of local search?
Technical SEO | catherine-279388
Rel=Canonical, WWW vs non WWW and SEO
Okay, so I'm a bit at a loss here. For whatever reason, just about every single WordPress site I have turns www.mysite.com into mysite.com in the browser bar. I assume this is the rel=canonical tag at work; there are no 301s on my site. When I use Open Site Explorer and type in www.mysite.com, it shows a domain authority of around 40 and a few hundred backlinks... and then I get the message: "Oh Hey! It looks like that URL redirects to XXXXXX. Would you like to see data for that URL instead?" If I click to see that data instead, I have less than half of that domain authority and about two backlinks. Does this make a difference SEO-wise? Should my non-www version be redirecting to my www version instead, because that's where the domain authority and backlinks are? Why am I getting two different domain authority and backlink counts if they are essentially the same? Or am I wrong, and all that link juice and authority passes just the same?
Technical SEO | twilightofidols
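Rather than guessing whether it's the canonical tag or a redirect, you can request both hostnames without following redirects and then read the canonical tag on the page that loads. A minimal sketch, assuming the `requests` and `beautifulsoup4` packages and the placeholder domain from the question:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder domain from the question -- substitute your own.
variants = ["http://mysite.com", "http://www.mysite.com"]

for url in variants:
    r = requests.get(url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "(no redirect)")
    print(f"{url}: {r.status_code} -> {location}")

# The version everything ultimately resolves to should also be the one declared
# in the rel=canonical tag of the homepage.
final = requests.get(variants[0], timeout=10)
soup = BeautifulSoup(final.text, "html.parser")
canonical = soup.find("link", rel="canonical")
print("Canonical on the resolved homepage:", canonical["href"] if canonical else "(none)")
```

If the non-preferred hostname only canonicalises to the other (with no 301), its pages still resolve and accumulate their own metrics; a 301 from the non-preferred host to the preferred one is the usual way to consolidate those signals.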
404 page for webshop vs 302 redirect
Hi everybody, I'm the owner of a webshop, and we have implemented that products that are not in stock are disabled (removed) from the shop. My problem is that I have a lot of 404 pages that currently get redirected to the front page when the item is not in stock, because a standard 404 page would hurt the conversion rate: customers don't know what a 404 is and would click back and choose a competitor. It's really hard to work out what the best solution is and what might get you downranked by Google. This has been running like this for two years, and I can't see any negative effects of the solution regarding SEO and so on. What are your thoughts? Christian Hansen, Denmark
Technical SEO | noerdar
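Redirecting every out-of-stock product to the front page tends to be treated by Google as a soft 404, so a common alternative is to serve a genuinely helpful "no longer available" page, with links to similar products, under a 404 or 410 status. A minimal sketch of that idea, assuming a Flask app and a made-up product catalogue:

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Hypothetical catalogue -- in a real shop this would come from the database.
PRODUCTS = {"blue-widget": {"in_stock": True}, "red-widget": {"in_stock": False}}

OUT_OF_STOCK_PAGE = """
<h1>Sorry, this product is no longer available</h1>
<p>Have a look at these similar items instead: ...</p>
"""

@app.route("/product/<slug>")
def product(slug):
    item = PRODUCTS.get(slug)
    if item is None or not item["in_stock"]:
        # Serve the helpful page with a 410 (gone) status instead of redirecting:
        # shoppers still see alternatives, and Google drops the URL cleanly.
        return render_template_string(OUT_OF_STOCK_PAGE), 410
    return f"Product page for {slug}"

if __name__ == "__main__":
    app.run(debug=True)
```

The conversion-rate concern is addressed by the content of the page rather than by the status code, so visitors never see a bare server error even though crawlers get an accurate signal.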