Subdirectories vs subdomains
-
Hi SEO gurus
Does anyone have input on what's better?
blog.domain.com vs domain.com/blog
store.domain.com vs domain.com/store
etc
I think the subdirectory (/xyz) will concentrate authority on the same domain, so it should be better. However, it is sometimes tidier on the server to maintain online stores or blogs in a separate structure, so subdomains work better in that sense. I just want to make sure that choice doesn't hurt SEO.
Cheers!
-
Thanks for the responses, Yumi, @oznappies & Aaron
-
Subdirectories/folders used to be the definitive way to go for the most SEO value; however, things have evened out for the most part. I think subdirectories are also more user-friendly... we are used to the slash, and extra periods can get tricky.
-
I've asked this one on here before, and I have some test sites trialling both ways: the sub-directory gains rank much faster from the root than the sub-domain does.
-
In my opinion? Definitely the subdirectory. And in fact it can still be tidy in terms of server maintainability.
For example, please consider the following setup:
you have your MYDOMAIN.com located at /var/www/html-mydomain,
and by using SUBDOMAIN.MYDOMAIN.COM, you will generally have its files at /var/www/html-subdomain-mydomain.
It looks tidier, as you said, BUT you can achieve exactly the same thing using MOD_REWRITE.
So perhaps right now your MYDOMAIN.com/SUB is located at /var/www/html-mydomain/SUB, right? You can actually configure the server so that whenever a user goes to MYDOMAIN.com/SUB, it loads the page from /var/www/html-subdomain-mydomain instead (so server maintainability is exactly the same whether you use the subdirectory or the subdomain).
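For instance, a minimal sketch of that idea (assuming Apache 2.4 with mod_alias enabled, and reusing the hypothetical paths above) could go in the main site's virtual-host config:
# Serve MYDOMAIN.com/SUB from the directory where the subdomain's files live
Alias /SUB /var/www/html-subdomain-mydomain
<Directory /var/www/html-subdomain-mydomain>
    Require all granted
</Directory>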
As a more technical example, put this inside the .htaccess of your root domain's document root; it maps requests for the subdomain onto a subdirectory of the main site:
# Enable rewriting and match the subdomain host, with or without www
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?sub-domain-name\.domain-name\.com$ [NC]
# Skip requests already pointing at the subdirectory, then serve from it
RewriteCond %{REQUEST_URI} !^/sub-domain-directory/.*
RewriteRule ^(.*)$ /sub-domain-directory/$1 [L]
This will cause the following behaviour:
when people go to http://sub-domain-name.domain-name.com/a.html or to http://domain-name.com/sub-domain-directory/a.html,
both requests will be served by your /sub-domain-directory/a.html.
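One caveat: with the rule above, the same page is reachable at two URLs, which search engines may treat as duplicate content. A minimal alternative sketch (same hypothetical names, assuming you want everything consolidated on the subdirectory URL) is to 301-redirect the subdomain instead:
RewriteEngine On
# Permanently redirect the subdomain to the matching subdirectory URL
RewriteCond %{HTTP_HOST} ^(www\.)?sub-domain-name\.domain-name\.com$ [NC]
RewriteRule ^(.*)$ http://domain-name.com/sub-domain-directory/$1 [R=301,L]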
Hope this helps!
Related Questions
-
WWW vs Non WWW for EXISTING site.
This one has sort of been asked already, but I cannot find an answer. When we evaluate a new SEO client, we would previously use Majestic to compare the root domain vs. the sub domain (www) to see which had the higher Trust Flow and Citation Flow, and if there was a major difference, adjust the Google-indexed domain to the higher-performing one. Is there a way to do this with Moz? Domain Authority and subdomain authority always return the same DA for me. Thanks in advance.
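For what it's worth, once the stronger hostname is identified, the usual way to point Google at it is a site-wide 301. A minimal sketch (assuming Apache, a hypothetical domain.com, and that the www version wins; swap the hostnames if not):
RewriteEngine On
# Send every non-www request to the www hostname with a permanent redirect
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]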
Technical SEO | | practiceedge1
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last time I looked, last week, it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th. It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below: "When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
There is also this warning: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:
http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the 0 indexed pages reported in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted or removed; how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I also see the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so: fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
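A common first step for a mess like this, sketched below on the assumption that the site runs Apache, is to 301 every HTTP URL (the old sitemap URLs included) to its HTTPS equivalent and then resubmit the HTTPS sitemap index so Google settles on a single scheme:
RewriteEngine On
# Redirect any request that arrived over plain HTTP to the HTTPS hostname
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]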
Technical SEO | | Dan-Lawrence
-
Does a subdomain benefit from being on a high authority domain?
I think the title sums up the question, but does a new subdomain get any ranking benefit from being on a pre-existing high-authority domain? Or does the new subdomain have to fend for itself in the SERPs?
Technical SEO | | RG_SEO
-
Redirecting .edu subdomains to our site or taking the link, what's more valuable?
We have a relationship, built through a service we offer to universities, whereby we could be issued a .edu subdomain that redirects to our landing page relevant to that school. The other option is having a link from their website to that same page. My first question is: which would be more valuable? Can you pass domain authority by redirecting a subdomain to a subdirectory of my root domain? Or would simply passing the link equity from a page in their root domain to our page pass enough value? My second question is: if creating a subdomain with a redirect is much more valuable, what is the best process for this? Would we simply have their webmaster create the subdomain for us and have them put a 301 redirect on it? Is this getting into greyer-hat territory? Thanks guys!
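If it came to it, the webmaster's side of that setup would be small. A minimal sketch (assuming their subdomain runs Apache; the landing-page URL below is hypothetical):
RewriteEngine On
# Permanently redirect the whole .edu subdomain to the school's landing page
RewriteRule ^ https://ourdomain.com/universities/school-name/ [R=301,L]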
Technical SEO | | Dom441
-
Which is better: title length vs. keywords?
We run a job board. The title tag on a page for a job is often over 70 characters. An example would be: "Supplier Quality Inspector (Electrical Manufacturing) Job in Orlando, FL 32809 at Pro Image Solutions | Orlando Jobs!" The company name, 'Orlando Jobs!', comes at the end but is also made of really good keywords, e.g. 'Orlando' and 'Jobs'. I am interested in suggestions on how to make these titles better. For example, take off the company name when we go over 70 characters? Move the company name to the front of the title because it contains good keywords? I am just looking for the best way to handle the issue. Thanks.
Technical SEO | | JobBiz
-
Effects of multiple subdomains in a homebrew CDN for images
We're creating our own CDN such that instead of serving images from http://mydomain.com/images/shoe.jpg, each image will appear at all of the following subdomains:
http://cdn1.mydomain.com/images/shoe.jpg
http://cdn2.mydomain.com/images/shoe.jpg
http://cdn3.mydomain.com/images/shoe.jpg
http://cdn4.mydomain.com/images/shoe.jpg
Image tags on our pages will randomly choose any subdomain for the src. The thought was that this will make page loading faster by parallelizing requests across many cookie-less domains. How does this affect:
- ranking of images on Google image search
- ranking of the pages they appear on
- domain authority (images are linked to heavily in our social media efforts, so we will 301-redirect image URLs to cdn1.mydomain.com; see the sketch below)
Should we disallow all but one CDN domain in robots.txt? Will a robots.txt on an image-only subdomain even be retrieved? Should we just use one CDN subdomain instead?
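A minimal sketch of that consolidation (assuming Apache serves all four hostnames from one document root; hostnames as in the question):
RewriteEngine On
# 301 requests for cdn2/cdn3/cdn4 to the single canonical image host, cdn1
RewriteCond %{HTTP_HOST} ^cdn[234]\.mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://cdn1.mydomain.com/$1 [R=301,L]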
Technical SEO | | cat5com
-
No-crawl code for pages of helpful links vs. nofollow code on each link?
Our college website has many "owners" who want pages of "helpful links", resulting in a large number of outbound links. If we add code to those pages to prevent them from being crawled, will that be just as effective as making every individual link nofollow?
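One page-level option sits between those two. A minimal sketch (assuming Apache with mod_headers, and a hypothetical file name): a single nofollow header covers every link on the page without editing each anchor, whereas a page blocked from crawling entirely never passes equity at all:
<Files "helpful-links.html">
    # Apply nofollow to every link on this page in one place
    Header set X-Robots-Tag "nofollow"
</Files>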
Technical SEO | | LAJN
-
Blog archives vs individual articles
In a client's blog you can find each individual article page as well as aggregate pages of articles per month, or sometimes per day, each of which includes the entire article. The problem is that every article appears twice: once on a dedicated page (the article page) and once alongside other articles (in the archive). Is there a specific SEO approach to this type of situation? Is it duplicate content? What page name should I give each archive (if any), as there are quite a few? Thank you
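One common approach, sketched here assuming the blog runs on Apache 2.4 with mod_headers and that archives live under a hypothetical /archive/ path, is to keep the archives crawlable but out of the index:
# Mark archive pages noindex so only the dedicated article pages rank
<If "%{REQUEST_URI} =~ m#^/archive/#">
    Header set X-Robots-Tag "noindex, follow"
</If>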
Technical SEO | | DavidSpivac