SEO: Subdomain vs folders
-
Hello,
here's our situation: We have an ecommerce website, say www.example.com. For support, we use Zendesk, which offers a great platform that makes publishing FAQs and various resource articles very easy.
We're torn between publishing these articles on our actual website and publishing them via Zendesk.
If we publish them on our website, the URL would be something like:
www.example.com/articles/title_article.html
On the other hand, if we publish them via Zendesk, the URL would look like:
support.example.com/articles/title_of_article
We would like to publish them via Zendesk; however, we do not want to miss out on any SEO benefit, however marginal it may be. Done this way, the domain would have all of the ecommerce pages (product and category pages), and the subdomain would have ALL other types of pages (help articles and also policies, such as return policies, shipping info, etc.).
I know that a long time ago folders were preferred over subdomains for SEO, but things change all the time. Do you think setting up all the ecommerce pages on the domain, and all content pages on the subdomain, would be a lesser solution?
Thanks.
-
Thanks guys.
This is the answer I was hoping for.
-
Some time ago, Google officially equalised subdomains and subfolders... the differences are marginal, but they exist, and I will tell you about a few which you can think about... maybe this will help you with your decision.
First of all, Google regards a subdomain as a separate domain. Consider this scenario: with one domain you can get up to 2 listings within the SERPs for one keyword. Using a subdomain multiplies this possibility (I've seen examples where a single subdomain had 5 listings plus 2 from the main domain)... so this is a factor which can have an effect on traffic.
Another aspect is penalties. If you get a penalty on a subdomain, there is a very good chance that the main domain is not affected. Be aware, though, that this does not work in reverse: a penalty on the main domain can hit its subdomains too!
Furthermore, if you use subdomains it's much easier to control .htaccess behaviour (e.g. mod_rewrite), since each subdomain can carry its own rules; see the sketch below.
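As an illustration only, here is a minimal sketch of that per-subdomain control, assuming Apache with mod_rewrite enabled and a subdomain that has its own document root (the subdomain name and the article.php target are hypothetical):

# .htaccess in the document root of support.example.com (hypothetical).
# These rules apply only to the subdomain, so they cannot collide with
# the rewrite rules living in www.example.com's own .htaccess.
RewriteEngine On
# Map clean article URLs onto a backend script, e.g.
# /articles/my-title -> article.php?slug=my-title
RewriteRule ^articles/([a-z0-9_-]+)/?$ article.php?slug=$1 [L,QSA]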
Using subfolders is often much easier for webmasters to handle, but that can't be taken as an across-the-board rule.
The main argument you will hear for using subfolders is that they inherit trust from the main website. There is no evidence that this is an advantage of subfolders: subfolders and subdomains alike need links from other sites to establish their rankings!
So if the content you would like to separate has little in common with the content of the main domain, you should think about using subdomains.
-
The graph available here, http://googlewebmastercentral.blogspot.com/2011/08/reorganizing-internal-vs-external.html, shows Google's reorganization of external vs. internal links!
According to that, Google now considers a subdomain part of the website instead of treating it as a separate domain.
Keeping that in mind, I believe a subdomain will be fine too!
-
Related Questions
-
SEO Benefit to SSL Certificate
Our site does not have an SSL certificate. I have read that in the process of adding one, URLs need to be redirected and that some link equity can be lost (a sketch of the redirect step follows this question). Implementing an SSL certificate sounds somewhat complicated and far from risk-free. Is there a tangible SEO benefit to upgrading to SSL? Will doing so help SEO in a tangible manner that justifies the cost, time and aggravation? Thanks, Alan
Intermediate & Advanced SEO | | Kingalan12 -
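For context, the redirect step mentioned in this question is usually a single site-wide 301; here is a minimal sketch, assuming Apache with mod_rewrite (test on a staging copy before relying on it):

# Force HTTPS with one 301 per request, so each HTTP URL maps to its
# exact HTTPS twin; this one-to-one mapping is what preserves most of
# the link equity during the move.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]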
SEO site architecture - how deep?
Hello Moz community! We are building out a site for a web hosting/web design company. I am wondering if we should just have home/categories/pages, or if we should have home/categories/sub-categories/pages. I am not sure if, by adding the additional level, we can create a bunch of mini-hubs within the categories. For example: Home/Web Hosting/Business Web Hosting/Small Business Web Hosting. I don't know if these mini-hubs within the category are a good idea or if I should keep the architecture as flat as possible. Any thoughts on this?
Intermediate & Advanced SEO | | YouAndWhatArmy0 -
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys, we have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: the page where the user can use various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day. We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query. We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement.
- Conserves crawl budget for large sites.
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would leave 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages:
- Does prevent vehicle details pages from being indexed.
- Allows ALL pages to be crawled (advantage?).
Noindex disadvantages:
- Difficult to implement: the vehicle details pages are served via Ajax, so they have no <head> tag of their own. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex directive based on querystring variables, similar to this Stack Overflow solution (a sketch of that header approach follows after this question). This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required; the crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it is blocked by robots.txt.
Hash (#) URL advantages:
- By using hash URLs for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: the crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as nofollowing these links, but without looking like PageRank sculpting (?).
- Does not require complex Apache configuration.
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt, the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that. If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we would have to remove the robots.txt disallow in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed; it could easily get stuck/lost, it seems like a waste of resources, and in some shadowy way it feels bad for SEO. My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality self-contained in the plugin (unlike noindex), and conserves crawl budget while keeping the vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these (). Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | | browndoginteractive0 -
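For reference, the X-Robots-Tag approach this question alludes to is usually a small piece of Apache configuration. The sketch below is only a guess at the general shape, assuming mod_rewrite and mod_headers are available; the vehicle_id querystring parameter is a hypothetical stand-in for the plugin's real one:

# Flag any request whose querystring marks an Ajax vehicle-details view
# ("vehicle_id" is a placeholder for the real parameter name).
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)vehicle_id= [NC]
RewriteRule ^ - [E=NOINDEX_DETAILS:1]
# Send a noindex header only on flagged requests. Googlebot can still
# crawl these URLs (no robots.txt block), but they drop out of the index.
<IfModule mod_headers.c>
Header set X-Robots-Tag "noindex, nofollow" env=NOINDEX_DETAILS
</IfModule>

Note this only works if the robots.txt disallow is removed, since a crawler that is blocked never sees the response header.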
SEO black hat tricks
I have a competitor in the local area. He registered a new domain name, www.orangecountypatentlawfirm.com. It was created back in 11/10 and updated a few months ago, on 11/13. My domain is ocpatentlawyer.com. I put my domain and his domain into Open Site Explorer. The peculiar thing is that my competitor's website mirrors my domain identically (see attached image). My competitor's website rose through the SERPs very fast; I never saw it coming. Anyway, I wanted to know if he was using some type of black hat SEO trick to hijack my domain authority and get his own website to rank higher? And if so, does it hurt my ranking? compare.png
Intermediate & Advanced SEO | | jamesjd710 -
SEO friendly blog
I've read somewhere that if you list too many links/articles on one page, Google doesn't crawl all of them. In fact, Google will only crawl up to 100 links/articles or so. Is that true? If so, how do I go about creating a page or blog that will be SEO friendly and capable of being completely crawled by Google?
Intermediate & Advanced SEO | | greenfoxone0 -
Subdomains and SEO - Should we redirect to subfolder?
A new client has mainsite.com and a large number of city-specific subdomains, i.e. albany.mainsite.com. I think that these subdomains would actually work better as subfolders, i.e. mainsite.com/albany rather than albany.mainsite.com. The majority of links on the subdomains link to the main site anyway, i.e. mainsite.com/contactus rather than albany.mainsite.com/contactus. Having mostly main-domain links on a subdomain doesn't seem like clever link architecture to me, and maybe even spammy. I'm not overly familiar with redirecting subdomains to subfolders. If we go the route of 301'ing subdomains to subfolders, any advice/warnings? (A sketch of the rewrite rule follows this question.)
Intermediate & Advanced SEO | | AndyMacLean0 -
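Mechanically, the 301 itself is usually one small mod_rewrite rule. Here is a hedged sketch for Apache, using the hypothetical hostnames from the question and assuming the matching subfolder pages already exist:

# Redirect albany.mainsite.com/anything to mainsite.com/albany/anything.
# The generic pattern below captures any non-www subdomain in one rule.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^([^.]+)\.mainsite\.com$ [NC]
RewriteCond %1 !^www$ [NC]
RewriteRule ^(.*)$ http://mainsite.com/%1/$1 [R=301,L]

Beyond the mechanics: map each subdomain URL to its real equivalent subfolder URL (avoid blanket redirects to the homepage), and keep the old DNS entries live so the 301s keep resolving.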
Pros and Cons of new subdomain and redirecting old subdomain?
Hi guys, I am thinking about redirecting the ranking subdomain to a new subdomain which has my main keyword within it. I am trying to outrank an exact-match competitor and don't seem to be able to do so. For example: page.example.com would be my existing ranking URL, which is very powerful, and I am thinking about redirecting this to a new URL, keywordpage.example.com. What are your thoughts on doing this? Thanks, B
Intermediate & Advanced SEO | | HaymarketMediaGroupLtd0 -
Slooooow motion SEO impact
I could do with some help if anyone's got a minute. We've got this one client where, no matter what we did (and we worked very hard on this site), nothing would really move. You'd get the usual fluctuations, and maybe some very small progress at times. This went on for an age... much, much longer than usual (and it wasn't even that competitive for keywords). Then suddenly, "Bam!", it shot up like a rocket for all its main keywords and has stayed there since, more or less (and this was over a year ago). It was as if all the work we'd been doing had built up behind a door, and then the door flew open so it could take effect. Anyway... it seems to be happening again, just with a different client and a different website (at least I hope that's what's happening, or it might just stay unaffected by anything we do forever). We've checked everything. There are no crawling problems; again, it's not all that competitive; the site already has some pretty good trust and authority; and it already ranks well for a bunch of stuff. The site and pages have plenty of age behind them too. Any ideas?
Intermediate & Advanced SEO | | SteveOllington0