Should I use subdomains?
-
I'm thinking of starting a little project website, but I wonder whether I should use subdomains or simply categorize the site.
For example (I haven't chosen my domain yet), if I had www.flowers.com and wanted to produce pages for each type of flower, should I use
rose.flowers.com
or
flowers.com/rose
For SEO purposes, or usability, does it matter?
Thanks in advance.
-
No problem Gordon. This is just me providing an example which can serve as a model for any project, however big or small.
-
Thanks Gary. That's quite an extensive sitemap!
I'm not sure how thoroughly I had thought a sitemap through, but I certainly had a vague structure in mind. You know how these "little projects" go: they grow and grow into monsters!
-
Hi Gordon
Simon is certainly right.
Something else to consider...
If you're considering building a site, I would certainly first get to work on designing a site map to help you clarify your site's purpose and goals. This is essential for both SEO and usability.
We did some content marketing recently for Red Funnel IoW Ferries and Red Funnel Holidays. You'll see we've split all of their content into a well-organised site map. http://www.redfunnel.co.uk/information/sitemap/
You may already know this, Gordon, but I wanted to point it out because so many new businesses build a site that does not cover their customers' needs. A site map will give you a clear picture from which to expand your content for your target audience, as well as the SEO benefits.
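Once a site map is planned, the same page list can also be exposed to search engines as an XML sitemap. A minimal sketch of generating one (the URLs are hypothetical, borrowed from the flower example in this thread):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap.xml document listing the given page URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Hypothetical pages following the flower example in this thread.
pages = [
    "https://www.flowers.com/",
    "https://www.flowers.com/roses/",
    "https://www.flowers.com/roses/red.html",
]
print(build_sitemap(pages))
```

The output would be saved as sitemap.xml at the site root (and referenced from robots.txt) so crawlers can discover the planned structure directly.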
-
I have been watching how Google treats subdomains for a long time.
My conclusion is that Google cannot make up its mind how to treat them. Sometimes a subdomain's authority is consolidated with the main site; sometimes it is treated as an isolated site. When subdomains are isolated, all of the assets you have placed on them contribute little to the success of your website.
So, if you want your web assets to consistently receive favorable treatment and pull in concert for your domain, they should all be placed in folders on the primary domain.
It is easy to be fooled by observing "how subdomains are being treated today", because Google will likely change its mind tomorrow.
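If you do consolidate existing subdomains into folders, the move is essentially a URL mapping plus a 301 redirect for each old address. A rough sketch of the mapping (the domain names are hypothetical, following the flower example above):

```python
from urllib.parse import urlsplit

def folder_equivalent(url, primary_host="www.flowers.com"):
    """Map a subdomain URL (e.g. roses.flowers.com/red.html) to its
    folder-based equivalent on the primary domain
    (e.g. www.flowers.com/roses/red.html)."""
    parts = urlsplit(url)
    subdomain = parts.netloc.split(".")[0]  # e.g. "roses"
    return f"https://{primary_host}/{subdomain}{parts.path}"

old = "https://roses.flowers.com/red.html"
print(folder_equivalent(old))
```

Each old subdomain URL would then be served a 301 redirect to its folder equivalent, so that any accumulated link value follows the move.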
-
Thanks Simon. That was generally my opinion, mostly based on the fact that the majority of websites do it the way you say, but I thought I'd ask the question before I embark on the project.
-
Hi Gordon
In my experience, sub-domains are not the best way to divide your content into topics as varied as you are suggesting. The better method is internal directories.
For example: www.flowers.com/roses/red.html rather than roses.flowers.com/red-roses.html
I tried using sub-domains for my own business when I first started out. Duplicate content issues, together with the extra updating required whenever major changes were made, turned them into a hassle that was never rewarded with any ranking benefit.
I recently spent a month eliminating my sub-domains from search and found much more time available to focus on my main site. My rankings in local search flew to the top within a couple of weeks and organic rankings also benefited.
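The internal-directory convention above can be sketched as a simple path-building rule (the site and category names are hypothetical):

```python
def category_page_url(category, page, domain="www.flowers.com"):
    """Build an internal-directory URL, e.g. www.flowers.com/roses/red.html,
    rather than a subdomain URL like roses.flowers.com/red-roses.html."""
    return f"https://{domain}/{category}/{page}.html"

print(category_page_url("roses", "red"))
```

Keeping every page under one host this way means all content pulls together for the primary domain, as described in the answers above.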
Related Questions
-
When Is It Okay To Use Bold, Underline & Italic Text? Should I Stay Away From My Keywords?
Hey guys I have a few questions. I am pretty sure that I was penalized by Panda a few years back because I went very heavy on bold, italic and underlining my keywords. Since then I removed the bold, italic and underlines and never have used them again. I was just reading an article on the Moz Blog and I saw some bold words. My questions are, When Is It Okay To Use Bold, Underline & Italic Text? Should I Stay Away From My Keywords? Any help would be great! Thank you.
Algorithm Updates | Videogamefan1
-
What's the best way to go about building/using interactive snippets?
I'm starting to see interactive snippets (I guess they're called islands) like the attached image in our SERPs, so I figured I would look into experimenting with them, but I'm not entirely clear how to proceed. I have only seen them in AdWords, so is that the only way you can use them? Is there some way to set them up, or some service you need, to get them organically? Lost, but intrigued, Ruben
Algorithm Updates | KempRugeLawGroup0
-
Using a stop word when optimizing pages
I have a page (for a spa) that I am trying to fully optimize, and using AdWords I have run every conceivable configuration (using Exact Match) to ascertain the optimal phrase. Unfortunately, the term that has come up as the 'best' phrase is "spas in XXX" [XXX represents a location]. When reviewing the data, phrases such as "spas XXX" or "spa XXX" don't give me enough search volume to warrant optimizing for them. So, with that said, do I optimize the page without the word "in" and hope we get the search volume for searches that use it, or do I optimize using the stop word? Any thoughts? Thank you!
Algorithm Updates | MarketingAgencyFlorida0
-
Does the use of an underscore in filenames adversely affect SEO?
We have had a page which until recently ranked first or second on Google, both in the UK and worldwide, for the term "Snowbee". It is now no longer in the top 50. I ran a page optimization report on the URL and had a very good score. The only criticism was that I had used an atypical character in the URL; the only unusual character is an underscore "_". We use underscores in most filenames without apparent problems with search engines. In fact, they are automatically created in HTML filenames by our ecommerce software, and other pages do not seem to have been as adversely affected. Should we discontinue this practice? It will be difficult, but I'm sure we can overcome it if this is the reason Google has marked us down. I attach images of the SEO report pages.
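For what it's worth, hyphens are the generally recommended word separator in URLs, while underscores historically were not always treated as separators. If you do decide to migrate, the rename itself is mechanical; a sketch (the filename is hypothetical):

```python
def hyphenate_filename(filename):
    """Replace underscores with hyphens in a filename,
    e.g. snowbee_fly_lines.html -> snowbee-fly-lines.html."""
    return filename.replace("_", "-")

print(hyphenate_filename("snowbee_fly_lines.html"))
```

Any renamed page would also need a 301 redirect from the old underscore URL to the new one, or the existing rankings would be lost along with the old address.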
Algorithm Updates | FFTCOUK0
-
Don't use an h1 and just use h2's?
We just overhauled our site, and as I was auditing the overhaul I noticed that there were no h1s on any of the pages. I asked the company that does our programming why, and he responded that h1s are spammed so much that he doesn't want to put them in; instead he put in h2s. I can't find anything to back this up. I can find that h1s are sometimes over-optimized, but nothing that says to skip them altogether. I think he's crazy. Does anyone have anything to back him up?
Algorithm Updates | Dave_Whitty0
-
How much does posting product links to social media affect your ranking? Is there any use?
Google has Google Plus. Facebook has a partnership with Bing. How much does social media activity affect your ranking?
Algorithm Updates | rahijain0
-
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention that I did post this on one other forum, so I hope that is not against the rules here; I'm just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question: "Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations, I would love to hear them. The site is very large and utilizes faceted navigation to help visitors sift through results. For many months now I have implemented rel=canonical so that each URL created by the faceted-nav filters points back to the main category page. However, I still get these messages from Google every month or so saying that they found too many pages on the site. My main concern is wasting crawl time on pages where I am already doing what Google asks: telling them to ignore the filtered URLs and find the content on the canonical page. So at this point I am thinking about using the robots.txt file to handle these, but I wanted to see what others around here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks to those who take the time to respond.
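For illustration, the rel=canonical approach described here amounts to stripping the facet parameters from each filtered URL before emitting the canonical tag; a minimal sketch (the parameter names are hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical facet filters produced by the faceted navigation.
FACET_PARAMS = {"color", "size", "sort", "price"}

def canonical_url(url):
    """Drop facet query parameters so every filtered view
    canonicalises to the base category page."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://www.example.com/shoes/?color=red&size=9"))
```

The resulting URL would go into each filtered page's `<link rel="canonical" href="...">` tag; non-facet parameters (pagination, for instance) survive the stripping.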
Algorithm Updates | PeteGregory0
-
Has anyone starting using schema.org?
On 3 June 2011, Google announced that they are going to start using schema.org. Do you think this will change the way search engines find content? From briefly looking at the schema, I'm concerned that the proposed tags could just turn into another keyword meta tag and be abused. Have you started using these tags yet, and have you noticed a difference?
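Schema.org markup is nowadays most commonly embedded as JSON-LD rather than the inline microdata of the original 2011 announcement; a minimal sketch of rendering it (the product details are hypothetical):

```python
import json

# Hypothetical product data expressed in schema.org vocabulary.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Red Rose Bouquet",
    "description": "A dozen long-stem red roses.",
}

def ld_json_tag(data):
    """Render a schema.org object as a JSON-LD script tag for a page's <head>."""
    return f'<script type="application/ld+json">{json.dumps(data)}</script>'

print(ld_json_tag(product))
```

Because the structured data lives in one script tag rather than being woven through the visible markup, it is easier to validate and harder to stuff invisibly than the old keyword meta tag the question worries about.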
Algorithm Updates | Seaward-Group0