Should I use subdomains?
-
I'm thinking of building a little project website, but I wonder whether I should use subdomains or simply categorize the site.
For example (I haven't chosen my domain yet): if I had www.flower.com and wanted to produce pages for each type of flower, should I use
rose.flower.com
or
flower.com/rose
For SEO purposes, or for usability, does it matter?
Thanks in advance.
-
No problem, Gordon. This is just me providing an example, which can be adapted to any project, however big or small.
-
Thanks Gary. That's quite an extensive sitemap!
I'm not sure I had fully thought a sitemap through, but I certainly had a vague structure in mind. You know how these "little projects" go - they grow and grow into monsters!
-
Hi Gordon
Simon is certainly right.
Something else to consider...
If you're considering building a site, I would certainly first get to work on designing a site map to help you clarify your site's purpose and goals. This is essential for both SEO and usability.
We did some content marketing recently for Red Funnel IoW Ferries and Red Funnel Holidays. You'll see we've split all of their content into a well organised site map. http://www.redfunnel.co.uk/information/sitemap/
You may already know this, Gordon, but I wanted to point it out because so many new businesses build a site that doesn't cover their customers' needs. A site map will give you a clear picture of how to expand your content for your target audience, as well as bringing SEO benefits.
-
I have been watching how Google treats subdomains for a long time.
My conclusion is that Google cannot make up its mind how to treat them. Sometimes their power is united with the main site; sometimes they are isolated. When they are isolated, all of the assets you have placed on them contribute little to the success of your website.
So, if you want your web assets to consistently receive favorable treatment and pull in concert for your domain, they should all be placed in folders on the primary domain.
It is easy to be fooled by observing "how subdomains are being treated today", because Google will likely change its mind tomorrow.
-
Thanks Simon. That was generally my opinion too, based mostly on the fact that the majority of websites do it the way you describe, but I thought I'd ask the question before embarking on the project.
-
Hi Gordon
In my experience, sub-domains are not the best way to divide your content across topics as varied as you're suggesting. The better method is usually internal directories.
For example: www.flower.com/roses/red.html rather than roses.flower.com/red-roses.html
I tried using sub-domains for my own business when I first started out. Duplicate content issues, together with all the additional updating required whenever major changes were needed, made them a hassle that was not rewarded with ranking benefits.
I recently spent a month removing my sub-domains from search, and found I had much more time available to focus on my main site. My rankings in local search flew to the top within a couple of weeks, and organic rankings also benefited.
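If you ever do consolidate subdomain content into directories like this, permanent (301) redirects preserve most of the existing link equity. A minimal sketch for Apache's mod_rewrite, using the hypothetical flower.com example from this thread (the exact rules will depend on your server setup):

```apache
# .htaccess served for roses.flower.com (hypothetical example)
# 301-redirect roses.flower.com/anything to www.flower.com/roses/anything
RewriteEngine On
RewriteCond %{HTTP_HOST} ^roses\.flower\.com$ [NC]
RewriteRule ^(.*)$ https://www.flower.com/roses/$1 [R=301,L]
```

The same mapping can be expressed in nginx or at the registrar/CDN level; the important part is that each old subdomain URL points to its one new subfolder equivalent.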
Related Questions
-
What is the effect of "Here's where our site can help" text links used for conversions?
Suppose you have an ecommerce site that uses editorial content, on topics related to the site's business model, to build organic traffic and draw visitors who might eventually be interested in the site's services. What is the SEO (page ranking) impact, as well as the impact on visitors' perception of the reliability of the information on the site, of using phrases like "Here is where [our site] can help you" in nearly every article? Note: the "our site" text would be linked in each case, as a conversion point, to one of the site's services pages, to move visitors from content pages to sales pages. Will this have an impact on page rankings? Does it dilute the page's relevance to search engines? Will the content look less authoritative because of the prevalence of these types of links? And what about the same conversion links without the "we can help" text, i.e. more natural-sounding links that stem from the flow of the article but lead interested visitors deeper into the ecommerce section of the site?
Algorithm Updates | Will-McDermott
-
What date tags are required/best to use for articles or other landing pages?
Does anyone have suggestions on which date tag(s) are most important to use, and how to use them on the frontend (i.e. dateModified, dateCreated, and datePublished)? The Structured Data Testing Tool is coming up with errors for my article pages, and I'm a bit confused about which tags should be in the code vs. shown on the frontend.
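For reference, schema.org's Article type accepts all three properties; datePublished and dateModified are the ones Google's article guidelines focus on, and the visible date on the page should match the markup. A minimal JSON-LD sketch (headline and dates are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "datePublished": "2015-03-01T08:00:00+00:00",
  "dateModified": "2015-03-10T09:30:00+00:00"
}
</script>
```

Dates are ISO 8601; including the timezone offset avoids ambiguity when the testing tool validates them.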
Algorithm Updates | ElsaT
-
When Is It Okay To Use Bold, Underline & Italic Text? Should I Stay Away From My Keywords?
Hey guys, I have a few questions. I am pretty sure that I was penalized by Panda a few years back because I went very heavy on bolding, italicizing and underlining my keywords. Since then I have removed the bold, italics and underlines and never used them again. But I was just reading an article on the Moz Blog and saw some bold words. My questions are: when is it okay to use bold, underlined and italic text, and should I stay away from my keywords? Any help would be great. Thank you.
Algorithm Updates | Videogamefan
-
Should I use the Disavow Tool at this point?
After Penguin, our site www.stadriemblems.com jumped up to #1 for the keyword "embroidered patches." Now, months later, it's at the top of page two. I'm pretty sure this is because we do have a few shady links (I didn't do it!) that perhaps Penguin didn't catch the first time around, but now Google is either discounting them or counting them against us. My question is: since I'm pretty sure those links are the reason we are gradually declining, should I submit them to Google as disavowed, even though technically we're not penalized... yet? I have done everything possible to get them removed, and it's not happening.
Algorithm Updates | UnderRugSwept
-
Should I use canonical tags on my site?
I'm trying to keep this a generic example, so apologies if this is too vague.
On my main website, we've always had a duplicate content issue. The main focus of the site is breaking down to specific, brick-and-mortar locations, and we have to duplicate the description of each product/service for every geographic location (this is a legal requirement). So, for example, you might have the parent "product/service" page targeting the term, and then hundreds of sub-pages like "product/service San Francisco", "product/service Austin", etc. These pages have identical content except that the geographic location is dynamically swapped out. There is also additional useful content like a Google map of the area, local resources, etc.
As I said, this was always seen as an SEO issue. Specifically, you could see from the way Googlebot crawled pages, and how PageRank flowed through the site, that having hundreds of pages with identical copy, just swapping out the geographic location, wasn't seen as good content. However, we still received traffic and conversions for the long-tail geographic terms, so we left it.
Last year, with Panda, we noticed a drop in traffic and thought it was due to this duplicate issue, so I added canonical tags to all our geographic-specific product/service pages, pointing back to the parent page. That seemed to be received well by Google, and traffic was back to normal in short order.
However, what I now notice a LOT in our SERPs is that if I type in a geographic-specific term, e.g. "product/service san francisco", our deep page with the canonical tag is what Google is ranking. Google inserts its own title tag on the SERP and leaves the description blank, since it doesn't index the page due to the canonical tag. Essentially, what I think it is rewarding is the site architecture, which organizes the content with the specific geo in the URL: site.com/service/location/san-francisco. Other than that, there is no reason for it to rank that page.
Sorry if this is lengthy, thanks for reading all of that! Essentially my question is: should I keep the canonical tags on the site, or take them off since Google insists on ranking the page? If I'm already ranking, the potential upside of removing them is ranking higher (we're usually in the 3-6 spot on the result page) and a higher CTR, because we'd get a description back on our result. The counter-argument is that I'm already ranking, so leave it and focus on other things. Appreciate your thoughts on this!
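For anyone reading along, the two options under discussion look like this in each geo page's head (URLs follow the hypothetical site.com structure in the question):

```html
<!-- Option A: consolidate - each geo page defers to the parent service page -->
<link rel="canonical" href="https://site.com/service/">

<!-- Option B: let the geo page rank in its own right - self-referencing canonical -->
<link rel="canonical" href="https://site.com/service/location/san-francisco">
```

Note that a canonical is a hint, not a directive, which is consistent with the behavior described above: Google can choose to surface the canonicalized URL anyway when it judges that page the better match for the query.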
Algorithm Updates | edu-SEO
-
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention that I did post this on one other forum, so I hope that is not against the rules here; I'm just trying to get an idea from some of the pros at both sources.
Now for the question: "Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. I wanted to get some other opinions, so if anyone has experienced something similar or has any recommendations, I would love to hear them.
The site is very large and utilizes faceted navigation to help visitors sift through results. For many months now I have implemented rel=canonical so that each URL created by the faceted nav filters points back to the main category page. However, I still get these messages from Google every month or so, saying that they found too many pages on the site.
My main concern is wasting crawler time on all these pages, when I am already trying to do what Google asks and tell them to ignore the filtered URLs and find the content on the main page. So at this point I am thinking about using the robots.txt file to handle these, but wanted to see what others thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table.
Thanks to those who take the time to respond.
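If crawl budget is the real problem, blocking the faceted parameters in robots.txt is one option. A sketch, with made-up parameter names for illustration (Googlebot supports the * wildcard):

```
# robots.txt - hypothetical faceted-navigation filters
User-agent: *
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*&sort=
```

One important caveat: robots.txt blocks crawling, so Google will never see the rel=canonical on those URLs, and blocked URLs can still appear in the index if they are linked to. It trades canonical consolidation for crawl savings, so it's worth weighing before committing.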
Algorithm Updates | PeteGregory
-
Search bots that use referrers?
Can someone point me to a list of, or just tell me, the specific search bots that use referrers?
Algorithm Updates | BostonWright
-
Using Brand Name in Page titles
Is it good practice to append our brand name to the end of every page title? We have a very strong brand name, but it is also long. Right now what we are doing is:
Product Name | Long brand name here
Product Category | Long brand name here
Is this the right way to do it, or should we use ONLY the product and category names in our page titles? Right now we often exceed the 70-character recommended limit.
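One pragmatic middle ground is to append the brand only when the combined title fits the display limit. A small illustrative sketch (the brand string and 70-character limit are placeholders, not a recommendation):

```python
def build_title(page_name: str, brand: str = "Long Brand Name Here", limit: int = 70) -> str:
    """Append the brand to a page title, but drop it when the combined
    string would exceed the display limit (brand/limit are examples)."""
    full = f"{page_name} | {brand}"
    return full if len(full) <= limit else page_name

print(build_title("Red Roses"))  # short title: brand is kept
print(build_title("A Very Long Product Category Name That Runs On And On"))  # brand dropped
```

Templating systems in most CMSs can express the same conditional, so the rule is applied site-wide rather than page by page.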
Algorithm Updates | mlentner