Flat vs. subdomain web structure
-
I am building a site that sells a product in all 50 states, and in each state we will have independent partners. From an SEO perspective, what are the tradeoffs of using a single domain versus giving each state its own subdomain? Each state also has regulatory issues that are specific to it.
-
I agree that with 50 subdomains I can't see you having enough content; I was speaking in general.
I was referring to that link: Rand said it is his personal belief that, most of the time, it is better to keep to one subdomain.
-
I agree. When I use subdomains, I start thinking about FTP, and I also think about giving the user the best experience. If he wants to make one site that markets to 50 states, then using a CMS would be the answer, but creating 50 subdomains would be repetitive. In his case I would use folders, and if an independent partner needs access to the site, add them as a user with limited site access.
http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
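To make the folder approach concrete, here is a minimal sketch of one domain with a subfolder per state, assuming a plain Flask-style app rather than any particular CMS; the state slugs and regulatory copy are hypothetical placeholders, not the poster's actual setup.

```python
# A sketch of the one-domain, subfolder-per-state structure, assuming a
# Flask-style app. State slugs and regulatory text are hypothetical.
from flask import Flask, abort

app = Flask(__name__)

STATES = {"texas", "california", "new-york"}  # ...all 50 state slugs

# State-specific regulatory copy keyed by state slug (hypothetical data).
REGULATORY_NOTES = {
    "texas": "Texas-specific licensing and disclosure requirements go here.",
    "california": "California-specific requirements go here.",
}

@app.route("/<state>/")
def state_page(state):
    if state not in STATES:
        abort(404)
    # Shared product copy plus state-specific regulatory copy keeps each
    # /state/ page distinct instead of being one of 50 near-duplicates.
    notes = REGULATORY_NOTES.get(state, "General product information.")
    title = state.replace("-", " ").title()
    return f"<h1>Our product in {title}</h1><p>{notes}</p>"

if __name__ == "__main__":
    app.run()
```

The same idea maps onto any CMS: one install, one section per state, and the state-specific regulatory content is what keeps the 50 pages from reading as near-duplicates of each other.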
-
This is an old argument, subdomains vs. subfolders.
Matt Cutts said there is no difference (see the comments):
http://www.mattcutts.com/blog/subdomains-and-subdirectories/
Google's Webmaster Central blog also said there is no difference:
http://googlewebmastercentral.blogspot.com/2008/01/feeling-lucky-at-pubcon.html
Rand recommends subfolders, but said it was his personal choice.
I have seen SERPs with sitelinks where some of the sitelinks point to subdomains, so I would say that Google sees them as the same site.
If you register a root domain in WMT, links from subdomains are counted as internal links. If someone verifies the subdomain under another account, you will no longer see stats for the subdomain.
I have never seen any evidence that they are treated differently.
-
Use craigslist.org as an example. Every city has its own subdomain. The cities are not in subfolders where link juice would be passed; using a subdomain is almost like having a separate domain.
Your choices will be state.example.com or example.com/state. I personally would use subfolders instead of subdomains to keep the link juice. Now, if I were going to geotarget each state and did NOT want to rank in other states, then I would use subdomains the way Craigslist is set up.
A better question is this: you say you want to sell "a" product in 50 states. The way I read that, you are going to have 50 pages of duplicate content (whether it's one product or 1,000 products). What do you mean by independent partners? You need to explain that a little further. Do you mean affiliates? Do you mean independent contractors, as in MLM (network marketing)? Your website should be structured around your business objectives. What if you have two partners within one state?
Related Questions
-
Subdomain replaced domain in Google SERP
Good morning, this is my first post. I found many Q&As here that mostly answer my question, but just to be sure we do this right, I'm hoping the community can take a peek at my thinking below.
Problem: We rank #1 for "custom poker chips", for example. We have a development website on a subdomain (http://dev.chiplab.com). On Saturday our live 'chiplab.com' main domain was replaced by 'dev.chiplab.com' in the SERP.
Expected cause: We did not add NOFOLLOW to the header tag. We also did not DISALLOW the subdomain in robots.txt. We could also have put the 'dev.chiplab.com' subdomain behind a password wall.
Solution: Add a NOFOLLOW header, update robots.txt on the subdomain, and disallow crawl/index.
Question: If we remove the subdomain from Google using WMT, will this drop us completely from the SERP? Ideally we would like our root chiplab.com domain to replace the subdomain and get us back to where we were before Saturday. If the removal tool in WMT just removes the listing completely, is the only solution to wait until the site is recrawled and reindexed and hope the root chiplab.com domain ranks in place of the subdomain again? Thank you for your time, Chase
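Worth noting that the directive that actually controls indexing is noindex rather than nofollow. Here is a minimal sketch of sending it as a header for every response on the dev hostname, assuming a Flask-style app sits in front of the subdomain; the hostname is the one from the question.

```python
# A sketch of keeping a development subdomain out of the index by sending
# an X-Robots-Tag header on every response served from that hostname.
# Assumes a Flask-style app; "dev.chiplab.com" is taken from the question.
from flask import Flask, request

app = Flask(__name__)
DEV_HOSTS = {"dev.chiplab.com"}

@app.after_request
def block_dev_indexing(response):
    # noindex removes the URLs from the index; nofollow only stops link following.
    if request.host in DEV_HOSTS:
        response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/")
def home():
    return "dev build"

if __name__ == "__main__":
    app.run()
```

One caveat: if the subdomain is also disallowed in robots.txt, crawlers may never fetch the pages and so never see the noindex, so it is usually one or the other (or simply put the dev site behind a password).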
Intermediate & Advanced SEO | chiplab
-
Dev Subdomain Pages Indexed - How to Remove
I own a website (domain.com) and used the subdomain dev.domain.com while adding a new section to the site (as a development site). I forgot to block dev.domain.com in my robots.txt file, and Google indexed all of the dev pages (around 100 of them). I blocked the site (dev.domain.com) in robots.txt and then proceeded to delete the entire subdomain altogether. It's been about a week now and I still see the subdomain pages indexed in Google. How do I get these pages removed from Google? Are they causing duplicate content/title issues, or does Google know that it's a development subdomain and it's just taking time for it to recognize that I already deleted it?
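Since the subdomain has been deleted anyway, one option is to answer every request on that hostname with a 410 Gone, which tends to get already-indexed URLs dropped a little faster than a plain 404. A rough sketch, assuming a Flask-style app still receives traffic for the old hostname; the hostname is the one from the question.

```python
# A sketch of returning "410 Gone" for every URL on a retired dev subdomain
# so already-indexed pages drop out of the index faster. Assumes a
# Flask-style app; "dev.domain.com" is taken from the question.
from flask import Flask, request, abort

app = Flask(__name__)
RETIRED_HOSTS = {"dev.domain.com"}

@app.before_request
def gone_for_retired_hosts():
    if request.host in RETIRED_HOSTS:
        abort(410)  # permanently gone

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def live_site(path):
    return "live site content"

if __name__ == "__main__":
    app.run()
```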
Intermediate & Advanced SEO | WebServiceConsulting.com
-
New Web Page Not Indexed
Quick question with probably a straightforward answer... We created a new page on our site 4 days ago; it was in fact a mini-site page, though I don't think that makes a difference... To date, the page is not indexed, and when I use 'Fetch as Google' in WMT I get a 'Not Found' fetch status... I have also used 'Submit URL' in WMT, which seemed to work OK... We have even resorted to 'pinging' using Pingler and Ping-O-Matic, though we have done this cautiously! I know social media is probably the answer, but we have been holding back on that tactic, as the page relates to a product that hasn't quite launched yet and we do not want to cause any issues with the vendor! That said, I think we might have to look at sharing the page socially unless anyone has any other ideas? Many thanks, Andy
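A 'Not Found' fetch status usually means the URL itself returned a 404 to Googlebot, so before reaching for social signals it is worth confirming that the page returns a 200 to a plain HTTP client and that it is listed in the XML sitemap. A small stand-alone Python check; the URLs here are hypothetical placeholders.

```python
# A quick sanity check for the "Fetch as Google says Not Found" symptom:
# confirm the new URL returns 200 and appears in the XML sitemap.
# Both URLs below are hypothetical placeholders.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

PAGE_URL = "https://www.example.com/new-mini-site-page/"
SITEMAP_URL = "https://www.example.com/sitemap.xml"

def status_of(url):
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

print("Page status:", status_of(PAGE_URL))

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]
print("In sitemap:", PAGE_URL in sitemap_urls)
```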
Intermediate & Advanced SEO | TomKing
-
Ecommerce SEO URL Structure Questions
I am in the process of developing a new Magento ecommerce store. Take, for instance, a website in the apparel industry with the following main categories: Clothing, Shoes, Accessories, Beauty. Subcategories for Clothing would be: Dresses, Pants, Jeans, Tops. Products would be: Kelly maxi dresses. What is the best SEO structure for this? Main categories, obviously: www.example.com/clothing. Subcategories: www.example.com/clothing/dresses or www.example.com/dresses (Zappos seems to pursue the second type). Products: www.example.com/clothing/dresses/kelly-maxi-dresses/ or www.example.com/kelly-maxi-dresses? Which would be the best way to structure the site? Also, what about the filters that are available on category pages? Say I were to filter by color, what would be the best URL? I am sure a canonical tag is needed here. New to ecommerce SEO, so I need some guidance!
Intermediate & Advanced SEO | WayneRooney
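On the filter question, one common pattern is to keep filters as query parameters and point rel="canonical" on the filtered variants back at the unfiltered category URL, assuming the filtered views do not need to rank on their own. Magento has built-in canonical settings for categories and products; the sketch below, assuming a Flask-style app, is only to illustrate the idea.

```python
# A sketch of canonicalizing filtered category views back to the
# unfiltered category page, assuming a Flask-style app. The category
# and filter names are hypothetical placeholders.
from flask import Flask, request, url_for

app = Flask(__name__)

@app.route("/clothing/dresses/")
def dresses():
    color = request.args.get("color")  # e.g. /clothing/dresses/?color=red
    canonical = url_for("dresses", _external=True)  # no filter parameters
    heading = f"{color.title()} dresses" if color else "Dresses"
    return (
        "<head>"
        f'<link rel="canonical" href="{canonical}">'
        "</head>"
        f"<body><h1>{heading}</h1></body>"
    )

if __name__ == "__main__":
    app.run()
```
-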
Sub Domains vs. Persistent URLs
I've always been under the assumption that when building a microsite it is better, from an SEO perspective, to use a true path (e.g. yourcompany.com/microsite) rather than a subdomain (microsite.yourcompany.com). Can you still generate significant SEO gains from a subdomain if you are forced to use one, provided the primary domain (e.g. yourcompany.com) has a lot of link clout/authority? Meaning, if I had to go the subdomain route, would it be the end of the world?
Intermediate & Advanced SEO | VERBInteractive
-
Overly-Dynamic URLs & Changing URL Structure with a Web Redesign
I have a client with multiple apartment complexes in different states and metro areas. They get good traffic and pretty good conversions, but the site needs a lot of updating, including the architecture, to implement SEO standards. Right now they rank for "<brand name> apartments" everywhere, but not for "<city name> apartments". Their current architecture displays URLs like http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=overview and http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=floorplans&floorPlanID=121. I know it is said to never change the URL structure, but what about this site? I see this URL structure being bad for SEO, bad for users, and it basically forces us to keep the current architecture. They don't have many links built to their community pages, so will creating a new URL structure and 301-redirecting to the new URLs drastically drop rankings? Is this something we should bite the bullet on now for the sake of future rankings, traffic, and a better architecture?
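301s from the old query-string URLs to the new clean paths are the standard way to carry over whatever equity those pages have. A rough sketch of the mapping logic, assuming a Flask-style front controller keeps answering the legacy index.php URLs; the community slug map and section handling are hypothetical placeholders.

```python
# A sketch of 301-redirecting legacy query-string URLs to a cleaner URL
# structure during a redesign, assuming a Flask-style front controller.
# The communityID-to-slug map and parameter names are hypothetical.
from flask import Flask, request, redirect, abort

app = Flask(__name__)

# Old communityID values mapped to new clean paths (hypothetical data).
COMMUNITY_SLUGS = {"28": "/communities/austin-tx/oak-creek-apartments/"}

@app.route("/index.php")
def legacy_router():
    community_id = request.args.get("communityID")
    section = request.args.get("secLevelCurrent", "overview")
    base = COMMUNITY_SLUGS.get(community_id)
    if not base:
        abort(404)
    target = base if section == "overview" else f"{base}{section}/"
    return redirect(target, code=301)  # permanent redirect carries the equity

if __name__ == "__main__":
    app.run()
```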
Intermediate & Advanced SEO | JaredDetroit
-
A Blog Structure Dilemma We're Facing...
We're launching a pretty large content program (in the form of a blog) and have a structure issue. We're big fans of WordPress for efficiency reasons, but our platform doesn't allow hosting a WordPress (or other third-party) blog on the primary domain where we want it: site.com/blog. Here are the options:
1. Subdomain: We can easily put it there. The benefit is we get the efficient WordPress tools and a very fast setup. The downside is that the root domain won't get the benefit of any backlinks to the blog (as far as I understand), and I also don't believe the primary domain will benefit from the daily fresh/unique content the blog offers.
2. Custom rig: We could create our own manual system of pages on the site to look just like our blog would. This would let us have it at site.com/blog and benefit from any backlinks and fresh content. The downside is that it won't be as efficient to manage.
3. External site: Create a different site just for the blog. Same issue as the subdomain, I believe.
User experience is a top priority, and all of the above can accomplish pretty much the same UX goal, with #3 requiring some additional strategy on positioning. Is #1 or #3 going to be a big regret down the road, though, and is the backlink/content benefit clearly worth doing #2? (Correct me if I'm wrong on my assumptions with #1, but at least with the backlinks I'm almost certain that's the case.) Many thanks for your inputs on this.
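One more option worth knowing about is a reverse proxy: keep WordPress wherever it has to be hosted, but serve it through site.com/blog on the primary domain so backlinks and fresh content accrue to the root. This is normally configured at the web-server level; purely to illustrate the idea, here is a rough sketch using Flask plus the requests library, with the blog origin URL as a hypothetical placeholder.

```python
# A rough sketch of proxying /blog/* on the main domain through to a
# separately hosted WordPress install. Assumes a Flask-style app and the
# "requests" library; the blog origin URL is a hypothetical placeholder.
import requests
from flask import Flask, Response, request

app = Flask(__name__)
BLOG_ORIGIN = "https://blog-backend.example.com"  # where WordPress actually runs

@app.route("/blog/", defaults={"path": ""})
@app.route("/blog/<path:path>")
def blog_proxy(path):
    upstream = requests.get(
        f"{BLOG_ORIGIN}/{path}",
        params=request.args,
        headers={"User-Agent": request.headers.get("User-Agent", "")},
        timeout=10,
    )
    # Pass the body and content type through; real setups also rewrite
    # absolute links in the HTML so they point at /blog/ on the root domain.
    return Response(
        upstream.content,
        status=upstream.status_code,
        content_type=upstream.headers.get("Content-Type", "text/html"),
    )

if __name__ == "__main__":
    app.run()
```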
Intermediate & Advanced SEO | SEOPA
-
Geo-targeted homepage for users vs. crawlers
Hello there! This is my first post here on SEOmoz. I'll get right into it then... My website is housingblock.com, and the homepage runs entirely off of geo-targeting the user's IP address to display the most relevant results immediately to them. Can potentially save them a search or three. That works great. However, when crawlers frequent the site, they are obviously being geo-targeted for their IP address, too. Google has come to the site via several different IP addresses, resulting in several different locations being displayed for it on the homepage (Mountain View, CA or Clearwater, MI are a couple). Now, this poses an issue because I'm worried that crawlers will not be able to properly index the homepage because the location, and ultimately all the content, keeps changing. And/or, we will be indexed for a specific location when we are in fact a national website (I do not want to have my homepage indexed/ranked under Mountain View, CA, or even worse, Clearwater, MI [no offence to any Clearwaterians out there]). Of course, my initial instinct is to create a separate landing page for the crawlers, but for obvious reasons, I am not going to do that (I did at one point, but quickly reverted back because I figured that was definitely not the route to go, long-term). Any ideas on the best way to approach this, while maintaining the geo-targeted approach for my users? I mean, isn't that what we're supposed to do? Give our users the most relevant content in the least amount of time? Seems that in doing so, I am improperly ranking my website in the eyes of the search engines. Thanks everybody! Marc
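One cloaking-free pattern for this is to geolocate the IP but fall back to a neutral, national version of the homepage whenever the lookup fails or is not trusted, so crawlers and unknown visitors see the same default page instead of a crawler-only version. A minimal sketch, assuming a Flask-style app; the GeoIP lookup is a hypothetical stand-in for whatever service the site actually uses.

```python
# A sketch of geo-targeting with a neutral national fallback rather than a
# separate crawler page. Assumes a Flask-style app; lookup_city() is a
# hypothetical stand-in for a real GeoIP service.
from flask import Flask, request

app = Flask(__name__)

def lookup_city(ip):
    """Hypothetical GeoIP lookup; returns a city name or None if unknown."""
    return None  # plug in MaxMind, ipinfo, etc.

@app.route("/")
def homepage():
    city = lookup_city(request.remote_addr)
    if city is None:
        # Crawlers and unresolvable IPs get the same national view a user
        # would, so there is no separate crawler-only page (no cloaking).
        return "<h1>Find housing anywhere in the country</h1>"
    return f"<h1>Housing in {city}</h1>"

if __name__ == "__main__":
    app.run()
```

Letting users pick or correct their location on that default page keeps the geo-targeting useful without the indexed homepage content changing from crawl to crawl.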
Intermediate & Advanced SEO | THB