Flat vs. subdomain web structure
-
I am building a site that sells a product in 50 states, and in each state we will have independent partners. From an SEO perspective, what are the tradeoffs of using a single domain vs. giving each state its own subdomain? Each state also has varying regulatory issues that are specific to that state.
-
I agree that with 50 subdomains I can't see you having enough content; I was speaking in general.
I was referring to that link: Rand said it is his personal belief that most of the time it is better to keep to one subdomain.
-
I agree. When I use subdomains, I start thinking about FTP. I also think about giving the user the best experience. If he wants to make one site that markets 50 states, then using a CMS would be the answer, and creating 50 subdomains would be repetitive. In his case I would use folders, and if an independent partner needs access to the site, I would add them as a user with limited site access.
http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
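As one sketch of the "limited site access" idea, here is what a restricted partner role could look like. This assumes WordPress as the CMS, which the answer doesn't actually specify, and the role name is a made-up placeholder:

```php
<?php
// Minimal sketch, assuming WordPress: register a limited "state partner"
// role that can read the site and edit pages, but has no admin capabilities.
// Note: 'edit_pages' covers all pages; restricting a partner to only their
// own state's pages would require additional logic.
add_role( 'state_partner', 'State Partner', array(
    'read'       => true,
    'edit_pages' => true,
) );
```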
-
This is an old argument, subdomains vs. subfolders.
Matt Cutts said there is no difference (see the comments):
http://www.mattcutts.com/blog/subdomains-and-subdirectories/
The Google Webmaster Central blog also said there is no difference:
http://googlewebmastercentral.blogspot.com/2008/01/feeling-lucky-at-pubcon.html
Rand recommends subfolders, but said it was his personal choice.
I have seen SERPs with sitelinks where the sitelinks point to subdomains, so I would say that Google sees them as the same site.
If you register a root domain in Webmaster Tools, the links from subdomains are counted as internal links. If someone verifies the subdomain under another account, then you will no longer see stats for that subdomain.
I have never seen any evidence that they are treated differently.
-
Use craigslist.org as an example: every city has its own subdomain, not a subfolder that link juice is passed to. Using a subdomain is almost like having a different domain.
Your choices will be state.example.com or example.com/state. I personally would use subfolders instead of subdomains to keep link juice (see the redirect sketch below). Now, if I were going to geotarget each state and did NOT want to show up in other states, then I would use subdomains the way Craigslist is set up.
A better question is this: you state you want to sell "a" product in 50 states. The way I read that, you are going to have 50 pages of duplicate content (whether it's one product or 1,000 products). What do you mean by independent partners? You have to explain that a little further. Do you mean affiliates? Do you mean independent contractors, like MLMs (network marketing)? Your website should be structured around your business objectives. What if you have two partners within one state?
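To illustrate the subfolder choice: if state subdomains ever exist or attract links, they can be 301-redirected into subfolders so the link juice consolidates on the root domain. A minimal sketch assuming Apache with mod_rewrite; the hostnames and paths are illustrative placeholders, not the poster's actual setup:

```apache
# Minimal sketch: 301-redirect state.example.com/* to example.com/state/*
# so link equity consolidates on the root domain. Assumes mod_rewrite.
RewriteEngine On
# Skip the www host; match any other single-label subdomain (the state)
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTP_HOST} ^([a-z-]+)\.example\.com$ [NC]
# %1 is the state name captured by the RewriteCond above
RewriteRule ^(.*)$ http://example.com/%1/$1 [R=301,L]
```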
Related Questions
-
Link Structure June 2019
Question: which link structure is better in 2019 for SEO best practice? Example A) https://www.fishingtackleshop.com.au/soft-plastic-lures/ or B) https://www.fishingtackleshop.com.au/fishing/fishing-lures/soft-plastic-lures/ We're on the BigCommerce platform and used to have https://www.fishingtackleshop.com.au/categories/soft-plastic-lures/ Last year we went from BigCommerce's long URLs to short ones to stop link juice being sent to /categories. Now we have an SEO company trying to sell me their services after a steady decline since September 2018; they told me we should have the link structure in example B, and that this is likely the reason for the dip, due to breadcrumbing. True or false?
I explained that I had breadcrumbs like those shown on https://www.fishingtackleshop.com.au/berkley-powerbait-t-tail-minnow/ but the SEO guy said no, they need to be in the URL structure too. I was under the impression that short URLs, as opposed to long ones, are better these days, and that link juice is passed better if the URL is short and direct to the point. Am I wrong?
Intermediate & Advanced SEO | oceanstorm
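For what it's worth, breadcrumbs can be exposed to search engines through structured data rather than through the URL path itself. A minimal sketch using schema.org BreadcrumbList markup in JSON-LD; the category names and first URL below are illustrative, not the site's actual taxonomy:

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Fishing Lures",
      "item": "https://www.fishingtackleshop.com.au/fishing-lures/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Soft Plastic Lures",
      "item": "https://www.fishingtackleshop.com.au/soft-plastic-lures/"
    }
  ]
}
```

This kind of markup lets short URLs coexist with a full breadcrumb trail in the SERPs.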
Best way to structure URLs with WordPress and Yoast?
I am using WordPress and Yoast. I have parent pages and child pages. Yoast recommends you have the keyword in the URL. For the parent page I have the city name in the URL. The question is: should the child pages also have the city name in the URL, or would that be considered keyword stuffing? Here is the current structure: http://forestparkdental.info/st-louis-dental-services/restorative-dentistry/inlays-and-onlays I didn't know if the end of those URLs should be /restorative-dentistry-st-louis and /inlays-and-onlays-st-louis, since those are separate pages and the Yoast and Moz plugins don't give you the green light in all areas unless you do it like this. Thanks, Scott
Intermediate & Advanced SEO | scott3150
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys, we have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: the page where the user applies various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and ranking. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results (example Google query). We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right.
Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?)
Noindex advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)
Noindex disadvantages:
- Difficult to implement: vehicle details pages are served via Ajax, so they have no <head> tag to put a noindex meta tag in. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex header based on query-string variables, similar to this Stack Overflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it's blocked by robots.txt.
Hash (#) URL advantages:
- By using hash (#) URLs for links on Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links. Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as nofollowing these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt, the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, where it could easily get stuck/lost. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links structured like this.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
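For the noindex route, here is a minimal sketch of the Apache approach the question alludes to: setting an X-Robots-Tag header based on a query-string variable. The vehicle_id parameter name is a hypothetical placeholder for whatever identifies a Vehicle Details request; mod_rewrite and mod_headers must be enabled:

```apache
# Minimal sketch: send a noindex header for Vehicle Details pages,
# identified here by a hypothetical "vehicle_id" query-string parameter.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)vehicle_id= [NC]
# Set an environment variable without rewriting the request
RewriteRule .* - [E=IS_VEHICLE_DETAIL:1]
# Emit the header only when that variable is set
Header set X-Robots-Tag "noindex, nofollow" env=IS_VEHICLE_DETAIL
```

Note that this only works if the pages are not also blocked in robots.txt, since Googlebot has to fetch the page to see the header.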
Site structure from an SEO standpoint
I am fortunate enough to be working with a client who is still building their website. From a site structure standpoint, what should I look for, with my SEO hat on, as they build their wireframes and storyboard the site? I want to make sure I don't miss any components that might be helpful short and long term.
Intermediate & Advanced SEO | StreetwiseReports
How should I best structure my internal links?
I am new to SEO and looking to employ a logical but effective internal linking strategy. Are there any easy ways to keep track of which page links to which page? I am a little confused about anchor text, inasmuch as how I should use it. E.g., for a category page "Towels", I was going to link to another page we want to build PA for, such as "Bath Sheets". What should I put in for the anchor text? Keep it simple and just put "Bath Sheets", or make it more direct, like "Buy Bath Sheets"? Should I also vary the anchor text if I have another 10 pages internally linking to this, or keep it the same? Any advice would be really helpful. Thanks, Craig
Intermediate & Advanced SEO | Towelsrus
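As one illustration of the "vary the anchor text" approach, here are three internal links pointing at the same target page; the URL and wording are made-up examples, not a recommendation for specific phrases:

```html
<!-- Three internal links to the same Bath Sheets page with varied,
     descriptive anchor text. The URL is an illustrative placeholder. -->
<a href="/towels/bath-sheets/">Bath Sheets</a>
<a href="/towels/bath-sheets/">our range of luxury bath sheets</a>
<a href="/towels/bath-sheets/">buy bath sheets online</a>
```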
Franchise sites on subdomains
I've been asked by a client to optimise a webpage for a location, i.e. London. It turns out that the location is actually a franchise of the main company. When the company launches a new franchise, so far they have simply added a new page to the main site, for example: mysite.co.uk/sub-folder/london They have done this for 10 or so franchises and task someone with optimising each page for its main keyword + location. I think I know the answer to this, but would like to get back-up / additional info on it in terms of ranking / SEO benefits. I am going to suggest the idea of using a subdomain for each location, for example: london.mysite.co.uk Would this be the correct approach? If you think yes, why? Many thanks,
Intermediate & Advanced SEO | Webrevolve
Site structure question
Hello everyone, I have a question regarding site structure that I would like to mastermind with everyone. I am optimizing a website for a Ford dealership in Boston, MA. The site architecture is set up as follows: Home >>> New Inventory >>> Inventory Page (with search refinement choices). After you refine your search (let's say we choose a Ford F-150 in white), it shows a page with images, price information and specs (nothing the bots or users can sink their teeth into). My thought is to create category pages for each Ford model with awesome written content and THEN link to the inventory pages. So it would look like this: Home >>> New Inventory >>> Ford F-150 Awesome Category Page >>> Ford F-150 Inventory Page. I would work hard at getting these category pages to rank for the vehicle in our geo-targeted locations. Here is my question: would you be annoyed to first land on a category page with lots of written text, reviews, images and videos, and then link off to the inventory page? Or would you prefer to go right from the New Inventory page to the actual inventory page and start looking for vehicles? Thank you so much, Bill
Intermediate & Advanced SEO | wparlaman
URL Structure - Keywords vs. Information Architecture/Navigation
I'm creating the URL structure for an ecommerce site and was wondering if it's better to structure my URLs according to the most popular way people word their key phrases, or by what makes the most sense from a navigation perspective. Let's say I'm selling clothing (I'm not, just an example). I want the site to be open enough that a user can navigate by person type (Men's, Women's, Children's), clothing type (Shoes, Shirts, Hats), and brand (Nike, Reebok, adidas). My gut and past experience say to structure the URLs from least specific to most specific: mysite.com/mens/shoes/nike But I know "men's Nike shoes" is searched for more than "men's shoes Nike", which would suggest this URL: mysite.com/mens/nike/shoes I know mysite.com/mens-nike-shoes would be best, but the folder setup is what I have to work with. So which is best for SEO: URLs that play to the structure of the most searched-for key phrases, or URLs that follow the information architecture/navigation of the site? Nate
Intermediate & Advanced SEO | rball10