Subdomain for every US state?
-
Hi,
One of our clients has the idea of creating subdomains off his main website to sell his online advertisements in every state in the USA.
For example:
He wants a subdomain for each state, containing information related only (or mainly) to that state.
I am not sure whether this is a good idea. What is your opinion?
-
If the domain has extremely high authority (80+), I would consider it, due to the potential to dominate the SERPs by getting both the www.web version and the state.web version to rank high.
Thanks, the domain's authority is 43, so not that high.
_Yes, stick with folders. It's much simpler and much better organization (for states, or even a city like Atlanta)._
Yes, that's true, thank you!
Second, I agree with everyone else: subfolders are more organized, they look better, and subdomains are an old SEO strategy (a little spammy, especially for a new domain)
It is an old domain (registered in 1994). What do you guys mean by "better structured"? Sorry if it's a simple question; I just want to be sure it's what I think it is.
Also, if a website lives on a subdomain, does the main domain still pass some authority to it, or not much?
-
I would stick with folders for two main reasons:
- First, yourdomain.com is part of your online brand. If every state's URL is statename.yourdomain.com, I think this takes away from your brand, because the first thing people see in the URL bar or in search results is a state name rather than your domain
- Second, I agree with everyone else: subfolders are more organized, they look better, and subdomains are an old SEO strategy (a little spammy, especially for a new domain)
-
Yes, stick with folders. It's much simpler and much better organization (for states, or even a city like Atlanta).
-
Subfolders are better than subdomains: everything lives under one site instead of 59 (according to Obama) individual subdomains.
That was an old-school strategy from the days when Google gave a real ranking boost to keywords in the domain name; spammers took advantage of it by creating a subdomain for each keyword phrase to rank better.
-
I would advise against it.
I would put them all in subfolders under the www version of the site, e.g. www.web.com/texas.
If the domain has extremely high authority (80+), I would consider it, due to the potential to dominate the SERPs by getting both the www.web version and the state.web version to rank high.
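If the client has already launched state subdomains, the usual cleanup is to 301-redirect each one to its subfolder equivalent on the main domain. Here is a minimal sketch of building such a redirect map; the domain `example.com`, the state list, and the helper names are assumptions for illustration, not anything from this thread:

```python
# Hypothetical sketch: consolidating per-state subdomains into subfolders
# by generating the old-URL -> new-URL pairs for a 301 redirect map.

STATES = ["Texas", "New York", "North Carolina"]  # ...one entry per state

def state_slug(name: str) -> str:
    """Lowercase the state name and hyphenate spaces: 'New York' -> 'new-york'."""
    return name.lower().replace(" ", "-")

def redirect_map(domain: str, states: list[str]) -> dict[str, str]:
    """Map each legacy state-subdomain URL to its subfolder equivalent."""
    return {
        f"https://{state_slug(s)}.{domain}/": f"https://www.{domain}/{state_slug(s)}/"
        for s in states
    }

for old, new in redirect_map("example.com", STATES).items():
    print(f"{old} -> {new}  (301)")
```

The generated pairs would then be fed into whatever the server actually uses for redirects (an .htaccess file, an nginx map, etc.), so the subfolder pages inherit the links the subdomains had collected.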