Subdomain for every US state?
-
Hi,
one of our clients has an idea of creating subdomains off his main website to sell his online advertisements in all US states.
e.g.:
He wants a subdomain for every state, containing information related only or mainly to that state.
I am not sure whether this is a good idea. What is your opinion?
-
If the domain has an extremely high authority (80+), I would consider it due to the potential to dominate the SERPs by getting both the www.web version and the state.web version to rank high.
Thanks, the domain's authority is 43, so it is not that high.
_Yes, stick with folders. It's much simpler and much better organization (states correct, as Atlanta is a city)._
Yes, it is true, thank you!
Next, I second everyone else: subfolders are much more organized, subfolders look better, and subdomains are an old SEO strategy (a little spammy, especially for a new domain)
It is an old domain (registered in 1994). What do you guys mean by better structured? Sorry if it is a simple question, I just want to be sure it is what I think.
Also, if a website is on a subdomain, does the main domain still pass some authority to the subdomain, or not much?
-
I would stick with folders for two main reasons:
- First, yourdomain.com is part of your online brand. If every state's URL is statename.yourdomain.com, I think this takes away from your brand, because the first thing people see in the URL bar or in search results is a state name, not your domain
- Next, I second everyone else: subfolders are much more organized, subfolders look better, and subdomains are an old SEO strategy (a little spammy, especially for a new domain)
-
Yes, stick with folders. It's much simpler and much better organization (states correct, as Atlanta is a city).
-
Subfolders are better than subdomains; it's all under one site instead of 59 (according to Obama) individual subdomains.
That was an old-school strategy from when Google really gave a boost to having keywords in the domain name, and spammers took advantage of it to rank better by creating subdomains for each keyword phrase.
-
I would advise against it.
I would stick them all in subfolders of the www version of the site, e.g. www.web.com/texas
If the domain has an extremely high authority (80+), I would consider it due to the potential to dominate the SERPs by getting both the www.web version and the state.web version to rank high.
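If the site ever needs to consolidate existing state subdomains into the subfolder scheme this answer recommends, each old URL should 301 to its new location so any accumulated equity follows. A minimal sketch of that URL mapping in Python (web.com is the placeholder domain used above; the assumption that the first hostname segment is the state name is mine):

```python
from urllib.parse import urlparse

# Hypothetical canonical host, matching the www.web.com/texas pattern above.
MAIN_HOST = "www.web.com"

def subdomain_to_folder(url):
    """Map a state-subdomain URL (texas.web.com/ads) to the equivalent
    subfolder URL (www.web.com/texas/ads), suitable as a 301 target."""
    parts = urlparse(url)
    state = parts.hostname.split(".")[0]   # first label = state name (assumed)
    path = parts.path.rstrip("/")          # drop a trailing slash, keep the rest
    return f"https://{MAIN_HOST}/{state}{path}"

print(subdomain_to_folder("https://texas.web.com/ads"))
```

The actual redirects would then be served by the web server or CMS; this only computes the targets.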
Related Questions
-
How to take an international URL out of the Google US index / hreflang help
Hi Moz Community, Weird/confusing question, so I'll try my best. The company I work for also has an Australian retail website. When you do a site:ourbrand.com search, the second result that pops up is au.brand.com, which redirects to the actual brand.com.au website. The Australian site owner removed this redirect per my boss's request, and now it leads to an unavailable webpage. I'm confused as to the best approach: is there a way to noindex the au.brand.com URL from US-based searches? My only problem is that the au.brand.com URL is ranking higher than all of the actual US-based sub-cat pages when using a site search. Is this an appropriate place for an hreflang tag? Let me know how I can help clarify the issue. Thanks,
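hreflang can indeed fit this situation: it tells Google which regional URL to show for which audience, rather than removing the AU page from the index. A hedged sketch of generating reciprocal hreflang tags in Python (the URLs are assumptions modeled on the question's placeholders, and choosing the US page as x-default is my assumption):

```python
# Hypothetical US/AU alternates for one category page; both pages must
# carry the same set of tags (reciprocity), and each should self-reference.
ALTERNATES = {
    "en-us": "https://www.ourbrand.com/category/",
    "en-au": "https://www.ourbrand.com.au/category/",
}

def hreflang_tags(alternates, default="en-us"):
    """Build the <link rel="alternate"> tags to place in each page's <head>."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{href}" />'
        for lang, href in sorted(alternates.items())
    ]
    # x-default tells search engines which version to serve for other locales.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{alternates[default]}" />'
    )
    return "\n".join(tags)

print(hreflang_tags(ALTERNATES))
```

Note this steers ranking per region; it is not a noindex, so the AU URL may still appear in a site: search.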
-Reed
Intermediate & Advanced SEO | IceIcebaby
Keywords going to subdomain instead of targeted page (general landing page)
Why are some of my keywords going to subdomains instead of the more general/targeted landing page? For example, on my ecommerce website, the keyword 'tempurpedic' is directing to the subdomain URL of a specific Tempurpedic product page instead of the general landing page. The product has a page authority of 15, and the Tempurpedic landing page with all the products has an authority of 31. I have also noticed that my 'furniture stores in houston' keyword directs to my 'occasional tables' URL instead of the much more targeted homepage. Is there something I am missing here?
Intermediate & Advanced SEO | nat88han
Moving Part of a Website to a Subdomain to Remove Panda Penalty?
I have lots of news on my website, and unlike other types of content, news posts quickly become obsolete and get a high bounce rate. I have reasons to think that the news on my website might be partly responsible for a Panda penalty, but I'm not sure. There are over 400 news posts on the blog from the last 4 years, so that's still a lot of content. I was thinking of isolating the news articles on a subdomain (news.mywebsite.com). If the news plays a part in the Panda penalty, would that remove it from the main domain?
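If the news posts do move to news.mywebsite.com, each of the 400+ old URLs should 301 to its new home so existing links and rankings aren't simply discarded. A small sketch of generating redirect-map entries (the /blog/news/ path layout and the slugs are hypothetical; the real site's structure may differ):

```python
# Hypothetical new home for the news content, per the question.
NEW_ROOT = "https://news.mywebsite.com"

def redirect_line(slug):
    """One entry of a simple 'old-path new-url status' redirect map,
    assuming old posts lived under /blog/news/<slug>."""
    return f"/blog/news/{slug} {NEW_ROOT}/{slug} 301"

# Example slugs; in practice these would come from the blog's post list.
for slug in ["2014-industry-update", "old-product-launch"]:
    print(redirect_line(slug))
```

The resulting lines could feed an Apache RewriteMap, an nginx map, or a CMS redirect plugin, whichever the site uses.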
Intermediate & Advanced SEO | sbrault74
Subdomain Metrics Links??
I have been analysing my company's website against our competitors, and we beat them hands down on everything apart from total links in the subdomain metrics. Our competitor jumped above us a couple of months ago to grab the number one spot for our industry's most valuable keyword. They have had a new website designed, and after looking at the source code and running it through SEOmoz in comparison to our site, I can't see how they have managed to do it. We beat them hands down on all factors apart from subdomain metrics > total links, where they have twice as many. When it comes to page-specific metrics and root domain metrics, we easily beat them on all factors. Does anyone have any ideas what I need to do to improve the subdomain metrics? Thanks
Intermediate & Advanced SEO | Detectamet
Large volume of ning files in subdomain - hurting or helping?
I have a client that has 600 pages in their root domain, and a subdomain that contains 7,500 pages of un-SEO-able Ning pages, plus another 650 pages from Sched.com that are also contributing to a large volume of errors. My question is: should I create a new domain for the Ning content, or am I better off with the volume of pages, even if they have loads of errors? Thanks!
Intermediate & Advanced SEO | robertdonnell
Website monitoring online censorship in China - what's holding us back?
We run https://greatfire.org, a non-profit website which lets you test whether a website or keyword is blocked or otherwise censored in China. There are a number of websites that nominally offer this service, and many of them rank better than us in Google. We believe this is unfortunate, since their testing methods are inaccurate and/or not transparent. More about that further down.* We started GreatFire in February 2011 as a reaction to ever more pervasive online censorship in China (where we are based). Due to the controversy of the project and the political situation here, we've had to remain anonymous. Still, we've been able to reach out to other websites and to users. We currently have around 3,000 visits per month, of which about 1,000 are from organic search. However, SEO has been a headache for us from the start. There are many challenges in running this project, and our team is small (and not making any money from this). Those users that do find us on relevant keywords seem to be happy, since they spend a long time on the website. Examples: "websites blocked in china": 6 minutes+; "great firewall of china test": 8 minutes+.
So, here are some SEO questions related to GreatFire.org. If you can give us advice, it would be greatly appreciated, and you would truly help us in our mission to bring transparency and spread awareness of online censorship in China:
1. Each URL tested in our database has its own page. Our database contains 25,000 URLs (and growing). We have previously been advised that one SEO problem is that we appear to have a lot of duplicate data, since the individual URL pages are very similar. Because of this, we've added automatic tags to most pages. We then exclude certain high-priority pages from this rule, such as domains ranked highly by Alexa and keywords that are blocked. Is this a good approach? Do you think the duplicate content factor is still holding us back? Can we improve?
2. Some of our pages have meta descriptions, but most don't. Should we add them on URL pages? They would be set to a certain pattern, which again might make them look very similar and could cause the duplicate content warning to go off. Suggestions?
3. Many of the users that find us in Google search for keywords that aren't relevant to what we offer, such as "https.facebook.com" and lots of variations of that. Obviously, they leave the website quickly. This means that the average time people coming from Google spend on our website is quite low (2 minutes) and the bounce rate quite high (68%). Can we or should we do something to discourage being found on non-relevant keywords?
4. Are there any other technical problems you can see that are holding our SEO back?
Thank you very much!
*Competitors ranking higher when searching for "test great firewall china": 1. http://www.greatfirewallofchina.org. They are only a frontend website for this service: http://www.viewdns.info/chinesefirewall. ViewDNS only checks for DNS records, which is one of three major methods of blocking websites, so many websites and keywords that are not DNS-poisoned but are blocked by IP or by keyword will be reported as available when in fact they are blocked. Our system uses actual test locations inside China to try to download the URL being tested and checks for the different types of censorship. 2. http://www.websitepulse.com/help/testtools.china-test.html. This is a better service in that they seem to do actual testing from inside China. However, they only display partial results, do not explain test results, and do not offer historic data on whether the URL was blocked in the past. We do all of that.
Intermediate & Advanced SEO | GreatFire.org
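On the duplicate-content point in this question, one common mitigation (a general technique, not necessarily what GreatFire should adopt) is to make the templated meta descriptions data-driven, so each of the 25,000 URL pages carries unique, meaningful text rather than one repeated pattern. A sketch, with hypothetical field names:

```python
# Sketch: build a per-URL meta description from actual test data, so
# near-identical pages differ in substance. The record fields
# (url, blocked, last_checked) are assumptions, not GreatFire's schema.
def meta_description(record):
    status = "blocked" if record["blocked"] else "not blocked"
    return (
        f'<meta name="description" content="{record["url"]} is currently '
        f'{status} in China (last tested {record["last_checked"]})." />'
    )

print(meta_description(
    {"url": "facebook.com", "blocked": True, "last_checked": "2013-05-01"}
))
```

Because the status and date change per URL, two pages only duplicate each other when their underlying data genuinely matches.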
How to get subdomains to rank well?
Hi All, I am setting up a new site, and I want to make use of subdomains to target multiple countries as follows: uk.mydomain.com, us.mydomain.com, australia.mydomain.com, etc. Now, I know what you're all going to say: why not use folders, as they are more effective? Well, I did think of this but decided against it, because I would like to make the best of a low-competition industry. I want to push my competitors as far down in the SEs as possible, and I plan to do this by targeting generic, non-locational search terms with both sites so I can hog the top 4 spots, as follows: www.mydomain.com, www.mydomain.com/keyterm, uk.mydomain.com, uk.mydomain.com/keyterm-in-the-UK. What steps can I take to ensure rank passes to my subdomains? Is it better to start the site with folders like www.mydomain.com/us/keyterm and then 301 them to subdomains at a later stage, or should I start with the subdomains?
Intermediate & Advanced SEO | Mulith
Subdomains vs. Subfolders for unique categories & topics
Hello, We are in the process of redesigning and migrating 5 previously separate websites (all different niche topics, including dining, entertainment, retail, real estate, etc.) under one umbrella site for the property in which they exist. From the property homepage, you will now be able to access all of the individual category sites within. As each niche microsite will be focused on a different topic, I am wondering whether it is best for SEO that we use subdomains such as category.mainsite.com or subfolders mainsite.com/category. I have seen it done both ways on large corporate sites (e.g. Ikea uses subdomains for different country sites, and Apple uses subfolders), so I am wondering what makes the most sense for this particular umbrella site. Any help is greatly appreciated. Thanks, Melissa
Intermediate & Advanced SEO | grapevinemktg