Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Using subdomains for related landing pages?
-
Seeking advice on subdomain usage and related SEO... I'd like to use multiple subdomains for multiple landing pages, all with content related to the main root domain. Why? Cost: I only have to register one domain. One root domain for better 'branding'. And multiple subdomains that each focus on one specific reason, and the set of specific keywords people would search when looking for a solution to that reason, before hiring us (or our competition).
-
Thanks very much Jane! I think subdirectories are how I'll go.
Effective organic SEO is huge for my initial online success. We market only with direct mail so far, but mailing lists don't address human situations, i.e. people who've inherited a property along with its second mortgage payment, who are stressed because they can't afford that second payment, and whose realtor hasn't sold the inherited property. One last question for all:
With effective landing page SEO and SERP visibility being my primary goal, is the URL-structure term "siloing" familiar to anyone, and is it applicable or adaptable to my multiple landing pages? (I found the term and explanation here: http://www.bruceclay.com/seo/silo.htm) Or is some other method more advisable for "pooling" my subdirectories for better SEO in the SERPs? Peter
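For anyone unfamiliar with the term, siloing just means grouping each theme's pages under a shared subdirectory and keeping internal links within that theme. A minimal sketch of what such a structure could look like for pages like these (all paths below are hypothetical):

```python
# Hypothetical silo layout: each motivation gets its own themed
# subdirectory, and pages cross-link only within their silo.
SILOS = {
    "/sell-inherited-house/": [
        "/sell-inherited-house/second-mortgage/",
        "/sell-inherited-house/out-of-state-owner/",
    ],
    "/sell-house-divorce/": [
        "/sell-house-divorce/splitting-equity/",
    ],
    "/sell-house-job-loss/": [
        "/sell-house-job-loss/behind-on-payments/",
    ],
}


def internal_links_for(page: str) -> list[str]:
    """Return the in-silo pages a given page should link to."""
    for root, pages in SILOS.items():
        if page == root or page in pages:
            return [root] + [p for p in pages if p != page]
    return []


print(internal_links_for("/sell-inherited-house/second-mortgage/"))
```

The point is simply that internal links stay within a silo, so each theme builds topical relevance as a unit rather than as a scattered set of pages.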
-
Hi Peter,
In some ways, subdirectories seem even more sensible when you're dealing with single landing pages, as they'll work together somewhat to look like a fuller site from Google's perspective, rather than just a collection of subdomains that happen to exist on the same domain.
-
Hello again. After looking at your feedback, taking a fresh look at our marketing needs and budget, and then reviewing each of our competitors' sites (keyword semi-stuffing, empty tags, horrible SEO structure, very light traffic and way too much information), we're now thinking we don't need a main site at all. We'd just have multiple landing pages, each tightly focused on a single financial or situational motivation that causes a property owner to want to sell quickly, and explain how we are an alternative to a realtor. Does using subdirectories still seem best when each landing page is a single page? And does anyone have a few informative links on setting up and using subdirectories? Thx, Peter
-
Hi Peter,
I understand that the platform only allows for subdomains. From a purely SEO point of view, subfolders or pages are preferable to subdomains because authority does not appear to pass between a parent domain and its subdomains in the same way as it does between subfolders and the parent domain. If your landing page sites are only one-pagers, they may be seen as quite thin as well.
However, there is no reason why you can't build quality content like this - it just may take more link building to establish the authority for the subdomains than it would for pages on the same site. You will need to ensure that as much unique content as possible is placed on the landing pages to increase their 'worth' in Google's eyes, given that they are separate from each other on subdomains.
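One practical way to act on that is to spot-check the landing pages for duplicated titles and meta descriptions from time to time. A minimal sketch, assuming the pages are live over HTTP (the subdomain URLs below are just placeholders):

```python
# Minimal sketch: fetch each landing page and flag duplicate titles or
# meta descriptions, since thin or duplicated copy is the main risk with
# separate subdomain landing pages. URLs are placeholders.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

LANDING_PAGES = [
    "https://divorce.example.com/",
    "https://inheritance.example.com/",
    "https://job-loss.example.com/",
]


def page_signature(url: str):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "") if meta else ""
    return title, description


seen = defaultdict(list)
for url in LANDING_PAGES:
    seen[page_signature(url)].append(url)

for (title, description), urls in seen.items():
    if len(urls) > 1:
        print("Duplicate title/description across:", urls)
```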
-
Thanks for both responses. Alan- These landing pages would be single page sites. Thompson Paul- The reason I thought sub-domains IS TO SAVE $ with Lander ($ per # domains) and the cost of registering many domains.
Here's the specifics of my search.. The targeted property owner mailing lists are based on data: mortgage, taxes & assessors. They give NO CLUES as to human condition that we look for when our mailers get responses.We have a list of motivations (or reasons for distress to sell their house) are financial or circumstantial: divorce, inheritance, job loss, job transfer, can't sell house, bankrupt, tenant trashed apartment, etc. These motivations are not apparent, obviously, on a mailing list. We want to learn the best way to specifically find people, who own their property in CT, who aren't searching to sell - but are looking for solution to divorce or whatever NOT realizing a cash buyer (us) is a real & UPRIGHT solution. ** We have a list of motivations that we want to define into what phrases people ask in Google to find answers; then what keywords get found for those queries.. and limit it the best we can to CT.** Thanks, Peter
PS:Like Squarespace is drag and drop creation for websites plus hosting, ecommerce & stats; so is www.landerapp.com to landing pages -- they offer customize-able templates that are SEO optimize-able, have great stats & offer drag & drop opt-in forms to integrate into my email service. Comments/advice?
-
Fully agree with Alan - subdomains would be a major waste of effort and SEO value.
Are you thinking you want subdomains perhaps so you can track them differently? There are many ways to do the necessary tracking with pages in subdirectories of the main site, so it's not necessary to use subdomains for this reason either.
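For example, if the landing pages live under their own subdirectories, even a simple path-prefix breakdown of a standard access log separates them for reporting purposes. Here's a rough sketch - the paths and log location are just placeholders:

```python
# Minimal sketch: count hits per landing-page subdirectory from a
# combined-format access log, so each page can be reported on
# separately without needing its own subdomain.
import re
from collections import Counter

PREFIXES = ("/divorce/", "/inheritance/", "/job-loss/")
request_line = re.compile(r'"(?:GET|POST) (\S+) HTTP/')

counts = Counter()
with open("access.log", encoding="utf-8") as log:
    for line in log:
        match = request_line.search(line)
        if not match:
            continue
        path = match.group(1)
        for prefix in PREFIXES:
            if path.startswith(prefix):
                counts[prefix] += 1

for prefix, hits in counts.most_common():
    print(f"{prefix}: {hits} hits")
```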
Unless there's something missing in what you need, integrating the landing pages into the main site is the vastly superior solution here.
Can you give us an idea what it is about subdomains that you feel you need?
Paul
-
Those subdomains, each being a single-page site, may look spammy to Google. You can put those pages on your own site instead; there is nothing to gain by using subdomains.
Related Questions
-
Should I use the noindex tag on classified listing pages that have expired?
We have gone back and forth on this and wanted to get some outside input. I work for an online listing website that has classified ads on it. These ads are generated by companies on our site advertising weekend events around the country. We have about 10,000 companies that use our service to generate their online ads. This means that we have thousands of pages being created each week. The ads have lots of content: pictures, sale descriptions, and company information. After the ads have expired, and the sale is no longer happening, we are currently placing the noindex tag in the heads of each page. The content is not relevant anymore since the ad has ended. The only value the content offers a searcher is the images (there are millions on expired ads) and the descriptions of the items for sale. We are currently the leader in our industry and control most of the top spots on Google for our keywords. We have been worried about cluttering up the search results with pages of ads that are expired. In our Moz account right now we have over 28k crawler warnings alerting us to the noindex tags in the page heads of the expired ads. Seeing those warnings has made us nervous and second-guessing what we are doing. Does anybody have any thoughts on this? Should we continue placing the noindex tag in the heads of the expired ads, or should we be allowing search engines to index the old pages? I have seen websites with discontinued products keeping the products around so that individuals can look up past information. This is the closest thing I have seen to our situation. Any help or insight would be greatly appreciated! -Matt
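For reference, the tag being discussed above is the robots noindex, which can also be sent as an X-Robots-Tag response header. A minimal sketch of serving it only once an ad has expired, using a hypothetical Flask route rather than the poster's actual stack:

```python
# Minimal sketch (hypothetical Flask app): expired ads stay live for
# visitors but send "noindex, follow" so they drop out of the index
# while their internal links still pass value. Data is sample-only.
from datetime import date

from flask import Flask, make_response, render_template_string

app = Flask(__name__)
ADS = {"1234": {"title": "Weekend estate sale", "ends": date(2016, 5, 1)}}


@app.route("/ads/<ad_id>")
def show_ad(ad_id):
    ad = ADS[ad_id]
    html = render_template_string("<h1>{{ ad['title'] }}</h1>", ad=ad)
    response = make_response(html)
    if ad["ends"] < date.today():
        # Equivalent to a <meta name="robots" content="noindex, follow"> tag.
        response.headers["X-Robots-Tag"] = "noindex, follow"
    return response
```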
Intermediate & Advanced SEO | mellison0 -
Should I use noindex or robots to remove pages from the Google index?
I have a Magento site and just realized we have about 800 review pages indexed. The /review directory is disallowed in robots.txt, but the pages are still indexed. From my understanding, robots.txt means Google will not crawl the pages, but they can still be indexed if they are linked from somewhere else. I can add the noindex tag to the review pages, but then they won't be crawled, so the tag won't be seen. https://www.seroundtable.com/google-do-not-use-noindex-in-robots-txt-20873.html Should I remove the robots.txt disallow and add the noindex, or just add the noindex to what I already have?
Intermediate & Advanced SEO | Tylerj0 -
Using hreflang for international pages - is this how you do it?
My client is trying to achieve a global presence in select countries, and then track traffic from their international pages in Google Analytics. The content for the international pages is pretty much the same as for the USA pages, but the form and a few other details are different due to how product licensing has to be set up. I don't want to risk losing ranking for existing USA pages due to issues like duplicate content, etc. What is the best way to approach this? This is my first foray into this and I've been scanning the Moz topics, but a number of the conversations are going over my head, so suggestions will need to be pretty simple 🙂 Is it a case of adding hreflang code to each page and creating different URLs for tracking? For example:
URL for USA: https://company.com/en-US/products/product-name/
URL for Canada: https://company.com/en-ca/products/product-name/
URL for German Language Content: https://company.com/de/products/product-name/
URL for rest of the world: https://company.com/en/products/product-name/
Intermediate & Advanced SEO | Caro-O -
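For illustration, the hreflang markup for the URLs listed in the question above is a set of reciprocal alternate links; a small sketch that generates them (treating the "rest of the world" URL as x-default is an assumption):

```python
# Minimal sketch: every language/region version of the product page lists
# all alternates, including itself, plus an x-default fallback.
ALTERNATES = {
    "en-us": "https://company.com/en-US/products/product-name/",
    "en-ca": "https://company.com/en-ca/products/product-name/",
    "de": "https://company.com/de/products/product-name/",
    "x-default": "https://company.com/en/products/product-name/",
}


def hreflang_tags() -> str:
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in ALTERNATES.items()
    )


print(hreflang_tags())
```

Each version of the page carries the full set, pointing at itself and at all of its alternates, which is what makes the annotations reciprocal.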
Does having a different subdomain for your Landing Page and Blog affect your overall SEO benefits and Ranking?
We have a domain, www.spintadigital.com, that is hosted with DreamHost, and we also have a separate subdomain, blog.spintadigital.com, which is hosted on the Ghost platform; we are also using Unbounce landing pages on the subdomain get.spintadigital.com. I wanted to know whether having subdomains like this would affect the traffic metrics and, in effect, the SEO and rankings of our site. I think it does not affect the growth of domain authority, but in places like SimilarWeb I get different traffic metrics for the different domains. As far as I can see, in many of the metrics these are considered separate websites. We are currently concentrating more on our blog and wanted to make sure that it does help the overall domain. We do not have the bandwidth to promote three different websites, and hence need the community's help to understand the best option to take this forward.
Intermediate & Advanced SEO | vinodh-spintadigital0 -
Replace dynamic parameter URLs with static landing page URLs - faceted navigation
Hi there, got a quick question regarding faceted navigation. If a specific filter (facet) seems to be quite popular with visitors, does it make sense to replace a dynamic URL, e.g. http://www.domain.com/pants.html?a_type=239, with a static, more SEO-friendly URL, e.g. http://www.domain.com/pants/levis-pants.html, by creating a proper landing page for it? I know that it is nearly impossible to replace all variations of these parameter URLs with static ones, but does it generally make sense to do this for the most popular facets chosen by visitors? Or does this cause any issues? Any help is much appreciated. Thanks a lot in advance
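As a rough sketch of the idea being asked about (hypothetical Flask routes, not a recommendation for any particular platform), the popular facet gets its own static landing page and the old parameter URL permanently redirects to it:

```python
# Minimal sketch: map a popular facet parameter to a static landing page
# and 301-redirect the old parameterised URL to it.
from flask import Flask, redirect, render_template_string, request

app = Flask(__name__)
STATIC_FACETS = {("a_type", "239"): "/pants/levis-pants.html"}


@app.route("/pants.html")
def pants_listing():
    for (param, value), target in STATIC_FACETS.items():
        if request.args.get(param) == value:
            return redirect(target, code=301)
    return render_template_string("<h1>All pants</h1>")


@app.route("/pants/levis-pants.html")
def levis_pants():
    return render_template_string("<h1>Levi's pants</h1>")
```

An alternative is to keep the parameter URL live but point its rel=canonical at the static page; either way, only the handful of genuinely popular facets would get this treatment.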
Intermediate & Advanced SEO | ennovators0 -
Date of page first indexed or age of a page?
Hi, does anyone know any ways or tools to find when a page was first indexed/cached by Google? I remember a while back, around 2009, I had a Firefox plugin which could check this and gave you an exact date; maybe this has changed since, and I don't remember the plugin. Any recommendations on finding the age of a page (not a domain) for a website? This is for competitor research, not my own website. Cheers, Paul
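One rough proxy, since Google doesn't expose a first-indexed date, is the earliest Wayback Machine capture of the URL. A small sketch against the public CDX API (the result is an archive date, not an indexing date):

```python
# Minimal sketch: the first Wayback Machine capture is only a proxy for
# page age, not the date Google first indexed the URL.
import requests


def first_capture(url: str):
    resp = requests.get(
        "http://web.archive.org/cdx/search/cdx",
        params={"url": url, "output": "json", "limit": "1", "fl": "timestamp"},
        timeout=10,
    )
    rows = resp.json()
    # First row is the header; the second (if present) holds the timestamp.
    return rows[1][0] if len(rows) > 1 else None


print(first_capture("example.com/some-page"))
```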
Intermediate & Advanced SEO | MBASydney0 -
Using Canonical URL to point to an external page
I was wondering if I can use a canonical URL that points to a page residing on an external site? So a page like www.site1.com/whatever.html would have a canonical link in its header to www.site2.com/whatever.html. Thanks.
Intermediate & Advanced SEO | llamb -
Dev Subdomain Pages Indexed - How to Remove
I own a website (domain.com) and used the subdomain "dev.domain.com" while adding a new section to the site (as a development link). I forgot to block dev.domain.com in my robots file, and Google indexed all of the dev pages (around 100 of them). I blocked the site (dev.domain.com) in robots.txt, and then proceeded to just delete the entire subdomain altogether. It's been about a week now and I still see the subdomain pages indexed in Google. How do I get these pages removed from Google? Are they causing duplicate content/title issues, or does Google know that it's a development subdomain and it's just taking time for them to recognize that I deleted it already?
Intermediate & Advanced SEO | WebServiceConsulting.com0
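For what it's worth, a quick way to confirm removal is progressing for a case like the one above is to check that the old dev URLs now return a hard 404 or 410 (and to request removal in Search Console). A minimal sketch with placeholder URLs:

```python
# Minimal sketch: confirm old dev URLs return 404/410 so Google can drop
# them; anything still answering 200 will linger in the index far longer.
import requests

DEV_URLS = [
    "http://dev.domain.com/",
    "http://dev.domain.com/new-section/",
]

for url in DEV_URLS:
    try:
        status = requests.get(url, timeout=10, allow_redirects=False).status_code
    except requests.RequestException as exc:
        status = f"request failed ({exc.__class__.__name__})"
    print(url, "->", status)
```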