Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Using subdomains for related landing pages?
-
Seeking subdomain usage and related SEO advice... I'd like to use multiple subdomains for multiple landing pages, all with content related to the main root domain. Why? Cost: I only have to register one domain. One root domain for better 'branding'. Multiple subdomains, each focused on one specific reason and on the set of keywords people would search when looking for a solution to that reason, leading them to hire us (or our competition).
-
Thanks very much Jane! I think subdirectories are how I'll go.
Effective organic SEO is HUGE for my initial online success. We market only with direct mail so far, but mailing lists don't address human situations, e.g. people who've inherited a property AND with it a 2nd mortgage payment AND they're stressed because they can't afford the 2nd payment AND their realtor hasn't sold the inherited property. One last question for all:
With effective landing page SEO & SERP visibility being my primary goal, is the URL structure term "siloing" familiar to anyone, and is it applicable/adaptable to my multiple landing pages? (I found the term & explanation here: http://www.bruceclay.com/seo/silo.htm) Or is some other method more advisable in order to "pool" my subdirectories for better SEO in the SERPs? Peter
-
Hi Peter,
In some ways, subdirectories seem even more sensible when you're dealing with single landing pages, as they'll work together somewhat to look like a fuller site from Google's perspective, rather than just a collection of subdomains that happen to exist on the same domain.
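To illustrate the siloing idea in subdirectory form, here is a hedged sketch; all of the paths are hypothetical, not taken from the thread:

```text
example.com/                                      root (brand) page
example.com/inherited-property/                   silo landing page
example.com/inherited-property/second-mortgage/   supporting page in the same silo
example.com/divorce-home-sale/                    another silo landing page
example.com/divorce-home-sale/quick-cash-sale/    supporting page
```

Each top-level folder focuses on one motivation and its keywords, and internal links stay mostly within a silo, which is the structure the Bruce Clay article describes.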
-
Hello again. After looking at your feedback, taking a fresh look at our marketing needs & budget, and viewing each of our competitors' sites (keyword 'semi'-stuffing, empty tags, horrible SEO structure, very light traffic & way too much info), we're now thinking that we do not need a main site at all: just multiple landing pages, each tightly focused on a single financial or situational motivation causing a property owner to want to sell quickly, where we'll explain how we are an alternative to a realtor. Does using subdirectories still seem best when we only have single-page landing pages? Does anyone have a few informative links regarding setting up & using subdirectories? Thx, Peter
-
Hi Peter,
I understand that the platform only allows for subdomains. From a purely SEO point of view, subfolders or pages are preferable to subdomains, because authority does not appear to pass between a parent domain and its subdomains in the same way it does between subfolders and the parent domain. If your landing page sites are only one-pagers, they may be seen as quite thin as well.
However, there is no reason why you can't build quality content like this - it just may take more link building to establish the authority for the subdomains than it would for pages on the same site. You will need to ensure that as much unique content as possible is placed on the landing pages to increase their 'worth' in Google's eyes, given that they are separate from each other on subdomains.
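If the builder forces subdomains at first and the pages are later consolidated into subdirectories on the main domain, the move can be sketched with a 301 redirect. This is a minimal sketch assuming Apache with mod_rewrite enabled; the hostname and folder names are illustrative, not from the thread:

```apache
# Minimal sketch: 301-redirect one landing-page subdomain into a
# subdirectory on the root domain. Assumes Apache with mod_rewrite;
# "inherited-property" and example.com are illustrative names.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^inherited-property\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/inherited-property/$1 [R=301,L]
```

One rule pair per subdomain (or generalise the pattern); the 301 passes whatever authority the subdomain pages have earned to their new subdirectory homes.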
-
Thanks for both responses. Alan: these landing pages would be single-page sites. Thompson Paul: the reason I thought of subdomains is to save money with Lander (which charges per domain), plus the cost of registering many domains.
Here are the specifics of my search. The targeted property-owner mailing lists are based on data: mortgage, taxes & assessors. They give NO CLUES as to the human situations we look for when our mailers get responses. We have a list of motivations (or reasons for distress that push someone to sell their house), financial or circumstantial: divorce, inheritance, job loss, job transfer, can't sell house, bankruptcy, tenant trashed apartment, etc. These motivations are not apparent, obviously, on a mailing list. We want to learn the best way to specifically find people who own property in CT, who aren't searching to sell, but are looking for a solution to divorce or whatever, NOT realizing a cash buyer (us) is a real & UPRIGHT solution. ** We have a list of motivations that we want to translate into the phrases people ask in Google to find answers, then the keywords that get found for those queries, and limit it the best we can to CT. ** Thanks, Peter
PS: Just as Squarespace is drag-and-drop creation for websites plus hosting, ecommerce & stats, www.landerapp.com is the same for landing pages -- they offer customizable templates that are SEO-optimizable, have great stats & offer drag-and-drop opt-in forms to integrate with my email service. Comments/advice?
-
Fully agree with Alan - subdomains would be a major waste of effort and SEO value.
Are you thinking you want subdomains perhaps so you can track them differently? There are many ways to do the necessary tracking with pages in subdirectories of the main site, so it's not necessary to use subdomains for this reason either.
Unless there's something missing in what you need, integrating the landing pages into the main site is the vastly superior solution here.
Can you give us an idea what it is about subdomains that you feel you need?
Paul
-
Those subdomains for single-page sites may look spammy to Google. You can put those pages on your own site; there is nothing to gain from using subdomains.
Related Questions
-
Is it ok to repeat a (focus) keyword used on a previous page, on a new page?
I am cataloguing the pages on our website in terms of which focus keyword has been used on each page. I've noticed that some pages repeated the same keyword/term. I've heard that it's not really good practice, as it's like telling Google conflicting information: the pages with the same keywords will be competing against each other. Is this correct? If so, is the alternative to use various longer, more specific keywords instead? If not, meaning it's OK to repeat the keyword on different pages, is there a maximum recommended number of times that we'd want to repeat it? Still new-ish to SEO, so any help is much appreciated! V.
Intermediate & Advanced SEO | Vitzz1
-
Multiple pages optimised for the same keywords, but the pages are functionally and visually different
Hi Moz community! We're wondering what the implications for organic ranking would be of having two pages with quite different functionality optimised for the same keywords. For example, one of the pages in question is https://www.whichledlight.com/categories/led-spotlights and the other is https://www.whichledlight.com/t/led-spotlights. Both of these pages are basically geared towards the keyword "led spotlights": the first essentially shows the options for LED spotlights and the different kinds of fittings available, and the second is a product search/results page for all products that are spotlights. We're wondering what the implications of this could be, as we are currently looking to improve the site's ranking, particularly for this keyword. Is this even safe to do? Especially since we're at the bottom of the hill of climbing the ranking ladder for this keyword. Give us a shout if you want any more detail 🙂
Intermediate & Advanced SEO | TrueluxGroup
-
Replace dynamic parameter URLs with static landing page URLs - faceted navigation
Hi there, I've got a quick question regarding faceted navigation. If a specific filter (facet) seems to be quite popular with visitors, does it make sense to replace a dynamic URL, e.g. http://www.domain.com/pants.html?a_type=239, with a static, more SEO-friendly URL, e.g. http://www.domain.com/pants/levis-pants.html, by creating a proper landing page for it? I know that it is nearly impossible to replace all variations of these parameter URLs with static ones, but does it generally make sense to do this for the most popular facets chosen by visitors? Or does this cause any issues? Any help is much appreciated. Thanks a lot in advance.
Intermediate & Advanced SEO | ennovators
-
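A hedged sketch of one way to do the swap this question describes, assuming Apache with mod_rewrite and reusing the example URLs from the question; the THE_REQUEST condition is there to keep the redirect and the internal rewrite from looping:

```apache
RewriteEngine On
# 301 the old parameter URL to the friendly one, but only when the
# client itself requested the parameter form (checking THE_REQUEST
# rather than QUERY_STRING prevents a loop with the rule below).
RewriteCond %{THE_REQUEST} \s/pants\.html\?a_type=239[\s&]
RewriteRule ^pants\.html$ /pants/levis-pants.html? [R=301,L]
# Internally serve the friendly URL from the existing dynamic page:
RewriteRule ^pants/levis-pants\.html$ /pants.html?a_type=239 [L]
```

The trailing `?` on the redirect target strips the old query string; each popular facet would need its own pair of rules (or a RewriteMap).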
On one of our sites we have the company name in the H1, and on the other site we have the page title in the H1 - does anyone have any advice about the best information to have in the H1, H2 and page title?
We have two sites that have been set up slightly differently. On one site we have the company name in the H1 and the product name in the page title and H2. On the other site we have the product name in the H1 and no H2. Does anyone have any advice about the best information to have in the H1 and H2?
Intermediate & Advanced SEO | CostumeD
-
Is it a problem to use a 301 redirect to a 404 error page, instead of serving a 404 page directly?
We are building URLs dynamically with Apache rewrite. When we detect that a URL matches certain valid patterns, we serve a script, which may then detect that the combination of parameters in the URL does not exist. If this happens, we produce a 301 redirect to another URL which serves a 404 error page. So my doubt is the following: do I have to worry about not serving a 404 directly, but redirecting (301) to a 404 page instead? Will this lead to the erroneous original URL staying in the Google index longer than if I served a 404 directly? Some context: it is a site with about 200,000 web pages, and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).
Intermediate & Advanced SEO | lcourse
-
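For reference, a 404 can be returned at the original URL without any redirect. This is a minimal sketch assuming Apache 2.2.16+ with mod_rewrite; the path pattern is purely illustrative. If the existence check can only happen inside the script, the script can likewise send the 404 status itself instead of issuing the 301:

```apache
# Serve a custom error page with a real 404 status at the requested URL.
ErrorDocument 404 /error-404.html
RewriteEngine On
# Illustrative pattern only: when a valid-looking product URL has no
# matching file, answer 404 directly instead of 301-redirecting away.
RewriteCond %{REQUEST_URI} ^/products/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ - [R=404,L]
```

Keeping the 404 on the original URL lets search engines drop the bad URL directly, rather than first following a 301 to a second URL.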
Dev Subdomain Pages Indexed - How to Remove
I own a website (domain.com) and used the subdomain dev.domain.com while adding a new section to the site (as a development environment). I forgot to block dev.domain.com in my robots.txt file, and Google indexed all of the dev pages (around 100 of them). I blocked the site (dev.domain.com) in robots.txt, and then proceeded to delete the entire subdomain altogether. It's been about a week now and I still see the subdomain pages indexed on Google. How do I get these pages removed from Google? Are they causing duplicate content/title issues, or does Google know that it's a development subdomain and it's just taking time for it to recognize that I've already deleted it?
Intermediate & Advanced SEO | WebServiceConsulting.com
-
Can too many "noindex" pages compared to "index" pages be a problem?
Hello, I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have no-indexed most of the product pages hoping for a sort of recovery (not yet seen, though!). So, currently we have about 4,000 "index" pages compared to about 80,000 "noindex" pages. Now, we plan to add an additional 100,000 new product pages from a new publisher to offer our customers more music choice, and these new pages will still be marked "noindex, follow". At the end of the integration process, we will end up with something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages. Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario have or cause any negative effect on our current organic search profile, or is this something that doesn't actually matter? Any thoughts on this issue are very welcome. Thank you! Fabrizio
Intermediate & Advanced SEO | fablau
-
Blocking pages via robots.txt: can images on those pages be included in image search?
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo, but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results, something like this: User-agent: googlebot Disallow: /community/photos/ Can I disallow Googlebot specifically, rather than using User-agent: *, which would then allow Googlebot-Image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages while still getting the images picked up... is this possible? Thanks! Leona
Intermediate & Advanced SEO | HD_Leona
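A sketch of the robots.txt this question describes. Google matches each crawler to the most specific user-agent group it finds, so giving Googlebot-Image its own group means it does not inherit the Googlebot disallow (the Allow line is included for clarity):

```text
User-agent: Googlebot
Disallow: /community/photos/

User-agent: Googlebot-Image
Allow: /community/photos/
```

Note that robots.txt blocking only stops crawling, not indexing: blocked page URLs can still appear in results without a snippet, and a page blocked for Googlebot may lose some of the context Google uses to rank its images.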