Using subdomains for related landing pages?
-
Seeking advice on subdomain usage and related SEO... I'd like to use multiple subdomains for multiple landing pages, all with content related to the main root domain. Why? Cost: I only have to register one domain. One root domain for better 'branding'. And multiple subdomains, each focused on one specific reason, and the set of specific keywords people would search when looking for a solution to that reason, to hire us (or our competition).
-
Thanks very much, Jane! I think subdirectories are how I'll go.
Effective organic SEO is HUGE for my initial online success. We market only with direct mail so far, but mailing lists don't address human situations, e.g. people who've inherited a property AND with it a second mortgage payment AND they're stressed because they can't afford the second payment AND their realtor hasn't sold the inherited property. One last question for all:
With effective landing page SEO and SERP visibility being my primary goal, is the URL structure term "siloing" familiar to anyone, and is it applicable/adaptable to my multiple landing pages? (I found the term and explanation here: http://www.bruceclay.com/seo/silo.htm) Or is some other method more advisable in order to "pool" my subdirectories for better SEO in the SERPs? Peter
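As a side note on the siloing question: below is a minimal sketch of how themed, keyword-focused landing pages could be grouped into subdirectories on one root domain. The domain and slugs are hypothetical placeholders, not anything recommended in the thread; the core of the silo approach is that pages within one theme link mostly to each other under a shared directory.

```python
# Hypothetical silo layout: each theme (silo) gets one subdirectory on the
# root domain, and every landing page lives under its theme's directory.
ROOT = "https://www.example.com"  # placeholder domain

SILOS = {
    "sell-house-fast": ["divorce", "inheritance", "job-loss", "job-transfer"],
    "stop-foreclosure": ["behind-on-payments", "second-mortgage"],
}

def silo_urls(root: str, silos: dict[str, list[str]]) -> list[str]:
    """Expand the silo map into full landing-page URLs."""
    return [f"{root}/{theme}/{page}/" for theme, pages in silos.items() for page in pages]

for url in silo_urls(ROOT, SILOS):
    print(url)  # e.g. https://www.example.com/sell-house-fast/divorce/
```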
-
Hi Peter,
In some ways, subdirectories seem even more sensible when you're dealing with single landing pages, as they'll work together to look like a fuller site from Google's perspective, rather than just a collection of subdomains that happen to exist on the same domain.
-
Hello again. After looking at your feedback, then taking a fresh look at our marketing needs and budget, and after viewing each of our competitors' sites (keyword semi-stuffing, empty tags, horrible SEO structure, very light traffic, and way too much info), we're now thinking that we do not need a main site at all: JUST multiple landing pages, each very focused on a single financial or situational motivation that causes a property owner to want to sell quickly, where we'll explain how we are an alternative to a realtor. Does using subdirectories still seem best when we'll only have single-page landing pages? Does anyone have a few informative links on setting up and using subdirectories? Thx, Peter
-
Hi Peter,
I understand that the platform only allows for subdomains. From a purely SEO point of view, subfolders or pages are preferable to subdomains, because authority does not appear to pass between a parent domain and its subdomains in the same way it does between a parent domain and its subfolders. If your landing page sites are only one-pagers, they may be seen as quite thin as well.
However, there is no reason why you can't build quality content like this; it just may take more link building to establish authority for the subdomains than it would for pages on the same site. You will need to ensure that as much unique content as possible is placed on the landing pages to increase their 'worth' in Google's eyes, given that they are separate from each other on subdomains.
-
Thanks for both responses. Alan: these landing pages would be single-page sites. Thompson Paul: the reason I thought sub-domains IS TO SAVE $ with Lander (which charges per number of domains) and on the cost of registering many domains.
Here are the specifics of my search. The targeted property-owner mailing lists are based on data: mortgage, taxes, and assessors. They give NO CLUES as to the human circumstances we look for when our mailers get responses. We have a list of motivations (or reasons for distress that push someone to sell their house) that are financial or circumstantial: divorce, inheritance, job loss, job transfer, can't sell the house, bankruptcy, tenant trashed the apartment, etc. These motivations are not apparent, obviously, on a mailing list. We want to learn the best way to specifically find people who own property in CT, who aren't searching to sell, but who are looking for a solution to divorce (or whatever) without realizing that a cash buyer (us) is a real and UPRIGHT solution. ** We have a list of motivations that we want to translate into the phrases people ask Google to find answers, then into the keywords that get found for those queries, and limit it the best we can to CT. ** Thanks, Peter
PS: Just as Squarespace is drag-and-drop creation for websites plus hosting, ecommerce, and stats, www.landerapp.com is the same for landing pages: they offer customizable templates that can be SEO-optimized, have great stats, and offer drag-and-drop opt-in forms that integrate with my email service. Comments/advice?
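As an aside on the keyword step Peter describes (turning the motivation list into phrases people actually type into Google, limited to CT), here is a rough sketch of generating seed query combinations. The phrase lists are hypothetical and would need to be pruned against real keyword-tool volume data.

```python
from itertools import product

# Hypothetical seed lists for illustration only; the real lists would come
# from the motivation research and a keyword tool's volume data.
motivations = ["divorce", "inheritance", "job loss", "job transfer", "house won't sell"]
templates = ["sell house fast after {m}", "sell house during {m}"]
geos = ["CT", "Connecticut"]

# Combine every motivation with every phrasing and geo-modifier.
queries = sorted(
    f"{template.format(m=m)} {geo}"
    for m, template, geo in product(motivations, templates, geos)
)

for q in queries:
    print(q)  # e.g. "sell house fast after divorce CT"
```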
-
Fully agree with Alan - subdomains would be a major waste of effort and SEO value.
Are you thinking you want subdomains perhaps so you can track them differently? There are many ways to do the necessary tracking with pages in subdirectories of the main site, so it's not necessary to use subdomains for this reason either.
Unless there's something missing in what you need here, integrating the landing pages into the main site is the vastly superior solution.
Can you give us an idea what it is about subdomains that you feel you need?
Paul
-
Those subdomains for single-page sites may look spammy to Google. You can put those pages on your own site; there is nothing to gain from using subdomains.
Related Questions
-
Can noindexed pages accrue page authority?
My company's site has a large set of pages (tens of thousands) that have very thin or no content. They typically target a single low-competition keyword (and typically rank very well), but the pages have a very high bounce rate and are definitely hurting our domain's overall rankings via Panda (quality ranking). I'm planning on recommending we noindex these pages temporarily, and reindex each page as resources become available to fill in content. My question is whether an individual page will be able to accrue any page authority for its target term while noindexed. We DO want to rank for all those terms, just not until we have the content to back it up. However, we're in a pretty competitive space, up against domains that have been around a lot longer and have higher domain authorities. Like I said, these pages rank well right now, even with thin content. The worry is that if we noindex them while we slowly build out content, will our competitors get the edge on those terms (with their subpar but continually available content)? Do you think Google will give us any credit for having had the page all along, just not always indexed?
Intermediate & Advanced SEO | THandorf
-
Substantial difference between Number of Indexed Pages and Sitemap Pages
Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I get when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
Total indexed: 2,360 (Search Console)
About 2,920 results (Google search "site:example.com")
Sitemap: 1,229 URLs
Screaming Frog spider: 1,352 URLs
Cheers, Jochen
Intermediate & Advanced SEO | Online-Marketing-Guy
-
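For cross-checks like the one above, the sitemap count can be reproduced with a short script and then compared against Search Console and the crawler. A minimal sketch, assuming a standard XML sitemap (not a sitemap index) at a placeholder URL:

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder URL

def count_sitemap_urls(url: str) -> int:
    """Count <loc> entries in an XML sitemap (not a sitemap index)."""
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return len(tree.findall(".//sm:loc", ns))

print(count_sitemap_urls(SITEMAP_URL))
```

Some gap between the three figures is normal; the script simply makes the sitemap number easy to re-verify.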
Should I be using meta robots tags on thank you pages with little content?
I'm working on a website with hundreds of thank-you pages. Does it make sense to nofollow, noindex these pages, since there's little content on them? I'm thinking this should save me some crawl budget overall, but is there any risk in cutting out the internal links found on the thank-you pages? (These are only standard site-wide footer and navigation links.) Thanks!
Intermediate & Advanced SEO | GSO
-
Hreflang and paginated page
Hi, I cannot seem to find good documentation about the use of hreflang on paginated pages when using rel=next/rel=prev. Does anyone know where to find decent documentation? I could only find documentation about pagination and hreflang when using canonicals on the paginated pages. I have doubts about what the best option is. The way TripAdvisor does it:
http://www.tripadvisor.nl/Hotels-g187139-oa390-Corsica-Hotels.html
Each paginated page refers to its hreflang equivalent paginated page. So should the hreflang refer to the specific paginated page, or should it refer to the "1st" page, in this case:
http://www.tripadvisor.nl/Hotels-g187139-Corsica-Hotels.html
Looking forward to your suggestions.
Intermediate & Advanced SEO | TjeerdvZ
-
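To illustrate the first option described above (each paginated URL declaring hreflang alternates that point to the same page offset on every locale's site), here is a minimal sketch. The locale roots below are assumptions for illustration and are not taken from TripAdvisor's actual markup.

```python
# Sketch of the "paginated page points to its paginated equivalent" pattern.
# Locale roots are hypothetical placeholders.
LOCALES = {
    "nl": "http://www.tripadvisor.nl",
    "en": "http://www.tripadvisor.com",  # assumed counterpart for illustration
}

def hreflang_tags(path: str) -> list[str]:
    """Build <link rel="alternate"> tags for one paginated path, e.g.
    '/Hotels-g187139-oa390-Corsica-Hotels.html' (offset oa390)."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{root}{path}" />'
        for lang, root in LOCALES.items()
    ]

for tag in hreflang_tags("/Hotels-g187139-oa390-Corsica-Hotels.html"):
    print(tag)
```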
My landing pages don't show up in the SERPs, only my frontpage does.
I am having some trouble getting the landing pages on a client's website to show up in the SERPs.
As far as I can see, the pages are well optimized, and they also get indexed by Google. The website is a Danish webshop that sells wine, www.vindanmark.com. Take, for instance, this landing page: http://www.vindanmark.com/vinhandel/
It is optimized for the keywords "Vinhandel Århus". Vinhandel means "wine store" and Århus is a Danish city. As you can see, I manage to get it onto page 1 (#10), but it's the front page that ranks for the keyword. And this goes for all the other landing pages as well. I can't figure out why the front page keeps outranking the landing pages on every keyword.
What am I doing wrong here?
Intermediate & Advanced SEO | InmediaDK
-
Effect of Removing Footer Links In all Pages Except Home Page
Dear MOZ Community: In an effort to improve the user interface of our business website (a New York City commercial real estate agency), my designer eliminated a standardized footer containing links to about 20 pages. The new design maintains this footer on the home page, but all other pages (about 600) eliminate the footer. The new design does a very good job of eliminating non-essential items; most of the changes remove or reduce the size of unnecessary design elements. The footer removal is the only change that really affects the link structure. The new design is not launched yet, and I'm hoping to receive some good advice from the MOZ community before proceeding. My concern is that removing these links could have an adverse or unpredictable effect on ranking. Last summer we launched a completely redesigned version of the site and our ranking collapsed for three months. However, unlike the previous upgrade, this modification does not change URL names, tags, text, or any major element; the only major change is the footer removal. Some of the footer pages provide good (not critical) info for visitors. Note the footer will still appear on the home page but will be removed on the interior pages. Are we risking any detrimental ranking effect by removing this footer? Can we compensate by adding text links to these pages if the links from the footer are removed? It seems irregular to have a home-page footer but no footer on the other pages. Are we inviting any downgrade, penalty, or adverse SEO effect by implementing this? I very much like the new design but do not want to risk a fall in rank and traffic. Thanks for your input!!!
Alan
Intermediate & Advanced SEO | Kingalan1
-
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... is this possible? Thanks! Leona
Intermediate & Advanced SEO | HD_Leona
-
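One way to sanity-check how a robots.txt draft is matched is Python's built-in parser. The sketch below combines the Googlebot block quoted in the question with a hypothetical extra group that explicitly allows Googlebot-Image; that extra group is an assumption, not something the question confirms will achieve the image-indexing goal, and Python's matcher is looser than Google's documented longest-match rules, so treat it only as a rough local check.

```python
from urllib.robotparser import RobotFileParser

# Draft robots.txt: the Googlebot block from the question, plus a hypothetical
# group that explicitly allows Googlebot-Image on the same folder.
# The more specific group is listed first because Python's parser applies the
# first matching group rather than the most specific one.
robots_lines = """\
User-agent: Googlebot-Image
Allow: /community/photos/

User-agent: Googlebot
Disallow: /community/photos/
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_lines)

url = "http://domain.com/community/photos/~username~/picture111111.aspx"
print(parser.can_fetch("Googlebot", url))        # False: page crawling blocked
print(parser.can_fetch("Googlebot-Image", url))  # True: image crawler allowed
```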
Subdomains for niche related keywords
I wanted to know how efficient using a subdomain is, taking into consideration all the updates Google has made lately. I am looking to use a subdomain on a well-branded website for a niche-specific part of that website. The subdomain will end up having more than 100 pages. I'd like to hear in what cases you recommend using a subdomain, and how to get the same benefit out of a subdomain as I am getting from the actual main domain.
Intermediate & Advanced SEO | CMTM