"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
-
Hey moz
New client has a site that uses:
subdomains ("third-level" stuff like location.business.com); and
"fourth-level" subdomains (location.parent.business.com)
Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when doing a crawl for business.com except in the External tab. But maybe I'm just configuring the crawls incorrectly.
These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
-
If you check out Rand's Intro to SEO slideshare (http://www.slideshare.net/randfish/introduction-to-seo-5003433), slides 46 and 47 talk about URL structure and specifically sub-domains.
As Rob said, you do want sub-folder structures and to avoid sub-domains. Hopefully you are old enough to remember when sites like lycos.com were big and let people make their own websites. These were all hosted on subdomains like moz.tripod.lycos.com, and because of this structure search engines needed to treat subdomains as separate websites. For this reason they are graded separately, change the flow of link juice, and can easily count as duplicate content.
Sub-domains are best utilized for information that is distinct enough. In the Moz example, Rand's personal blog could theoretically sit at rand.moz.com since it's a separate theme with different content; it would just lose out on the flow of value.
Once again Rob is right about using 301 redirects to move your subdomains into folders.
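For what it's worth, a sub-domain-to-folder 301 in Apache might look something like the sketch below. The host and folder names are hypothetical (borrowed from the question), and the exact rules depend on how the server is set up:

```apache
# .htaccess in the sub-domain's document root (hypothetical names)
RewriteEngine On
# Only rewrite requests that arrive on the sub-domain host
RewriteCond %{HTTP_HOST} ^location\.business\.com$ [NC]
# Permanently redirect every path into the matching sub-folder on the root domain
RewriteRule ^(.*)$ https://www.business.com/location/$1 [R=301,L]
```

Whatever the server, the key point is the same: one permanent (301) redirect per old URL, mapping it to its new sub-folder equivalent.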
Now, moving on to the more specific nature of your question ("Are fourth-level sub-domains any worse than third-level sub-domains?"), I am going to suggest that when you're asking such a question you've already lost a big chunk of the SEO/inbound marketing battle.
The question you are framing is "I know it isn't good, but is it any worse?" Even if it's not any worse, you already know it's not great, and you should be taking structural steps to build on a site's accessibility, user functionality, and SEO. If you find yourself asking "Is X any worse?", "How bad is Y?", or "Can I get away with Z?", then you should immediately stop pursuing that idea and try to find a different method.
In this case that method is sub-folders and a 301 migration, but remember: the framing of your questions and your overall directional strategy need to change to really drive home your campaigns!
-
Haha, great. Thanks for the props. Going fourth and fifth level deep with sub-domains can also impede the user experience when someone wants to reach the site directly (typing it manually is a pain!).
Thanks anyways, glad I could be of some help.
-
Again - thanks a lot. I totally agree. Next client meeting I'll stress that not only do I feel strongly about the subfolder issue, but the good people at SimplifySEO feel the same :) And they know their ish. Or something.
-
Stay away as much as possible from 4th-, 5th-, and 6th-level sub-domains, although I have never seen it go beyond the 5th. I would really try to emphasize the value of re-tooling the domain structure for long-term benefits and linking. Keeping sub-domains running isolates link value and doesn't benefit the entire domain, thus making link building a much harder challenge. You are losing link 'juice' for every level of sub-domain used, as the value drops for each section of the domain that extends - hence the reason sub-folders are the way to go (as you already know)...
Good luck with the client and site. Sounds like a tough call. All the best, and I hope it works out.
-
Hey Rob,
Thanks a lot for this. This is great advice and really well-written. And you're preaching to the choir. I also prefer subfolders, but it's just not in the cards for this client for the time being. As it stands, we're stuck with subdomains.
Any other thoughts re: fourth-level vs. third-level domains, folks?
-
Hey there!
You should try to stay away from sub-domains unless they really serve a purpose for the domain - then different strategies can be put into place. As I don't know if that's the route you need to take, I am going to give you an alternate option :).
1. You could always use sub-folders, which in a nutshell would allow you to build links to the domain on many fronts and have them all count.
** NOTE: any links built to sub-domains don't flow link 'juice' to the rest of the site. Those links, built for whatever reason, will only pass value within that specific sub-domain.
2. What I would do is replicate and migrate the structure of the sub-domains into the root domain of the site (www.site.com/subfolder1/) and 301-redirect and rel-canonical all the sub-domain pages and structure to the new locations. That way, all the link juice, value, etc. already established is kept intact, and you simply redirect all that value, trust, and back-links to pages within the domain.
To me this is the best option to relocate the content, improve the domain structure using sub-folders instead of sub-domains, and maintain the back-link profile already built (or existing) on the site/domain URL.
Other factors might give you reasons not to pursue this option, but I have always had success with it on large enterprise sites when restructuring the way a domain handles sub-domains.
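If it helps to plan that migration, here's a minimal sketch of how you could generate the old-URL-to-new-URL map before setting up the 301s. It's Python, the domain names are the hypothetical ones from the question, and the "reverse the labels into nested folders" convention is just one illustrative choice - adapt it to the client's actual structure:

```python
from urllib.parse import urlsplit

ROOT = "business.com"  # hypothetical root domain, as in the question


def subdomain_to_subfolder(url: str) -> str:
    """Map a sub-domain URL onto its sub-folder equivalent on the root domain.

    e.g. http://location.parent.business.com/page.html
      -> https://www.business.com/parent/location/page.html
    """
    parts = urlsplit(url)
    host = parts.hostname or ""
    if host != ROOT and not host.endswith("." + ROOT):
        raise ValueError(f"{url} is not under {ROOT}")
    # Labels to the left of the root domain, e.g. ["location", "parent"];
    # drop "www" since it isn't a meaningful level.
    labels = [label for label in host[: -len(ROOT)].rstrip(".").split(".")
              if label and label != "www"]
    # Reverse so the broader level comes first: /parent/location/...
    folder = "/".join(reversed(labels))
    path = parts.path.lstrip("/")
    return "https://www." + ROOT + "/" + "/".join(p for p in (folder, path) if p)
```

Run the site's crawl export through a function like this and you have the redirect map to hand to whoever writes the server rules.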
Cheers!