"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
-
Hey Moz,
A new client has a site that uses:
subdomains ("third-level" addresses like location.business.com), and
"fourth-level" subdomains (location.parent.business.com).
Are these fourth-level addresses at risk of being treated differently from the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses in a crawl of business.com except in the External tab. But maybe I'm just configuring the crawls incorrectly.
These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
-
If you check out Rand's Intro to SEO SlideShare (http://www.slideshare.net/randfish/introduction-to-seo-5003433), slides 46 and 47 talk about URL structure and, specifically, sub-domains.
As Rob said, you do want to use sub-folder structures and avoid sub-domains. Hopefully you are old enough to remember when websites like lycos.com were big and people could make their own sites. Those were all hosted on subdomains like moz.tripod.lycos.com, and because of that structure search engines needed to treat subdomains as separate websites. For this reason they are graded separately, they change the flow of link juice, and they can easily count as duplicate content.
Sub-domains are best used for content that is genuinely distinct. In the Moz example, Rand's personal blog could theoretically sit at rand.moz.com since it has a separate theme, different content, etc.; it would just lose out on the flow of value.
Once again, Rob is right about using 301 redirects to move your subdomains into folders.
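To make that concrete, here is a minimal Python sketch of the sub-domain-to-folder mapping such a migration implies. The business.com domain, the www host, and the folder scheme are assumptions for illustration, not the client's actual setup:

```python
from urllib.parse import urlsplit

ROOT = "business.com"             # hypothetical registrable domain
CANONICAL_HOST = "www.business.com"

def folder_url(old_url):
    """Map a sub-domain URL to its assumed sub-folder equivalent."""
    parts = urlsplit(old_url)
    host = parts.hostname or ""
    if host in (ROOT, CANONICAL_HOST) or not host.endswith("." + ROOT):
        return old_url            # already on the root host, or not our domain
    # Strip the root domain, then reverse the remaining labels so the
    # broadest level becomes the outermost folder.
    labels = host[: -len("." + ROOT)].split(".")
    return "https://%s/%s%s" % (CANONICAL_HOST, "/".join(reversed(labels)), parts.path or "/")

print(folder_url("http://location.business.com/contact/"))
# -> https://www.business.com/location/contact/
print(folder_url("http://location.parent.business.com/contact/"))
# -> https://www.business.com/parent/location/contact/
```

Each old URL would then be 301-redirected to its folder_url() result.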
Now, moving on to the more specific nature of your question, "Are fourth-level sub-domains any worse than third-level sub-domains?", I would suggest that by asking such a question you've already lost a big chunk of the SEO/inbound marketing battle.
The question you are framing is "I know it isn't good, but is it any worse?" Well, even if it's not any worse, you already know it's not great, and you should be taking structural steps to improve the site's accessibility, user functionality, and SEO. If you find yourself asking "Is X any worse?", "How bad is Y?", or "Can I get away with Z?", then you should immediately stop pursuing that idea and find a different method.
In this case that method is sub-folders and a 301 migration, but remember that the framing of your questions and your overall directional strategy need to change to really drive your campaigns home!
-
Haha, great. Thanks for the props. Going fourth and fifth level deep with sub-domains can also impede the user experience when someone wants to reach the site directly (typing the address manually is a pain!).
Thanks anyways, glad I could be of some help.
-
Again, thanks a lot. I totally agree. At the next client meeting I'll stress that not only do I feel strongly about the subfolder issue, but the good people at SimplifySEO feel the same. :) And they know their ish. Or something.
-
Stay away as much as possible from fourth-, fifth-, and sixth-level sub-domains, although I have never seen it go beyond five. I would really try to emphasize the value of re-tooling the domain structure for long-term benefits and linking. Keeping sub-domains running isolates link value and doesn't benefit the entire domain, which makes link building a much harder challenge. You are losing link 'juice' for every level of sub-domain used, as the value drops with each section the domain extends; hence sub-folders are the way to go (as you already know)...
Good luck with the client and the site. Sounds like a tough call. All the best, and I hope it works out.
-
Hey Rob,
Thanks a lot for this. This is great advice and really well-written. And you're preaching to the choir. I also prefer subfolders, but it's just not in the cards for this client for the time being. As it stands, we're stuck with subdomains.
Any other thoughts re: fourth-level vs. third-level domains, folks?
-
Hey there!
You should try to stay away from sub-domains unless they really serve a purpose for the domain; in that case, different strategies can be put into place. Since I don't know whether that's the route you need to take, I'll give you an alternative option. :)
1. You could always use sub-folders, which, in a nutshell, would allow you to build links to the domain on many fronts and have them all count.
** NOTE: links built to sub-domains don't flow link 'juice' into the rest of the site. Those links, built for whatever reason, will only pass value within that specific sub-domain.
2. What I would do is replicate and migrate the structure of the sub-domains into the root domain of the site (www.site.com/subfolder1/), then 301 and rel-canonical all the sub-domain pages and structure to the new locations (see the rough sketch below). That way, all the link juice, value, etc. already established is kept intact, and you simply redirect all that value, trust, and backlinks to pages within the domain.
To me this is the best option: it relocates the content, improves the domain structure by using sub-folders instead of sub-domains, and maintains the backlink profile already built up (or existing) on the site/domain.
Other factors might give you reasons not to pursue this option, but I have always had success with it on large enterprise sites when restructuring the way a domain handles sub-domains.
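Here is a rough, self-contained sketch of what building that redirect/canonical map could look like in practice. The site.com domain, the folder scheme, and the sample URLs are assumptions for illustration only:

```python
from urllib.parse import urlsplit

ROOT = "site.com"                 # hypothetical root domain
CANONICAL_HOST = "www.site.com"

def new_location(old_url):
    """Return the sub-folder URL an old sub-domain URL should 301 to."""
    parts = urlsplit(old_url)
    host = parts.hostname or ""
    if host in (ROOT, CANONICAL_HOST) or not host.endswith("." + ROOT):
        return old_url            # already on the root host, leave it alone
    # shop.site.com -> /shop, blog.shop.site.com -> /shop/blog
    labels = host[: -len("." + ROOT)].split(".")
    prefix = "".join("/" + label for label in reversed(labels))
    return "https://%s%s%s" % (CANONICAL_HOST, prefix, parts.path or "/")

old_urls = [                      # would normally come from a crawl export
    "http://shop.site.com/widgets/",
    "http://blog.shop.site.com/2013/launch/",
]

for old in old_urls:
    new = new_location(old)
    # One row per page: the 301 target, plus the canonical tag that points
    # existing link value at the new location.
    print("%s  ->  301  ->  %s" % (old, new))
    print('  <link rel="canonical" href="%s" />' % new)
```

The output can then be fed into whatever redirect mechanism the server or CMS uses, so every old sub-domain page points at exactly one new folder page.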
Cheers!