What is the advantage of using subdomains instead of pages on the root domain?
-
Have a look at this example
http://bannerad.designcrowd.com/
For each category of design, they have a landing page on a subdomain.
Wouldn't it be better to have them as part of the same domain?
What is the strategy behind using subdomains?
-
Hey designquotes - It's been a while since you posted this question. If you got the answers you needed, could you mark whichever answers you found most helpful as "Good Answer" and mark the overall question as answered? This will help other users who come across the question in the future.
Thanks!
Paul
-
As Paul said, a lot more effort is required to do SEO for each subdomain, as Google will see each one as a separate entity.
However, there are times when a subdomain might be the preferred option. I've just had a quick glance and can't see any reason why they would have chosen that route here.
Andy
-
Unless there's a tactical reason for needing separate subdomains that isn't apparent to the visitor, IMO running all of these categories as subdomains is a strategic error.
They're creating far more work for themselves by having to rank each subdomain separately, instead of taking advantage of the aggregation of authority that happens with subfolders but not with subdomains.
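For anyone wanting to consolidate later, moving a subdomain into a subfolder is usually handled with permanent (301) redirects so that existing links pass their equity to the new location. A minimal Apache .htaccess sketch, assuming a hypothetical setup where bannerad.designcrowd.com is folded into a /banner-ad-design/ subfolder on the main domain (the hostnames and paths are illustrative, not DesignCrowd's actual configuration):

```apache
# .htaccess served for the subdomain:
# permanently redirect every URL on the subdomain to the
# matching path under a subfolder on the root domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^bannerad\.designcrowd\.com$ [NC]
RewriteRule ^(.*)$ https://www.designcrowd.com/banner-ad-design/$1 [R=301,L]
```

With a rule like this in place, links pointing at the old subdomain are redirected one-to-one to the consolidated subfolder instead of being split across separate hosts.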
Paul