Is there any benefit in using a subdomain redirected to a single page?
-
For example, if we have the domain www.bobshardware.com.au and we set up subdomains such as sydneysupplies.bobshardware.com.au and brisbanescrewdrivers.bobshardware.com.au and used those in ad campaigns, with each subdomain redirected back to a single page such as bobshardware.com.au/brisbane-screw-drivers.
Is there a benefit?
Cheers
-
Thanks, Rick. When you say "unless links are involved", what do you mean?
-
There will be only a single benefit, which is tracking. Separate subdomains will allow you to track visitors properly. No positive or negative result - unless links are involved.
-
Having looked at that Whiteboard Friday, I did find it helpful.
I did just go and look at wotif.com.au and lastminute.com.au, one of which I recall used subdomains to divide up its site. Neither appears to be using them any more, which would be another indication that subdomains are in fact bad.
It seems subdomains are not really the way to go, which from my point of view is a shame; it makes more sense to me to work that way.
-
Hi David,
Rand covered this very topic in a Whiteboard Friday. Perhaps you may find it helpful; it provides insight into what can happen and why he thinks the way he does.
Hope it helps,
Don
-
The main reasoning behind wanting to use a subdomain is more organisational than anything else.
I am simply looking at having the subdomain house information on a particular topic or item, for instance screwdrivers in Brisbane. Any deals, latest arrivals, etc. could be found on that particular subdomain. Further to that thinking, I could redirect to a different page for two weeks and then bring the original page back without changing or adding a new URL on which it can be found.
Possibly it's just me and the way I like things organised, but the idea appealed to me and I was wondering if there were any benefits, or for that matter negatives, to running a particular section that way.
-
Hi David. The benefits associated with 301 redirection come from relocating your site, combining sites, cleaning up 404 pages, aligning page names within your site architecture, things of that nature. If you have links or visits to those third-level (subdomain) pages and want to house all pages on your root domain instead, then 301 redirection would be the way to go. Cheers!
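If you do go the 301 route, it is worth confirming that each campaign subdomain actually returns a single 301 hop to the page you intend, rather than a 302 or a chain of redirects. Below is a minimal Python sketch of such a check; it uses the third-party requests library, and the subdomain and landing-page URLs are the hypothetical ones from the question (the Sydney landing path is assumed by analogy).

```python
import requests

# Hypothetical campaign subdomains and the pages they should 301 to.
REDIRECTS = {
    "http://sydneysupplies.bobshardware.com.au/": "https://www.bobshardware.com.au/sydney-supplies",
    "http://brisbanescrewdrivers.bobshardware.com.au/": "https://www.bobshardware.com.au/brisbane-screw-drivers",
}

for source, expected in REDIRECTS.items():
    # Don't follow redirects automatically, so we can inspect the first hop.
    response = requests.get(source, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    ok = status == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{source} -> {status} {location or '(no Location header)'} [{'OK' if ok else 'CHECK'}]")
```

Checking only the first hop makes it easy to spot chained redirects, which can slow things down and lose some of whatever value the ad placements pass along.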
-
There would not be a direct SEO benefit to doing this. There may, however, be a benefit in tracking: if you only used that subdomain for ad campaign X, then you would know that all traffic referred from that subdomain was coming from that ad campaign.
There may be a slight downside to doing it this way. Subdomains are treated as their own domains to a degree, so you are in effect giving the ad campaign's link juice to a new domain entirely and then forwarding it to a specific page, as opposed to giving the link juice an ad campaign can generate directly to the actual page.
A couple of things here. Depending on the type of ad campaign, there may not be any link juice to worry about; Google AdWords, for example, doesn't pass link juice. However, if you purchased direct advertisements on certain sites, you may get some link juice from those ads running.
The second thing is actually a question: what is the purpose of creating a subdomain to point to a subdirectory? Is it just for tracking, or were you wondering if you could benefit from a subdomain being treated as a new domain linking to you? If it's for tracking, I would think there are other tracking methods that could handle the referring traffic. If it's in hopes of gaining a new backlink from a different domain, then I would say it isn't helpful this way: first because the subdomain is simply forwarding to the subdirectory, and second because even if it weren't forwarding, the link would be considered to come from the same server and would not be very helpful anyway.
So in short, no benefit other than a potential way to help with tracking.
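On the "other tracking methods" point: one common alternative is to tag the campaign landing-page URLs with UTM parameters, so analytics can attribute visits to each ad campaign without a separate subdomain at all. Here is a minimal Python sketch, assuming standard Google Analytics UTM parameter names and the hypothetical landing page from the question.

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def tag_campaign_url(page_url, source, medium, campaign):
    """Append utm_source/utm_medium/utm_campaign to an existing URL."""
    parts = urlsplit(page_url)
    params = {"utm_source": source, "utm_medium": medium, "utm_campaign": campaign}
    query = parts.query + ("&" if parts.query else "") + urlencode(params)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

# Hypothetical landing page from the question, tagged for a Brisbane ad campaign.
print(tag_campaign_url(
    "https://www.bobshardware.com.au/brisbane-screw-drivers",
    source="brisbane-ads", medium="cpc", campaign="screwdrivers"))
# -> https://www.bobshardware.com.au/brisbane-screw-drivers?utm_source=brisbane-ads&utm_medium=cpc&utm_campaign=screwdrivers
```

The visitor lands on the real page on the root domain, so no link juice is routed through an extra hostname, and the campaign still shows up separately in the analytics reports.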
Hope that makes sense and helps,
Don
Edit: some grammar.
Related Questions
-
Consolidating 301 Redirects to Decrease Page Load Times - Major Concerns?
Hello, I am being pushed to consolidate the 6,000+ redirects that have accumulated over the course of 4 years. These redirects are one of the many factors causing extensive load times for our website. Most are over a year old, have not been used, or simply redirect back to the home page. Other than looking to keep the pages that have external links (also looking for recommendations/tools), are there other best practices from an SEO standpoint to ensure there are no major hits to our website? A little more info: I am looking to pare the 6,000 down by removing all redirects that have not been used, removing all redirects that are over a year old, and removing all redirects that point simply to the home page or to a smaller big-bucket subfolder. This should take the number from 6,000 to around 300. Are there any major concerns? Pat
Technical SEO | Owner_Account
-
404 Errors for Form Generated Pages - No index, no follow or 301 redirect
Hi there. I wonder if someone can help me out and provide the best solution for a problem with form-generated pages. I have blocked the search results pages from being indexed by using the 'noindex' tag, and I wondered if I should take this approach for the following pages. I have seen a huge increase in 404 errors since the new site structure and forms being filled in. This is because every time a form is filled in, it generates a new page, which only Google Search Console is reporting as a 404. Whilst some 404s can be explained and resolved, I wondered what is best to prevent Google from crawling pages like this: mydomain.com/webapp/wcs/stores/servlet/TopCategoriesDisplay?langId=-1&storeId=90&catalogId=1008&homePage=Y. Option 1: implement 301 redirects using rules, which would mean that all these pages redirect to the homepage. Whilst in theory this protects any linked-to pages, it does not resolve the issue of why GSC is recording them as 404s in the first place, and it could also come across to Google as 100,000+ redirected links, which might look spammy. Option 2: place the noindex tag on these pages too, so they will not get picked up, in the same way the search result pages are not being indexed. Option 3: block them in robots.txt, which will prevent any 'result' pages being crawled and will free up the crawl time currently being taken up. However, I'm not entirely sure if the block will be possible; I would need to block anything after the domain/webapp/wcs/stores/servlet/TopCategoriesDisplay?. Hopefully this is possible? The noindex tag will take time to set up, as it needs to be scheduled in with the development team, but the robots.txt change would be a quicker fix, as this can be done in GSC. I really appreciate any feedback on this one. Many thanks
Technical SEO | Ric_McHale
-
Why are my 301 redirects and duplicate pages (with canonicals) still showing up as duplicates in Webmaster Tools?
My guess is that in time Google will realize that my duplicate content is not actually duplicate content, but in the meantime I'd like to get your feedback. The reporting in Webmaster Tools looks something like this. Duplicates: /url1.html, /url2.html, /url3.html, /category/product/url.html, /category2/product/url.html. url3.html is the true canonical page in the list above. url1.html and url2.html are old URLs that 301 to url3.html, so it seems my bases are covered there. /category/product/url.html and /category2/product/url.html do not redirect; they are the same page as url3.html. Each of the category URLs has a canonical URL of url3.html in the header, so it seems my bases are covered there as well. Can I expect Google to pick up on this? Why wouldn't it understand this already?
Technical SEO | bearpaw
-
John Mueller says don't use Schema as it's not working yet, but I get markup conflicts using Google's own markup tool
I recently watched John Mueller's Google Webmaster Hangout [Dec 5th]. In it he mentions to a member not to use Schema.org, as it's not working quite yet, but to use Google's own markup tool, the 'Structured Data Markup Helper'. Fine, this I have done, and one of the tags I've used is 'author'. However, if you use Google's Structured Data Testing Tool in GWMT you get an error saying the following: Error: Page contains property "author" which is not part of the schema. Yet this is the tag generated by their own tool. Has anyone experienced this before, and if so, what action did you take to rectify it and make it work? As it stands I'm considering just removing this tag altogether. Thanks, David
Technical SEO | David-E-Carey
-
Should I use Event Schema for a page that reports on an event?
I have a question about using Schema data. Specifically: Should I use Event Schema for a page that reports on an event? I provide high-quality coverage (reporting) about new products being introduced at an industry trade show. For the event, I create a single page using the event name, and provide a great deal of information on how to attend the show, the best places to stay and other insider tips to help new attendees. Then during the show, I list the new products being introduced along with photos and videos. Should I use event schema data for this page, or does Google only want the event organizer to use that data? Any benefits or drawbacks to using event schema? Thanks! Richard
Technical SEO | RichardInFlorida
-
Off-page SEO and on-page SEO improvements
I would like to know what off-page and on-page SEO improvements can be made to one of our client websites, http://www.nd-center.com. Best regards,
Technical SEO | fkdpl242
-
Can SEOMoz crawl a single page as opposed to an entire subfolder?
I would like the following page to be crawled: http://www.ob.org/_programs/water/water_index.asp. Instead, SEOMoz changes the page to the following subfolder, which is an invalid URL: http://www.ob.org/_programs/water/
Technical SEO | OBIAnalytics
-
Our Development team is planning to make our website nearly 100% AJAX and JavaScript. My concern is crawlability or lack thereof. Their contention is that Google can read the pages using the new #! URL string. What do you recommend?
Discussion around AJAX implementations and whether anybody has achieved high rankings with a full AJAX website, or even a partial one.
Technical SEO | DavidChase