Is there any benefit in using a subdomain redirected to a single page?
-
For example, suppose we have the domain www.bobshardware.com.au and we set up the subdomains sydneysupplies.bobshardware.com.au and brisbanescrewdrivers.bobshardware.com.au, then used those in ad campaigns, with each subdomain redirecting back to a single page such as bobshardware.com.au/brisbane-screw-drivers.
Is there a benefit?
Cheers
-
Thanks Rick. When you say "unless links are involved", what do you mean?
-
There will be only a single benefit, which is tracking. Separate subdomains will allow you to track visitors properly. Otherwise there is no positive or negative effect - unless links are involved.
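As a rough illustration of that tracking idea, here is a minimal Python sketch that counts visits per campaign subdomain from a server access log. It assumes a log where each line is prefixed with the virtual host (like Apache's vhost_combined format); the file name and subdomains are hypothetical and only follow the example in the question.

    from collections import Counter

    CAMPAIGN_HOSTS = {
        "sydneysupplies.bobshardware.com.au",
        "brisbanescrewdrivers.bobshardware.com.au",
    }

    def count_campaign_hits(log_path):
        # Each line is assumed to start with "vhost:port", as in Apache's
        # vhost_combined log format.
        hits = Counter()
        with open(log_path) as log:
            for line in log:
                host = line.split(" ", 1)[0].split(":")[0]
                if host in CAMPAIGN_HOSTS:
                    hits[host] += 1
        return hits

    if __name__ == "__main__":
        for host, count in count_campaign_hits("access.log").items():
            print(host, count)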
-
Having looked at that Whiteboard Friday, I did find it helpful.
I also just went and looked at wotif.com.au and lastminute.com.au, one of which I recall using subdomains to divide its site. Neither appears to be using them any more, which would be another indication that subdomains are in fact a bad idea.
It seems subdomains are not really the way to go, which from my point of view is a shame. Organising things that way makes more sense to me.
-
Hi David,
Rand covered this very topic in a Whiteboard Friday. You may find it helpful; it provides insight into what can happen and why he thinks the way he does.
Hope it helps,
Don
-
The main reasoning behind wanting to use a subdomain is organisational.
I'm simply looking at having each subdomain house information on a particular topic or item, for instance screwdrivers in Brisbane. Any deals, latest arrivals, etc. could be found on that particular subdomain. Further to that, I could redirect it to a different page for two weeks and then bring the original page back, without changing the URL on which it can be found or adding a new one.
It's possibly just me and the way I like things organised, but the idea appealed to me and I was wondering whether there were any benefits, or for that matter negatives, to running a particular section that way.
-
Hi David. The benefits associated with 301 redirection come from things like relocating your site, combining sites, cleaning up 404 pages, or aligning page names within your site architecture. If you have links or visits to those third-level (subdomain) pages and want to house all pages on your root domain instead, then 301 redirection would be the way to go. Cheers!
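If you do go the 301 route, it's worth confirming that each old subdomain URL really returns a single 301 to the page you intend. Here's a rough sketch using Python's requests library; the source and target URLs below are only illustrative, based on the example domains in the question.

    # Rough check that each subdomain 301s straight to the intended page.
    # Requires the third-party "requests" package; URLs are hypothetical.
    import requests

    REDIRECTS = {
        "http://brisbanescrewdrivers.bobshardware.com.au/":
            "https://www.bobshardware.com.au/brisbane-screw-drivers",
        "http://sydneysupplies.bobshardware.com.au/":
            "https://www.bobshardware.com.au/sydney-supplies",
    }

    for source, expected in REDIRECTS.items():
        resp = requests.get(source, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        status = "OK" if resp.status_code == 301 and location == expected else "CHECK"
        print(status, source, "->", resp.status_code, location)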
-
There would not be a direct SEO benefit to doing this. There may, however, be a benefit in tracking: if you only used a given subdomain for ad campaign X, then you would know that all traffic referred from that subdomain came from that campaign.
There may be a slight downside to doing it this way. Subdomains are treated as their own domains to a degree, so you are in effect giving the ad campaign's link juice to an entirely new domain and then forwarding it to a specific page, as opposed to passing whatever link juice the campaign generates directly to the actual page.
A couple of things here. Depending on the type of ad campaign, there may not be any link juice to worry about; Google AdWords ads, for example, don't pass link juice. However, if you purchased direct advertising on certain sites, you may get some link juice from those ads running.
The second thing is actually a question: what is the purpose of creating a subdomain to point to a subdirectory? Is it just for tracking, or were you wondering whether you could benefit from the subdomain being treated as a new domain linking to you? If it's for tracking, I would think there are other tracking methods that could handle referral traffic. If it's in the hope of gaining a new backlink from a different domain, then I would say it isn't helpful this way: first, because the subdomain is simply forwarding to the subdirectory, and second, even if it weren't forwarding, the link would be considered to come from the same server and would not be very helpful anyway.
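On the "other tracking methods" point, one common alternative is simply to tag the landing-page URL used in each ad with campaign parameters, so your analytics can attribute the visit without needing a separate subdomain. A minimal Python sketch; the campaign names and landing page are made up for illustration.

    # Sketch of building UTM-tagged landing page URLs for each ad campaign,
    # as an alternative to tracking via a redirecting subdomain.
    from urllib.parse import urlencode

    LANDING_PAGE = "https://www.bobshardware.com.au/brisbane-screw-drivers"

    def tagged_url(landing_page, source, medium, campaign):
        params = urlencode({
            "utm_source": source,
            "utm_medium": medium,
            "utm_campaign": campaign,
        })
        return f"{landing_page}?{params}"

    print(tagged_url(LANDING_PAGE, "adwords", "cpc", "brisbane-screwdrivers"))
    # -> .../brisbane-screw-drivers?utm_source=adwords&utm_medium=cpc&utm_campaign=brisbane-screwdrivers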
So in short, no benefit other than a potential way to help with tracking.
Hope that makes sense and helps,
Don
Edit: some grammar.