Sub Domain rel=canonical to Main Domain
-
Just a quick one. I have the following example scenario.
Main Domain: http://www.test.com
Sub Domain: http://sub.test.com
What I am wondering is whether I can add a rel=canonical on the sub domain pointing to the main domain. I don't want to de-index the whole sub domain; only a few pages are duplicated from the main site.
Is it easier to de-index the individual sub domain pages, or to add a rel=canonical back to the main domain?
Much appreciated
Joseph
-
Canonicalizing a page on http://sub.test.com to its duplicate on http://www.test.com tells Google that those two individual pages are the same and that it should only index the http://www.test.com version. It won't affect other pages on the subdomain.
If you only have a few subdomain pages that are duplicates of root domain pages, you can just use canonicals to indicate which pages you want indexed. You won't need to noindex the duplicate ones; they will fall out of Google's index naturally once Google sees which are the preferred, canonical versions.
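For example, each duplicated subdomain page would carry a canonical tag in its head pointing at the main-domain original (a sketch using the placeholder domains from the question; the page path is hypothetical):

```html
<!-- On http://sub.test.com/some-page (the duplicate) -->
<head>
  <!-- Points Google at the preferred main-domain version;
       subdomain pages without this tag are unaffected -->
  <link rel="canonical" href="http://www.test.com/some-page" />
</head>
```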
-
Hi there,
The rel=canonical alone will not reliably de-index those pages.
To de-index a page, add a meta robots tag with noindex, or 301 redirect the page. Avoid combining noindex with rel=canonical on the same page, since the two signals conflict. Of course, apply this to the selected pages only.
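To illustrate the two options as markup (a sketch; the page path is hypothetical), a meta robots noindex asks search engines to drop the page entirely, while rel=canonical consolidates it with the main-domain URL:

```html
<!-- Option A: de-index the duplicate outright -->
<head>
  <meta name="robots" content="noindex" />
</head>

<!-- Option B: consolidate it with the main-domain page instead -->
<head>
  <link rel="canonical" href="http://www.test.com/duplicated-page" />
</head>
```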
Hope it helps.
GR.
Related Questions
-
Meta tags for international domains
Hi Mozers, I have 3 top-level domains: .co.nz, .com.au, and .com. The home-page meta tags are unique for each country, but for the last 3 months I have been unable to pin down why all 3 domains are showing exactly the same meta tags. It seems the meta tags for the .co.nz domain are showing for all of them. In the attachments you can see the URLs display correctly for each country-specific domain, yet the meta description defaults to the NZ version. Any help with this would be much appreciated! Thanks all
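One common cause of this symptom is that Google treats the three sites as duplicates of each other and picks a single version to display. Assuming the three home pages really are country variants of the same content, hreflang annotations on each home page tell Google which version to show in which country (a sketch with placeholder URLs built from the TLDs mentioned):

```html
<!-- Placed in the <head> of all three home pages -->
<link rel="alternate" hreflang="en-nz" href="http://www.example.co.nz/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
<link rel="alternate" hreflang="en" href="http://www.example.com/" />
```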
White Hat / Black Hat SEO | edward-may
-
Redirecting location-specific domains
I am working on a project for a physician who only cares about reaching patients within a specific geographic region. He has a new technique at his practice and wants to get the word out via radio spots. I want to track the effectiveness of the radio campaigns without using call-tracking numbers or special promo codes. Since the physician's primary domain is very long (but well-established), my thought is to register 3-4 short domains referencing the technique and location, so they would be easy for listeners to remember and type in later, then 301 these domains to the relevant landing page on the main domain. As an alternative, each domain could host a single relevant landing page with a link to the relevant procedure on the main site. It's not as if anything deceptive is going on; I would simply be using a domain in place of a call-tracking number. I think I should be able to view the type-in traffic in Analytics, but would Google have an issue with this? Thoughts and suggestions appreciated!
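A server-side 301 from a short recall domain to the established landing page could look something like this in an Apache virtual host (a sketch; the domain names and path are hypothetical placeholders, not from the original post):

```apache
# Virtual host answering for the short radio-campaign domain
<VirtualHost *:80>
    ServerName shorttechniquedomain.com
    ServerAlias www.shorttechniquedomain.com
    # Send every request on this domain to the relevant landing
    # page on the long-established primary domain with a 301
    RedirectMatch 301 ^/.*$ http://www.established-practice-domain.com/new-technique/
</VirtualHost>
```

Type-in visits arriving this way show up as direct traffic to the landing page, which is what makes the before/after comparison in Analytics possible.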
White Hat / Black Hat SEO | SCW
-
Rel Noindex Nofollow tag vs meta noindex nofollow robots
Hi Mozzers, here's something I was pondering this morning and would love to hear your opinion on.
We had a bit of an issue on our client's website at the beginning of the year. I tried to work around it by using wildcards in my robots.txt, but because different search engines treat wildcards differently it didn't work out so well; only some search engines understood what I was trying to do.
So here goes: a large number of URLs on the website carry a ?filter parameter pushed from the database. We use filters on the site so users can find what they are looking for much more easily, which results in database-driven ?filter URLs (those ugly URLs we all hate so much). What we are looking to do is implement nofollow noindex on all the internal links pointing to the ?filter parameter URLs. However, my SEO sense tells me the noindex nofollow should instead go in the meta robots tag of the individual ?filter URLs rather than on all the internal links pointing to them. Am I right in thinking this way? (The reason we want to put it on the internal links at the moment is that the development company states they don't have control over the metadata of these database-driven parameter URLs.)
If I am not mistaken, noindex nofollow on the internal links could be seen as PageRank sculpting, whereas an on-page meta robots noindex nofollow is more of a command, like your robots.txt. Has anyone tested this before, or have more knowledge on the finer details of noindex nofollow?
PS: Canonical tags are also not doable at this point because we are still cleaning out all the parameter URLs, so roughly 70% of them don't yet have an SEO-friendly URL to be canonicalized to.
PPS: Another reason this needs looking at is that search engines won't be able to interpret these pages (until they have been cleaned up and fleshed out with unique content), which could result in poor rankings and unsatisfied users. So over and above the SEO factor, site usability matters here too; I don't want my users to land on these pages at the moment. If they navigate to them via the filters, then awesome, because they are defining what they are looking for with the filters. Would love to hear your thoughts on this. Thanks, Chris Captivate.
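For reference, the robots.txt wildcard approach described above would look like the sketch below. As the poster found, support varies: Google and Bing honor the * wildcard, but some other crawlers do not, which matches the mixed results described:

```
# Sketch of a robots.txt rule blocking crawl of filter-parameter URLs
# Note: the * wildcard is supported by Google and Bing,
# but is not guaranteed across all search engine crawlers
User-agent: *
Disallow: /*?filter
```

Note that Disallow only blocks crawling; it does not remove already-indexed URLs, which is why a meta robots noindex on the parameter pages themselves is the more direct de-indexing signal.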
White Hat / Black Hat SEO | DROIDSTERS
-
New sub-domain launches thousands of local pages - is it hurting the main domain?
Would greatly appreciate some opinions on this scenario. The domain was cruising along for years: top 1-3 rankings for nearly all top non-branded terms and a stronghold for branded searches. Sitelinks were prominently shown for branded searches, and it always ranked #1 for most variations of the brand name. Then a sub-domain launched with over 80,000 local pages. These pages are 90-95% similar, with only the city and/or state changing to make them appear like unique local pages. Not an uncommon technique, but worrisome in a post-Panda/Penguin world. Surprisingly, these pages are NOT captured as duplicate content by the SEOMoz crawler in my campaigns. Around the same time, a very aggressive, almost entirely branded paid search campaign launched that shifted 20% of the clicks previously going to the main domain organically over to PPC. My concern is this: shortly after the launch of over 80k "local" pages on the sub-domain and the cannibalization of organic clicks through PPC, the consistent sitelinks 6-packs dropped to 3 sitelinks, if showing at all, and some sub-domains (including the newly launched one) appeared in sitelinks that had never been there before. There's not a clear answer here, I'm sure, but what are the experts' thoughts? Did a massive launch of highly duplicate pages, coupled with a significant decrease in organic CTR for branded terms, harm the authority of the main domain (which is only a few dozen pages), causing fewer sitelinks and less strength as a domain? Or is all this a coincidence, or caused by something else we aren't seeing? Thanks for thoughts!
White Hat / Black Hat SEO | VMLYRDiscoverability
-
Competitor sites link to a considerable number of irrelevant/nonsense sites that seem to score high on domain authority
According to my recent SEOmoz link analysis, my competitors' sites link to a considerable number of irrelevant/nonsense sites that nonetheless score high on domain authority, e.g. a wedding site linking to a transportation attorney's website. Another competitor site has 2 million links overall, most of which are seemingly questionable index sites or forums to which registration is unattainable. I recently created a 301 redirect, and my external links have yet to be updated to my new domain name in SEOmoz. Yet, comparing my previous domain authority rank with those of the said competitor sites, the difference is relatively marginal: my SEOmoz rank is 21, whereas the ranks of two competitor sites are 30 and 33 respectively. The problem, however, is securing a good SERP with Google for the most relevant terms. My Google PageRank was 3 prior to the 301 redirect. I worked quite intensively to earn that PageRank, only to discover that it had no effect at all on the SERP. I therefore took a calculated risk in changing to a domain name that transliterates from non-Latin characters. As the site's age is marginal, my educated guess is that the PR should rebound within 4 weeks; however, I would like to know whether there is a way to transfer the PageRank to the new domain. Does anyone have any insight into how to handle this issue?
White Hat / Black Hat SEO | eranariel
-
Mobile SEO best practices : Should my mobile website be located at m.domain.com or domain.com/mobile?
I'd like to know if there's any difference between using m.domain.com/pages and domain.com/mobile/pages for a mobile website. Which one is better, and why? Does Google treat the two differently? As you can see, I'm new to this! This is my first time working on a mobile website, so any links/resources would be highly appreciated. Thanks!
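Whichever URL structure is chosen, Google's documented guidance for separate mobile URLs is to link each desktop/mobile pair with annotations so the two versions aren't treated as duplicates. A sketch with placeholder URLs (the m. subdomain pattern is shown, but the same annotations apply to a /mobile/ path):

```html
<!-- On the desktop page, e.g. http://www.domain.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.domain.com/page" />

<!-- On the corresponding mobile page, e.g. http://m.domain.com/page -->
<link rel="canonical" href="http://www.domain.com/page" />
```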
White Hat / Black Hat SEO | GroupeDSI
-
What to do about all of the other domains we own?
So I asked this question a while back in a previous thread and thought I had the correct answer, but I just heard differently on a webinar by Dr. Pete. Basically, we have a large number of domains that just replicate our website. Some are brand names, some are exact-match keyword domains, some are clever plays on words. Our marketing department thought this tactic was a good idea. Obviously it's not. My question is: some of these domains actually have a significant amount of link value coming into them. How people found them I'm not sure, but nonetheless I want to take advantage of the incoming links somehow if possible. Dr. Pete recommended against 301 redirecting them all back to our main domain at once, because that would signal to Google that something fishy is going on. That is what I was going to do, but now I'm really not sure what to do... If possible, it would be great to get Dr. Pete in this thread to get his comments; I wasn't able to get an answer on the SEO in 2012 Pro Webinar.
White Hat / Black Hat SEO | CodyWheeler
-
How can I make use of multiple domains to aid my SEO efforts?
About a year ago, the business I work for purchased 20+ domains: sendmoneyfromcanada.com sendmoneyfromaustralia.com sendmoneyfromtheuk.com sendmoneyfromireland.com The list goes on, but you get the main idea. The thinking was that the domains could help http://www.transfermate.com/. I could set up a few microsites on them, but from that point there would be no one to maintain them. And I'm honestly not happy hosting multiple sites on one IP with all of them linking to the flagship; it is spammy and it brings no value to end users. I might be missing something, so my question is: can I use these domains to boost my rankings while avoiding any shady/spammy techniques? P.S. I had the idea of auctioning the domains in order to recover the domain registration fees.
White Hat / Black Hat SEO | Svetoslav