How can you promote a sub-domain ahead of a domain on the SERPs?
-
I have a new client that wants to promote their subdomain uk.imagemcs.com and have their main domain imagemcs.com fall off the SERPs.
Objective? Get uk.imagemcs.com to rank first for UK 'brand' searches.
Do a search for 'imagem creative services' and you should see the issue (it looks like rules have been applied to the robots.txt on the main domain to exclude all bots from crawling - but since the pages have already been indexed I need to take action, as it doesn't look great!).
I think I can do this by applying a permanent (301) redirect from the main domain to the subdomain at domain level, no-indexing the main site, and then resubmitting the sitemap.
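On an Apache host, a domain-level 301 of this kind might be sketched as follows (hypothetical .htaccess on the root domain; the actual rewrite conditions depend on the server setup, and note that a URL which 301s can no longer serve a noindex tag, since bots follow the redirect instead of reading the page):

```apache
# Hypothetical .htaccess on imagemcs.com — 301 every request on the
# bare or www root host to the same path on uk.imagemcs.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?imagemcs\.com$ [NC]
RewriteRule ^(.*)$ https://uk.imagemcs.com/$1 [R=301,L]
```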
My slight concern is that no-indexing the main domain may impact the visibility of the subdomains (I'm dealing with uk.imagemcs.com, but there are also us.imagemcs.com and de.imagemcs.com), and I was looking for some assurance that this would not be the case. My understanding is that subdomains are completely distinct from domains, and as such this action should have no impact on them.
I asked the question on the Webmasters Forum but haven't really got anywhere:
https://productforums.google.com/forum/#!msg/webmasters/1Avupy3Uw_o/hu6oLQntCAAJ
Can anyone suggest a course of action?
many thanks,
Nathan
-
Sorry for the delay - thanks for this - I'll most likely give your suggested steps a try!
-
Hi there,
Yep, subdomains are separate sites from the root domain as far as Google is concerned.
What you want to do is noindex the root domain.
One way is a 301 redirect; but if you still want to keep the root domain visible, I'd take these steps:
1. Set a canonical tag on the home page pointing to the subdomain.
2. Set a meta robots noindex tag on the root domain.
3. Update the sitemap, leaving only what you want on the root domain, and resubmit it in Search Console.
4. Create a Search Console property for the subdomain, create a sitemap for it, and upload that too.
5. Wait.
Hope I've helped.
Best of luck.
GR.
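Steps 1 and 2 above can be sketched as head tags on the root domain's home page (hypothetical markup; be aware that Google has advised against combining rel=canonical with noindex, as the two send mixed signals, so many people pick one or the other):

```html
<!-- Hypothetical <head> of https://imagemcs.com/ -->
<!-- Step 1: canonical pointing at the subdomain -->
<link rel="canonical" href="https://uk.imagemcs.com/" />
<!-- Step 2: keep the root domain's pages out of the index -->
<meta name="robots" content="noindex" />
```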
Related Questions
-
How can we analyze duplication?
Howdy all, We have a few pages being flagged as duplicates by Google Search Console. However, we believe the content on these pages is distinctly different (for example, they return completely different search results, have different headings, and so on). An example of two pages Google finds to be duplicates is below. If anyone can spot what might be causing the duplicate issue here, I would really appreciate suggestions! Thanks in advance.
Technical SEO | | camerpon090 -
Disavow to all domains?
Hi there, I have several versions of my domain set up in Webmaster Tools. Should I upload my disavow file against all of these domains? For example...
Technical SEO | | niallfred
If I find a link pointing to http://www.mydomain.com from http://www.somespammysite.com, do I need to add a disavow file in Webmaster Tools for all my domain versions, or only the version the offending link points towards? So... only
http://www.mydomain.com
Or
http://www.mydomain.com
http://mydomain.com
https://www.mydomain.com
https://mydomain.com
-
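For reference, a disavow file is just a plain-text list, one entry per line; a minimal sketch using the hypothetical spammy site from the question:

```text
# Hypothetical disavow.txt
# A domain: line covers every URL on that host
domain:somespammysite.com
# Individual URLs can also be listed one per line
http://www.somespammysite.com/some-spammy-page.html
```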
What should I consider before setting up a sub domain?
Morning all! We've just been approached by IT. They've been asked to develop an online 'portal' where clients can upload and download materials. IT will be developing a portal that sits on the company network perimeter (hosted on our internal servers). The concept is that 3rd parties can get and update information regarding progressing cases; the first use will be for agencies who will retrieve records via the portal and then post reports after a consultation. However, I would like to have an automatic link to forward to the portal from the web address: oursite.com/dave We will look to create a robots.txt and anything else to prevent listings/indexing. Does any of the above mess with your SEO? The Directors have asked if they can have this on a sub-domain of our site. Is this wise? And are there any major SEO considerations for my team to worry about? Better still, have any of you had to deal with this before? If so, what happened? All the best, John
Technical SEO | | Muhammad-Isap0 -
How can I best handle parameters?
Thank you for your help in advance! I've read a ton of posts on this forum on this subject, and while they've been super helpful, I still don't feel entirely confident about the right approach to take. Forgive my very obvious noob questions - I'm still learning! The problem: I am launching a site (coursereport.com) which will feature a directory of schools. The URL for the schools directory will be coursereport.com/schools. The directory can be filtered by a number of fields listed here: Focus (ex: “Data Science”) Cost (ex: “$<5000”) City (ex: “Chicago”) State/Province (ex: “Illinois”) Country (ex: “Canada”) When a filter is applied to the directory page, the CMS produces a new page with URLs like these: coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago coursereport.com/schools?cost=$>5000&city=buffalo&state=newyork My questions: 1) Is the above parameter-based approach appropriate? I’ve seen other directory sites take a different approach (below) that would transform my examples into more “normal” URLs. coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago VERSUS coursereport.com/schools/focus/datascience/cost/$<5000/city/chicago (no params at all) 2) Assuming I use either approach above, isn't it likely that I will have duplicate content issues? Each filter does change on-page content, but there could be instances where 2 different URLs with different filters applied produce identical content (ex: focus=datascience&city=chicago OR focus=datascience&state=illinois). Do I need to specify a canonical URL to solve for that case? I understand at a high level how rel=canonical works, but I am having a hard time wrapping my head around which versions of the filtered results ought to be specified as the preferred versions.
For example, would I just take all of the /schools?focus=X combinations and call that the canonical version within any filtered page that contained additional parameters like cost or city? Should I be changing page titles for the unique filtered URLs? I read through a few Google resources to try to better understand how to best configure URL params via Webmaster Tools. Is my best bet just to follow the advice in the article below, define the rules for each parameter there, and not worry about using rel=canonical? https://support.google.com/webmasters/answer/1235687 An assortment of the other stuff I’ve read, for reference: http://www.wordtracker.com/academy/seo-clean-urls http://www.practicalecommerce.com/articles/3857-SEO-When-Product-Facets-and-Filters-Fail http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/59813/ http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html
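One common pattern for filtered directory pages is to canonicalize every parameter combination back to a preferred version. A minimal sketch (hypothetical markup; whether the canonical target should be the bare directory or a focus-only variant depends on which filtered pages you actually want ranking):

```html
<!-- Hypothetical <head> of a filtered page such as
     coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago,
     pointing back to the unfiltered directory -->
<link rel="canonical" href="https://www.coursereport.com/schools" />
```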
Technical SEO | | alovallo0 -
Searching on root domain words = ranking on > page 10 in SERP
Hello, Our website wingmancondoms.com (a new condom brand) is not ranking in Google on the keywords "wingman condom", and I don't know why. In Yahoo and Bing everything is all right. I saw on this forum that it may be best to change our language URLs to wingmancondoms.com/nl, /de and /fr instead of a direct URL like http://www.wingmancondoms.com/wingman-kondome (German translation). But is this our problem, or are there more problems? Google is indexing our pages well, no errors etc. Any other possibilities?
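If the language-folder route mentioned above is taken, hreflang annotations help Google serve the right version to each locale. A sketch with hypothetical URLs based on the question:

```html
<!-- Hypothetical hreflang tags in the <head> of each language version -->
<link rel="alternate" hreflang="en" href="http://www.wingmancondoms.com/" />
<link rel="alternate" hreflang="de" href="http://www.wingmancondoms.com/de/" />
<link rel="alternate" hreflang="nl" href="http://www.wingmancondoms.com/nl/" />
<link rel="alternate" hreflang="fr" href="http://www.wingmancondoms.com/fr/" />
```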
Technical SEO | | jogo0 -
Linking root domains and youtube
All of my competitors have high linking root domains from YouTube and ours isn't showing up, although we have 1.5 million views on YouTube. I tried adding our URL to the videos but it hasn't been recognized as a linking root domain. What should I do?? There's a ton of SEO juice here I want to tap into! watch?v=GTXFRTY4CCA&list=UUOcfF9LAHKedNSyk-gk5xDw&index=28
Technical SEO | | tonymartin0 -
Block a sub-domain from being indexed
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a subdomain from getting indexed by all search engines? One item I cannot use is the meta "nofollow" tag. Thanks! - Kyle
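One common answer, sketched below assuming the subdomain is served by Apache with mod_headers enabled: send an X-Robots-Tag response header on everything the subdomain serves. Unlike a robots.txt Disallow (which only blocks crawling, leaving already-indexed URLs in the results), a noindex directive removes pages from the index once they are recrawled:

```apache
# Hypothetical config in the subdomain's virtual host:
# tell all search engines not to index anything served here
Header set X-Robots-Tag "noindex"
```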
Technical SEO | | kchandler0 -
How I can deal with ajax pagination?
Hello! I would like to have your input about how I can deal with a specific page on my website. You can see the page here. As you can see, we have a list of 76 ski resorts; our pagination uses ajax, which means we have only one URL, and just below the list we have a simple list of all the ski resorts on this mountain, which shows all 76. I know it's quite bad, since we can reach the same ski resort with two different anchor links. Thank you very much in advance, Simon
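One way to make ajax pagination like the above crawlable is to render plain links as the fallback and let the script intercept them, so each page of the list still has its own URL. A sketch with hypothetical URLs and markup:

```html
<!-- Hypothetical pagination: real hrefs that crawlers can follow;
     the site's ajax script intercepts the clicks for human visitors -->
<nav class="pagination">
  <a href="/ski-resorts?page=1">1</a>
  <a href="/ski-resorts?page=2">2</a>
  <a href="/ski-resorts?page=3">3</a>
</nav>
```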
Technical SEO | | Alexandre_0