Subdomain replaced domain in Google SERP
-
Good morning,
This is my first post. I found many Q&As here that mostly answer my question, but just to be sure we do this right, I'm hoping the community can take a peek at my thinking below:
Problem: We rank #1 for relevant terms such as "custom poker chips", for example. We have a development website on a subdomain (http://dev.chiplab.com). On Saturday our live 'chiplab.com' main domain was replaced by 'dev.chiplab.com' in the SERP.
Suspected Cause: We did not add a NOFOLLOW tag in the header. We also did not DISALLOW the subdomain in robots.txt. We could also have put the 'dev.chiplab.com' subdomain behind a password wall.
Solution: Add a NOFOLLOW header, update robots.txt on the subdomain, and disallow crawling/indexing.
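Concretely, I'm picturing something along these lines (just a rough sketch - a robots meta tag on every dev page plus a blanket disallow in robots.txt):

    <!-- in the <head> of every page on dev.chiplab.com -->
    <meta name="robots" content="noindex, nofollow">

    # dev.chiplab.com/robots.txt - blocks crawling of the entire subdomain
    User-agent: *
    Disallow: /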
Question: If we remove the subdomain from Google using WMT, will this drop us completely from the SERP? In other words, we would ideally like our root chiplab.com domain to replace the subdomain and get us back to where we were before Saturday. If the removal tool in WMT just removes the listing completely, is the only solution to wait until the site is recrawled and reindexed and hope the root chiplab.com domain ranks in place of the subdomain again?
Thank you for your time,
Chase
-
Hi Chase,
Removing dev via Webmaster Tools should do the trick for now. Since Google won't be able to get to dev anymore, you should be safe.
Adding both noindex and password protection isn't needed: because the site is password protected, Google won't get to see the noindex on the pages, so you only need one of the two. No need to change anything now, though - the password protection is safe.
You asked: "As expected, 'dev.chiplab.com' was removed from the SERP. Now I'm a bit worried that the link equity was transferred for good from 'www.chiplab.com' to the subdomain. That's not possible, right?"
Yes, that's right - it's not possible, so you are good. Only 301 redirects tell Google to pass equity, so all good.
-
No worries, that's what this community is here for!
Google views subdomains as different entities. They have different authority metrics and therefore different ranking power. Removing a URL on one subdomain won't have any effect on its sibling over on a different subdomain (for example, dev. and www.).
Good call keeping the Disallow: / in the dev.chiplab.com/robots.txt file - I forgot to mention that you should leave it there to keep the subdomain from being crawled.
The info: query is the one you'll want to keep an eye on - the info: operator can be used to show you what Google has indexed as your 'canonical' homepage.
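For example, the two checks look something like this (nothing fancy, just the standard search operators typed into Google):

    info:chiplab.com       shows which URL Google has indexed as the canonical homepage
    site:dev.chiplab.com   shows whether any dev. URLs are still sitting in the index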
-
Hi Logan,
Last follow-up. I swear.
Since I'm pretty new to this, I got scared and cancelled the 'dev.chiplab.com' URL removal request. I did this because I didn't want to go up to 14 days without any traffic (that's the estimate I found for how long the Google SERP can take to update, even though we used "Fetch as Googlebot" in GWT). I may be wrong on the SERP update time?
So what I did was add a 301 permanent redirect from 'dev.chiplab.com' to 'www.chiplab.com'. I've kept the NOFOLLOW/NOINDEX header on all 'dev' subdomain pages, of course, and I've kept the DISALLOW in robots.txt for the dev.chiplab.com site specifically. For now I just plan to put work on the 'dev' site on hold (because I can't test anything with the redirects in place). Then hopefully, in 14 days or so, the domain name will change gracefully in the Google SERP from dev.chiplab.com to www.chiplab.com. I did all of this because of how many sales we would lose if it took 14 days to start ranking again for this term. Good?
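In case it helps anyone else, the redirect is just a blanket rule along these lines (a sketch assuming an Apache server with mod_rewrite - adjust for your own setup):

    # .htaccess on dev.chiplab.com - permanently (301) redirect every request
    # to the matching path on the live www site
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^dev\.chiplab\.com$ [NC]
    RewriteRule ^(.*)$ http://www.chiplab.com/$1 [R=301,L]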
Best,
Chase
-
You should be all set! I wouldn't worry about link equity, but it certainly wouldn't hurt to keep an eye on your domain authority over the next few days.
-
Hi Logan,
Thanks for the fast reply!
We did the following:
- Added NOINDEX on the entire subdomain
- Temporarily removed 'dev.chiplab.com' using Google Webmaster Tools
- Password protected 'dev.chiplab.com'
As expected, 'dev.chiplab.com' was removed from the SERP. Now I'm a bit worried that the link equity was transferred for good from 'www.chiplab.com' to the subdomain. That's not possible, right? Do we now just wait until Googlebot crawls 'www.chiplab.com' and hope that it is restored to #1?
Thank you for your time (+Shawn, +Matt, +eyqpaq),
Chase
-
Noindex would be the easiest way.
I've seen people with the same issue fix it by adding rel=canonical on the dev pages pointing to the live site, and the main site came back step by step with no interruptions...
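Something along these lines on each dev page, in other words (sketch only - the path here is just an example):

    <!-- in the <head> of a dev page such as http://dev.chiplab.com/custom-poker-chips -->
    <link rel="canonical" href="http://www.chiplab.com/custom-poker-chips" />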
Cheers.
-
Just like Chase said, noindex your dev site to let the search engines know that it should not show in search. I do this on my dev sites every time.
-
The ideal method would be to make the dev site password protected. What I would do is 301 redirect the dev pages to the corresponding pages on the live site, and then, once the SERP refreshes, make the dev site password protected.
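Password protection can be as simple as HTTP basic auth - a minimal sketch assuming Apache, with placeholder paths and username:

    # .htaccess on dev.chiplab.com - require a login for every request
    AuthType Basic
    AuthName "Development site"
    AuthUserFile /home/chiplab/.htpasswd
    Require valid-user

    # create the credentials file once with:
    # htpasswd -c /home/chiplab/.htpasswd devuser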
-
Hi Chase,
Removing the subdomain within Search Console (WMT) will not remove any of your WWW URLs. Since you have different properties in Search Console for each, they are treated separately. Keep in mind that the removal is only temporary, though.
The most sure-fire way to ensure you don't get dev. URLs indexed is to put a NOINDEX tag on that entire subdomain. NOFOLLOW simply means that links on whatever page that tag is on won't be followed by bots.
Remember, crawling and indexing are different things. For example, if your live www. site had an absolute link somewhere in the mix pointing at dev.chiplab.com, a bot would still reach that dev page, since you presumably haven't nofollowed your live site. The same goes for a robots.txt disallow: it only prevents crawling, not indexing. A disallowed URL can still end up in the index if it's linked from elsewhere; such results typically show in the SERP with a note that no description is available because of the site's robots.txt.
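If it helps, a site-wide NOINDEX doesn't have to mean editing every template; it can be sent as a response header instead. A minimal sketch, assuming the dev subdomain runs on Apache with mod_headers enabled (that's an assumption, not something confirmed in this thread):

    # .htaccess on dev.chiplab.com - attach a noindex directive to every response
    # (note: a robots.txt Disallow stops crawling, so a blocked bot would never see this header)
    Header set X-Robots-Tag "noindex, nofollow"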