Subdomain replaced domain in Google SERP
-
Good morning,
This is my first post. I found many Q&As here that mostly answer my question, but just to be sure we do this right, I'm hoping the community can take a peek at my thinking below:
Problem: We rank #1 for "custom poker chips", for example. We have a development website on a subdomain (http://dev.chiplab.com). On Saturday, our live 'chiplab.com' main domain was replaced by 'dev.chiplab.com' in the SERP.
Suspected cause: We did not add a NOFOLLOW tag to the page headers, and we did not DISALLOW the subdomain in robots.txt. We also could have put 'dev.chiplab.com' behind a password wall, but didn't.
Solution: Add the NOFOLLOW header tag, and update robots.txt on the subdomain to disallow crawling/indexing.
Question: If we remove the subdomain from Google using WMT, will this drop us completely from the SERP? In other words, we would ideally like our root chiplab.com domain to replace the subdomain to get us back to where we were before Saturday. If the removal tool in WMT just removes the link completely, then is the only solution to wait until the site is recrawled and reindexed and hope the root chiplab.com domain ranks in place of the subdomain again?
Thank you for your time,
Chase
-
Hi Chase,
Removing dev via Webmaster Tools should do the trick for now. Since Google won't reach dev anymore, you should be safe.
Adding both noindex and password protection is not needed: since the site is password protected, Google won't get to see the noindex on the pages anyway. So you should only do one of the two. No need to change anything now; the password protection is safe.
> As expected, 'dev.chiplab.com' was removed from the SERP. Now I'm a bit worried that the link equity was transferred for good from 'www.chiplab.com' to the subdomain. That's not possible, right?
Yes, that's not possible, so you are good. Only a 301 redirect tells Google to pass equity, so all good.
-
No worries, that's what this community is here for!
Google views subdomains as different entities. They have different authority metrics and therefore different ranking power. Removing a URL on one subdomain won't have any effect on its sibling on a different subdomain (for example, dev. and www.).
Good call keeping the 'Disallow: /' in the dev.chiplab.com/robots.txt file. I forgot to mention that you should leave it there, to keep the subdomain from being crawled.
The query to keep an eye on is info:www.chiplab.com. The info: operator shows you which URL Google has indexed as your 'canonical' homepage.
-
Hi Logan,
Last follow-up. I swear.
Since I'm pretty new to this, I got scared and cancelled the 'dev.chiplab.com' removal request. I did this because I didn't want to go up to 14 days without any traffic (that's the estimated time I found the Google SERP can take to update, even though we used 'Fetch as Googlebot' in GWT). Am I wrong on the SERP update time?
So what I did was add a 301 permanent redirect from 'dev.chiplab.com' to 'www.chiplab.com'. I've kept the NOFOLLOW/NOINDEX tags on all 'dev' subdomain pages, of course, and I've kept the DISALLOW in robots.txt for the dev.chiplab.com site specifically. For now I'll hold off on work in the 'dev' site (I can't test anything with the redirects in place). And then, hopefully, in 14 days or so the domain will change gracefully in the Google SERP from dev.chiplab.com to www.chiplab.com. I did all of this because of how many sales we would lose if it took 14 days to start ranking again for this term. Good?
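In case it helps anyone else, the redirect rule looks roughly like this if you're on Apache with mod_rewrite (a simplified sketch, not our exact config):

```apache
# Redirect every dev.chiplab.com URL to its www equivalent (sketch)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.chiplab\.com$ [NC]
RewriteRule ^(.*)$ http://www.chiplab.com/$1 [R=301,L]
```

The `$1` preserves the path, so deep links on dev also land on the matching www page.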
Best,
Chase
-
You should be all set! I wouldn't worry about link equity, but it certainly wouldn't hurt to keep an eye on your domain authority over the next few days.
-
Hi Logan,
Thanks for fast reply!
We did the following:
- Added NOINDEX on the entire subdomain
- Temporarily removed 'dev.chiplab.com' using Google Webmaster Tools
- Password protected 'dev.chiplab.com'
As expected, 'dev.chiplab.com' was removed from the SERP. Now I'm a bit worried that the link equity was transferred for good from 'www.chiplab.com' to the subdomain. That's not possible, right? Do we now just wait until Googlebot crawls 'www.chiplab.com' and hope it is restored to #1?
Thank you for your time (+Shawn, +Matt, +eyqpaq),
Chase
-
noindex would be the easiest way.
I've seen people with the same issue fix it by adding rel=canonical tags on dev pointing to the main site, and so the main site came back step by step with no interruptions...
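That would look something like this on each dev page (the path here is just an example):

```html
<!-- On dev.chiplab.com/some-page/, pointing at the live equivalent -->
<link rel="canonical" href="http://www.chiplab.com/some-page/">
```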
Cheers.
-
As mentioned above, noindex your dev site to let the search engines know that it should not show in search. I do this on my dev sites every time.
-
The ideal method is to make the dev site password protected. What I would do is 301 redirect the dev pages to the corresponding live pages, and then, once the SERP refreshes, put the dev site behind a password.
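If you go the password route, HTTP basic auth is enough to keep Googlebot out. A sketch for Apache (the file paths are placeholders, not your actual setup):

```apache
# .htaccess on dev.chiplab.com: require a login for every request
AuthType Basic
AuthName "Development site"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Create the password file with the `htpasswd` utility; any request without valid credentials then gets a 401, which search engines won't index.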
-
Hi Chase,
Removing the subdomain within Search Console (WMT) will not remove the rest of your www URLs. Since you have separate properties in Search Console for each, they are treated separately. That removal is only temporary, though.
The most sure-fire way to ensure you don't get dev. URLs indexed is to put a NOINDEX tag on that entire subdomain. NOFOLLOW simply means that links on whatever page that tag is on won't be followed by bots.
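For reference, that NOINDEX tag is a one-liner in the head of every dev page:

```html
<!-- In the <head> of every page on dev.chiplab.com -->
<meta name="robots" content="noindex">
```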
Remember, crawling and indexing are different things. For example, if your live www. site had an absolute link somewhere in the mix pointing at dev.chiplab.com, then (since you presumably haven't nofollowed your live site) a bot will still reach that page. The same goes for a robots.txt disallow: it only prevents crawling, not indexing. Google can still index a disallowed URL, without crawling it, if other pages link to it.
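And for clarity, the disallow being discussed is just this (again: it blocks crawling only, not indexing):

```
# dev.chiplab.com/robots.txt
User-agent: *
Disallow: /
```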