What happens when a de-indexed subdomain is redirected to another de-indexed subdomain? What happens to the link juice?
-
Hi all,
We are planning to de-index subdomain A and redirect it to subdomain B. We now need to de-index subdomain B as well. What happens to the link juice or PageRank they gained from the hundreds of thousands of backlinks pointing to them? Will there be any ranking impact on the main domain? The backlinks to these subdomains are not very relevant to the main domain's content.
Thanks
-
Exactly as you said.
I wonder what ranking fluctuation or dip we can expect on the main domain due to de-indexing these subdomains. Someone claims that rankings will drop, but how? Subdomain B will still be there with all of its backlinks, so technically the backlinks remain.
Please let me know your valuable thoughts on this.
Thanks
-
Hi there,
To confirm: subdomain A has been indexed for a number of years and has gained a high number of backlinks in that time? However, you are now no-indexing subdomain A, 301 redirecting A to B, and then also no-indexing subdomain B?
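If it helps, here is a minimal Python sketch (the hostname and path are placeholders, not your actual subdomains) that follows the redirect chain hop by hop and reports any noindex signal it sees, either an X-Robots-Tag header or a meta robots tag, so you can confirm exactly which signals Google is being given at each step.

```python
# Minimal sketch for auditing a "noindex + 301" setup.
# The hostname and path below are placeholders, not the subdomains discussed above.
import requests
from urllib.parse import urljoin

def audit_redirect_chain(url, max_hops=10):
    """Follow redirects one hop at a time, reporting status codes and noindex signals."""
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        signals = []
        robots_header = resp.headers.get("X-Robots-Tag", "")
        if "noindex" in robots_header.lower():
            signals.append(f"X-Robots-Tag: {robots_header}")
        body = resp.text.lower()
        if 'name="robots"' in body and "noindex" in body:
            signals.append("meta robots noindex")
        print(f"{resp.status_code}  {url}  {', '.join(signals) or 'no noindex signal'}")
        if resp.status_code in (301, 302, 307, 308):
            url = urljoin(url, resp.headers["Location"])  # handle relative redirects
        else:
            break

audit_redirect_chain("https://sub-a.example.com/some-page/")
```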
Thanks
Related Questions
-
If I have an https page with an http img that redirects to an https img, is it still considered by Google to be a mixed content page?
With Google starting to crack down on mixed content, I was wondering: if I have an https page with an http img that redirects to an https img, is it still considered by Google to be a mixed content page? For example, in an old blog article there are images that weren't updated when the blog migrated to https, but were just 301ed to new https images. Is it still considered a mixed content page?
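One way to see what is actually in play here is sketched below (the page URL is a placeholder): pull the page, list any http:// image sources still present in the markup, and check where each of them redirects. It only reports what the markup and the redirects look like, not how Google or browsers will treat them.

```python
# Sketch: list http:// image sources on an https page and see where each redirects.
# The page URL is a placeholder.
import re
import requests

def audit_http_images(page_url):
    html = requests.get(page_url, timeout=10).text
    http_imgs = re.findall(r'<img[^>]+src=["\'](http://[^"\']+)["\']', html, re.IGNORECASE)
    for src in http_imgs:
        resp = requests.get(src, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        redirects_to_https = resp.status_code in (301, 302) and location.startswith("https://")
        print(f"{src} -> {resp.status_code} {location or '(no redirect)'}"
              f"{' [redirects to https]' if redirects_to_https else ''}")

audit_http_images("https://blog.example.com/old-article/")
```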
Algorithm Updates | David-Stern
-
How to take down a subdomain which is receiving many spammy backlinks?
Hi all, We have a subdomain which has had little engagement for the last few years. Over time, many spammy backlinks have pointed to this subdomain, although there are relevant backlinks too. We have deleted most of the pages that contained spammy content or attracted spammy backlinks. Still, I'm confused about whether to take this subdomain down or keep it. The dilemma is between "the relevant backlinks might be helping our website" and "the spammy backlinks are causing a drop in rankings". Thanks
Algorithm Updates | vtmoz
-
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat and the geek when you truly need them? So SEL (Search Engine Land) said recently that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand or Eric or other Mozzers aka TAGFEE'ers to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand coded. Things do very well in SEO because we designed it to do so. The site is about 6 years in its current incarnation, with a very simple e-commerce cart (again basically hand coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assume we have 500 products and 100 categories: that yields at least 50,000 pages, and with other aspects of the faceted search, it could easily create 10X that many pages.
4 - In ScreamingFrog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. ScreamingFrog has also been known to crash while spidering, and we've discovered thousands of URLs of live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site as well as out in the wild (in Google's Supplemental Index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation? Like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well at 100 pages up to 10,000 pages or more might incur a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, are inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal link juice, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
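To make the scale of the concern concrete, here is a rough back-of-the-envelope sketch in Python; the facet names and value counts are invented for illustration and are not from the actual site. It reproduces the 500 x 100 estimate above and then shows how quickly the URL count multiplies once facet filters can be combined on category pages.

```python
# Rough illustration of how faceted navigation multiplies crawlable URLs.
# The facet names and value counts are made up for illustration only.
from math import prod
from itertools import combinations

categories = 100
products = 500
facets = {"color": 10, "size": 6, "brand": 25, "price_band": 5}

# The question's own estimate: every product reachable under every category.
product_urls = categories * products
print(f"product x category URLs: {product_urls:,}")  # 50,000

# Each category page can also be filtered by any subset of facets,
# and every chosen facet contributes one of its values.
filtered_urls = 0
for r in range(1, len(facets) + 1):
    for combo in combinations(facets.values(), r):
        filtered_urls += prod(combo)
print(f"filtered category URLs: {categories * filtered_urls:,}")
```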
Algorithm Updates | seo_plus
-
Are Link Exchanges and Reciprocal Links Dead Nowadays?
Hello, As we know, Rand (randfish) discusses the decline of old link building practices and the rise of new (old) link earning strategies, and he has also discussed link exchanges and reciprocal links. I have a few questions related to link exchanges and reciprocal links:
1) Are reciprocal links still important for link building strategies nowadays?
2) Should webmasters pursue reciprocal links or not?
3) Can reciprocal links boost search engine rankings?
4) Do reciprocal links have a negative impact in search engines?
Regards,
Sumit
Algorithm Updates | sumit60
-
Stop Google indexing CDN pages
Just when I thought I'd seen it all, Google hits me with another nasty surprise! I have a CDN to deliver images, JS and CSS to visitors around the world. I have no links to static HTML pages on the site, as far as I can tell, but someone else may have - perhaps a scraper site? Google has decided the static pages they were able to access through the CDN have more value than my real pages, and they seem to be slowly replacing my pages in the index with the static pages. Anyone got an idea on how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of that I can have a robots file there. It could be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for content that only contains the JS and CSS? Have you seen this problem and beaten it? (Of course the next thing is Roger might look at Google results and start crawling them too, LOL) P.S. The reason I am not asking this question in the Google forums is that others have asked this question many times and nobody at Google has bothered to answer over the past 5 months, and nobody who did try gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet because you guys are always willing to try.
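As a starting point for diagnosing this, a small sketch like the one below (the CDN hostname and path are placeholders) shows exactly what the CDN returns for one of the indexed static URLs: the status code, the Content-Type, and whether any X-Robots-Tag or Link header is present. Many CDNs can be configured to add an X-Robots-Tag: noindex header to responses, which would be one way to send a noindex signal without being able to place a robots file on the CDN host, though whether that is possible depends on the provider.

```python
# Sketch: inspect what the CDN returns for a URL that Google has indexed.
# The hostname and path are placeholders for the real CDN URLs.
import requests

def inspect_cdn_url(url):
    resp = requests.get(url, timeout=10)
    print(f"URL:          {url}")
    print(f"Status:       {resp.status_code}")
    print(f"Content-Type: {resp.headers.get('Content-Type', '(none)')}")
    print(f"X-Robots-Tag: {resp.headers.get('X-Robots-Tag', '(not set)')}")
    print(f"Link header:  {resp.headers.get('Link', '(not set)')}")

inspect_cdn_url("https://cdn.example.com/some-static-page.html")
```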
Algorithm Updates | loopyal
-
Flash vs HTML links in SEO
For a small Flash slideshow with text and links on various slides within it, are such text and links as easily indexable (or indexable at all) compared to static HTML text on a webpage?
Algorithm Updates | heritageseo
-
SEO Link Building / Article Distribution
Quick question regarding link building and off-page SEO: why isn't article distribution via wire services considered a "duplicate content" issue by Google? i.e., if I take one article and post it across 50 (dofollow) websites, press release sites, and blogs. I would love to hear your thoughts and feedback on this. Regards, Sammy
Algorithm Updates | revsystems.com