Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Noindex, Nofollow to previous domain
-
Hi,
My programmer recently made a horrible mistake by adding noindex, nofollow to our website, and I didn't notice for two days.
Around the same time, we bought a new domain and redirected the old domain to the new one:
The Old domain is: http://www.websitebuildersworld.com
and the new one is: http://www.websiteplanet.com
Now, unfortunately, I didn't notice the noindex,nofollow while it was on the old domain, and I redirected it to websiteplanet.com before I fixed it.
I fixed the problem around 10 hours ago on the new domain (www.websiteplanet.com),
but the old domain hasn't been re-indexed (yet). For example, if you search for WebsiteBuildersWorld in Google you won't reach the homepage, as Google dropped it because of the noindex,nofollow.
My question is:
Do you think Google will restore the websitebuildersworld homepage to its search results and then redirect it to websiteplanet? Or, because I redirected websitebuildersworld.com to websiteplanet.com before letting Google crawl websitebuildersworld.com without the noindex,nofollow, will it not get indexed again?
I hope I explained the problem well enough.
Looking forward to your valuable replies.
Thanks.
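For anyone checking for the same problem: a quick way to verify whether a page still carries the directive is to fetch it and parse its meta robots tag. A minimal Python sketch (the HTML string here is just an illustrative stand-in for a fetched page):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                self.directives.append((attrs.get("content") or "").lower())

def is_noindexed(html):
    """True if the page carries a meta robots 'noindex' directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# Illustrative page: this one would be dropped from the index
page = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
print(is_noindexed(page))  # True
```

Running this against every URL in a sitemap would catch a mistake like this within hours instead of days.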
-
Hi Andrea,
Thanks for your replies.
I decided to bring back the old domain and do a 302 redirect from the new domain to the old one.
I will let Google fully re-index the old one, and only then will I do the 301.
Would love to hear what you think about that.
Thanks,
Eliran.
-
Here's the concept at its core: how can Google index the new pages if it can't crawl the old URLs to find the redirects and process the 301s?
Fix that to fix your problem. The link I shared has a lot of good comments very centered on this general topic.
And I'm intentionally avoiding giving you an absolute solution because, quite frankly, I'm not familiar or involved enough with your site to feel comfortable doing so. Strategically, though, I'm happy to share ideas/best practices.
-
Hi Andrea,
Thanks for your reply. I have no worries about Google getting my rankings back; I'm sure it will.
The main problem is as you quoted: "In order for Google to index your new site it has to re-crawl the old site which is redirected there. As each url is accessed, the redirection is found and applied."
Are you suggesting that I need to put the websitebuildersworld.com domain back up and let Google re-crawl it, and only then redirect it?
Thanks,
Eliran.
-
The reason that comes to mind is that I basically didn't let Google see WebsiteBuildersWorld.com without the noindex,nofollow, so it wouldn't know what to redirect. The last time it visited websitebuildersworld.com it saw the noindex,nofollow, and now it can't visit it anymore because it's being redirected to websiteplanet.com.
-
"maybe I need to upload the website with the old domain again and let google re-index it and only then do the 301, what do you think about that ?"
I'm not 100% certain, but I can't think of any reason you would need to do that.
-
Hi Adam,
Yes, this is what I thought.
But I also had a weird thought: maybe I need to put the website back up on the old domain, let Google re-index it, and only then do the 301. What do you think about that? As for a) and b), yes, I will do that.
-
I think I get what you mean, and this stuff can get a bit tricky. First and foremost, it can take days/weeks/months to get things unclogged after an issue like this, and there's no promise you'll get exactly the same rankings as before.
Getting back to your original question, and not to kick you when you're down, but Google recommends against moving an entire site all at once precisely because you can miss major issues like this. Now, to your question, here's the answer: "In order for Google to index your new site it has to re-crawl the old site which is redirected there. As each url is accessed, the redirection is found and applied." I think that's what you're trying to get at?
There's more info here that may be worth you reading through: http://googlewebmastercentral.blogspot.com/2008/04/best-practices-when-moving-your-site.html
-
I think I understand. Since your site was de-indexed, Google has to start over indexing your site on the new domain. This is what should happen:
Google will follow any external links it finds pointing to your site, will find the 301 redirect, and will follow that to your new site. Google will then crawl your new domain. Google will "forward" most of the link juice from your backlinks to your new domain.
Via your internal link structure, the forwarded PageRank will be spread throughout your site. This will hopefully result in you regaining the rankings you previously had.
I assume you have forwarded each subpage on the old domain to the same page on the new domain?
I would also:
a) if you can, change over at least some of your backlinks to point to your new domain
b) build/attract links to your new domain
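On the page-by-page forwarding point: for a straight domain move, the mapping is usually just "same path, new host". A minimal Python sketch of what each old URL's 301 target would be (the hostnames are taken from this thread; everything else is illustrative):

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.websitebuildersworld.com"
NEW_HOST = "www.websiteplanet.com"

def redirect_target(old_url):
    """Map a URL on the old domain to the same path on the new domain,
    keeping path and query intact so each subpage 301s to its twin."""
    parts = urlsplit(old_url)
    if parts.netloc != OLD_HOST:
        return None  # not one of ours; no redirect
    return urlunsplit((parts.scheme, NEW_HOST, parts.path, parts.query, parts.fragment))

print(redirect_target("http://www.websitebuildersworld.com/reviews?page=2"))
# http://www.websiteplanet.com/reviews?page=2
```

In practice this rule would live in the web server config as a rewrite, but the logic is the same: never collapse every old URL onto the new homepage.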
-
The thing is, I didn't give Google the chance to index the website again on the old domain after I fixed the noindex,nofollow.
Quite hard to explain, but do you get what I mean?
-
Oh, OK. Then I would say: yes, you should regain your rankings, though it's possible it will take time. Some SEOs have reported it takes several months to regain their rankings after switching domains, but I personally have not had that issue.
-
Hi Adam,
Thanks for your reply, but I'm afraid it wasn't really my question.
The thing is, I wonder whether Google will re-index all our results, put them back in their spots, and just redirect them to the new domain.
Thanks,
Eliran.
-
Google is not going to index http://www.websitebuildersworld.com, because it redirects to http://www.websiteplanet.com. Google won't index a domain that redirects to another domain. It will index the domain where the content is hosted.
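In other words, the crawler follows the chain until it reaches a URL that doesn't redirect, and that final URL is the one that gets indexed. A toy sketch of that resolution, with a dict standing in for live HTTP responses (the URLs are from this thread; the function is purely illustrative):

```python
def resolve(url, redirects, max_hops=10):
    """Follow a chain of redirects (dict: source -> target) until a URL
    that doesn't redirect is reached; that final URL is what gets indexed."""
    seen = set()
    for _ in range(max_hops):
        if url not in redirects:
            return url  # no redirect here: this is the indexable URL
        if url in seen:
            raise ValueError("redirect loop at " + url)
        seen.add(url)
        url = redirects[url]
    raise ValueError("too many redirects")

redirects = {
    "http://www.websitebuildersworld.com/": "http://www.websiteplanet.com/",
}
print(resolve("http://www.websitebuildersworld.com/", redirects))
# http://www.websiteplanet.com/
```

The hop limit and loop guard mirror real crawler behaviour: chains that are too long or circular simply never resolve to an indexable page.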
-
Hi,
Thanks for your reply, much appreciated.
Yes, the sitemap is submitted in WMT: both the old domain's sitemap and the new domain's sitemap.
So in your opinion everything should be back to normal, correct?
And yes, a very big stuff up: he uploaded the header from the demo file with the noindex,nofollow... It cost me a lot of money, and around 80% of my pages, including the homepage, got deleted from the SERPs.
-
Go into WMT, if you have an account, and resubmit your sitemap for websitebuildersworld.com, or simply search Google for "submit site" or something similar and find where you can submit your site to Google again.
It should get indexed again anyway, because you should have some links out there somewhere that the bots will detect and follow to your site.
Quite a big stuff up, though, on your programmer's part.
Good luck
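On the sitemap-resubmission point: a minimal, valid sitemap.xml can be generated in a few lines. A Python sketch using the standard library (the URL is from this thread; a real sitemap would list every page):

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap.xml string, one <url><loc> entry per URL."""
    ET.register_namespace("", NS)
    urlset = ET.Element("{%s}urlset" % NS)
    for u in urls:
        url_el = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url_el, "{%s}loc" % NS).text = u
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://www.websitebuildersworld.com/"])
print(xml)
```

The resulting file just needs to be served from the domain's root and resubmitted in WMT so the bots have a fresh list of URLs to re-crawl.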