After an entire site is noindex'd, how long to recover?
-
A programmer 'accidentally' put <meta name="robots" content="noindex" /> into every single page of one of my sites (articles, landing pages, home page, etc.). This happened on Monday, and we just noticed today.
Ugh...
We've fixed the issue; how long will it take to get reindexed? Will we immediately regain the same positions for our keywords? Any tips?
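(For anyone in the same spot: before waiting on Google, it's worth confirming the fix is actually live on every page template. Here's a minimal sketch that fetches a few key pages and flags any that still carry a robots noindex meta tag; the URLs are placeholders, and the regex assumes the attribute order quoted above, so a real check might use an HTML parser instead.)

```python
import re
import urllib.request

# Placeholder URLs; in practice, test one of each template (article, landing
# page, home page) or pull the full list from your sitemap.
urls = [
    "https://www.example.com/",
    "https://www.example.com/articles/some-article/",
    "https://www.example.com/landing-page/",
]

# Matches <meta name="robots" ... content="...noindex...">; assumes name
# appears before content, as in the tag quoted above.
noindex = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex', re.I)

for url in urls:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    print("STILL NOINDEXED" if noindex.search(html) else "ok", url)
```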
-
The first thing I would do is log in to my Webmaster Tools account and submit a reconsideration request: https://www.google.com/webmasters/tools/reconsideration?pli=1 . This goes a long way with Google. Just explain everything as it happened and confirm that the problem has been fixed.
I hope that helps!
-
The positioning was the same when the site was reincluded.
-
Did you see the same positioning as you had before the pages were yanked from the index?
-
One of my colleagues did something very similar a few weeks ago to a few key sections of a website, and it only took around four hours for the main pages to be reincluded in the index. I don't know about a whole site, but I would advise doing everything you can to get the different sections re-crawled (building links, social sharing, pushing Google to index the pages in Webmaster Tools, etc.). Good luck, Paul
-
Right? Yeah, we've done all that; I just want to make sure my expectations are set correctly for a site-wide reindex: the realistic repositioning timeline, etc.
-
Wow, that's quite a goof!
Just remove the noindex tags and resubmit your site for indexing in Google Webmaster Tools. I think you should regain your positions in the SERPs once you're indexed again.
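One low-effort way to nudge that recrawl along is to regenerate your XML sitemap with fresh lastmod dates and resubmit it in the Sitemaps section of Webmaster Tools. A rough sketch (the URLs are placeholders; export the real list from your CMS or crawler):

```python
from datetime import date

# Placeholder URLs; replace with the real list exported from your CMS or crawler.
urls = [
    "https://www.example.com/",
    "https://www.example.com/articles/some-article/",
    "https://www.example.com/landing-page/",
]

today = date.today().isoformat()
entries = "\n".join(
    f"  <url><loc>{u}</loc><lastmod>{today}</lastmod></url>" for u in urls
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    f.write(entries + "\n</urlset>\n")
```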
Related Questions
-
No: 'noindex' detected in 'robots' meta tag
I'm getting an error in Search Console that pages on my site show "No: 'noindex' detected in 'robots' meta tag". However, when I inspect the page's HTML, it does not show noindex; in fact, it shows index, follow. The majority of pages show the error and are not indexed by Google, and I'm not sure why this is happening. Unfortunately I can't post images on here, but I've linked some URLs below. The page below shows the error in Search Console... https://mixeddigitaleduconsulting.com/ As does this one. https://mixeddigitaleduconsulting.com/independent-school-marketing-communications/ However, this page does not have the error and is indexed by Google, and its meta robots tag looks identical. https://mixeddigitaleduconsulting.com/blog/leadership-team/jill-goodman/ Any and all help is appreciated.
Technical SEO | Sean_White_Consult
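One thing worth ruling out for the question above (a guess, not something confirmed from the post): Google also honors a noindex sent in the X-Robots-Tag HTTP response header, which would never show up in the page's HTML. A quick sketch that prints both the header and the meta tag for the three URLs mentioned; if the header comes back empty everywhere, the next step would be comparing the rendered HTML Google sees in the URL Inspection tool against what your browser gets.

```python
import re
import urllib.request

# The three URLs from the question above.
urls = [
    "https://mixeddigitaleduconsulting.com/",
    "https://mixeddigitaleduconsulting.com/independent-school-marketing-communications/",
    "https://mixeddigitaleduconsulting.com/blog/leadership-team/jill-goodman/",
]

# Crude meta-robots extractor; assumes name appears before content in the tag.
meta_robots = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']', re.I)

for url in urls:
    resp = urllib.request.urlopen(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag")  # noindex can be served here too
    match = meta_robots.search(resp.read().decode("utf-8", "replace"))
    print(url)
    print("  X-Robots-Tag header:", header)
    print("  meta robots tag:", match.group(1) if match else "none found")
```
-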
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using the Google Maps creator and embedded it on our site. However, when I ran fetch and render through Search Console, it said the map was blocked by our robots.txt file. I read in the Search Console Help section that: "For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot." I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block its maps from its own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
Any assistance would be greatly appreciated. Thanks, Ruben
Technical SEO | KempRugeLawGroup
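For what it's worth, you can test specific URLs against the rules quoted above offline to see which map resources fall under the final Disallow: / line. A sketch using Python's standard-library robots.txt parser (the test URLs are made up; also note Python applies rules in file order rather than Google's longest-match rule, though for this file the outcome is the same because the Allow lines come first):

```python
from urllib import robotparser

# The rules quoted above, pasted in as-is (some Allow lines omitted for brevity).
rules = """\
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/GeocodeService.Search
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Hypothetical resource URLs an embedded map might request.
for url in (
    "https://maps.googleapis.com/maps/api/js?key=EXAMPLE",
    "https://maps.googleapis.com/maps/vt/some-tile",
):
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(verdict, url)
```

If the resources your embed actually loads come back "blocked", that would match the Search Console message, and since the robots.txt belongs to the maps domain rather than your site, there is nothing to change in your own file.
-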
Hreflang Tags - error: 'en' - no return tags
Hello, We have recently implemented hreflang tags to improve the findability of our content in each specific language. However, Webmaster Tools is giving us this error... Does anyone know what it means and how to solve it? Here I attach a screenshot: http://screencast.com/t/a4AsqLNtF6J Thanks for your help!
Technical SEO | Kilgray
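That error generally means one page lists an 'en' alternate, but the 'en' page does not list it back; hreflang annotations have to be reciprocal before Google trusts them. A rough sketch that crawls the alternates of one page and checks for the return tag (the starting URL is a placeholder, and the regex assumes the rel/hreflang/href attribute order, so a real check should use an HTML parser):

```python
import re
import urllib.request

# Assumes attributes appear in the order rel, hreflang, href.
hreflang_links = re.compile(
    r'<link[^>]+rel=["\']alternate["\'][^>]+hreflang=["\']([^"\']+)["\'][^>]+href=["\']([^"\']+)["\']',
    re.I)

def alternates(url):
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    return dict(hreflang_links.findall(html))  # {lang: href}

start = "https://www.example.com/en/"  # placeholder: one of your 'en' pages
for lang, alt_url in alternates(start).items():
    # Return-tag requirement: alt_url must list `start` among its own alternates.
    back = alternates(alt_url).values()
    status = "return tag OK" if start in back else "MISSING return tag"
    print(lang, alt_url, "->", status)
```
-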
Can I speed up removal of cache for 301'd page on unverified website?
I recently asked another website to remove a page from their site (I have no control over this website), and they have now 301'd the old URL to another; this is just what I wanted. My only aim now is to see the Google cache for that page removed as quickly as possible. I'm not sure that asking the website to remove the URL via WMT is the right way to go, and I assume I should just wait for Google to pick up the 301 and naturally remove the cache. But are there any recommended methods I can use to speed this process up? The old URL was last cached on 3 Oct 2014, so not too long ago. I don't think the URL is linked from any other page on the Internet now, but I guess it would still be in Google's list of URLs to crawl. Should I sit back and wait (who knows how long that would take?), or would adding a link to the old URL from a website I manage speed things up? Or would it help to submit the old URL to Google's submission tool?
Technical SEO | Mark_Reynolds
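While waiting, it's worth confirming the old URL really answers with a 301 (not a 302) and points at the page you expect. A minimal check with the standard library (the URL is a placeholder, since the question doesn't name it):

```python
import http.client
from urllib.parse import urlparse

old_url = "http://example.com/removed-page"  # placeholder for the old URL
parts = urlparse(old_url)
conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection

# A HEAD request; http.client never follows redirects, so we see the raw response.
conn = conn_cls(parts.netloc, timeout=10)
conn.request("HEAD", parts.path or "/")
resp = conn.getresponse()
print(resp.status)                  # expect 301
print(resp.getheader("Location"))   # expect the new destination URL
```
-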
How long does it take before a site is back in the SERPs after a manual spam action is revoked?
Hi, A client of ours has a website with a manual spam action (duplicate content). Because they made some mistakes with redirects while moving the site from one URL to another, Google penalized the site. We fixed the errors and the spam action has been revoked. My question is how long it usually takes before the first results are back in the SERPs. In WMT Google says "some time", but has anyone got more precise information on it? Best regards, Sam
Technical SEO | U-Digital
-
According to one of my PRO campaigns, I have 250+ pages with duplicate content. Could my empty 'tag' pages be to blame?
Like I said, one of my Moz reports is showing 250+ pages with duplicate content. Should I just delete the tag pages? Is that worth my time? And how do I alert SEOmoz that the changes have been made, so that they show up in my next report?
Technical SEO | TylerAbernethy
-
Strategy for recovering from Penguin
I have a web site that has been hit hard by the Penguin update. I believe the main cause of our problem has been links from low-quality blogs and article sites with overly optimized keyword anchor text. Some questions I have are:
1. I have noticed that we still have good rankings for long-tail search terms on pages that did not have unnatural links. This leads me to believe that the penalty is URL specific, i.e. only URLs with unnatural linking patterns have been penalized. Is that correct?
2. Are URLs that have been penalized permanently tainted, to the point that it is not worth adding content to them and continuing to get quality links to them? Should new content go on new pages that have no history and thus no penalty, or is the age of a previously highly ranked page still of great benefit in ranking?
3. Is it likely that the penalty will go away over time if there are no more unnatural links coming in?
Technical SEO | mhkatz
-
Optimize flash site
Hello, How can we optimize a site like this - http://www.ziba.com.au/ - given that the whole site is in Flash? What are the alternatives?
Technical SEO | seoug_2005