How long does it take for a page to show up in Google results after removing noindex?
-
Hi folks,
A client of mine created a new page and used a meta robots noindex tag to keep it out of the results until they were ready to launch. The problem is that Google crawled the page while the tag was in place, and now, after removing the meta robots noindex, the page still does not show up in the results.
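For reference, the tag in question is the standard meta robots directive in the page's <head>; the exact markup on the client's page may vary, but it is typically something like this:

    <meta name="robots" content="noindex">

Removing that line (or switching its content value back to "index, follow") is what should let the page back into the index.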
We've fetched it with Fetch as Googlebot and then submitted it using the button that appears. We've also included the page in sitemap.xml and used the old Google URL submission form at https://www.google.com/webmasters/tools/submit-url
Does anyone know how long it will take for Google to show the page AFTER the meta robots noindex is removed? Is there a reliable reference for this? I couldn't find any Google video or post about it.
I know it will appear within a few days, but I'd like a good reference for the future.
Thanks.
-
Just to let you know, the page was indexed in less than 24 hours. We didn't use Tony's tip (share on G+), but we did all of the following:
- Used the Fetch as Googlebot tool in GWT
- Submitted the URL using the button that appears after fetching as Googlebot
- Added some sitewide links to the page
- Included the page in our sitemap.xml (a sample entry is sketched below)
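For anyone repeating this, the sitemap entry itself is just a standard <url> element. A minimal sketch, assuming a hypothetical URL:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/new-page/</loc>
      </url>
    </urlset>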
Thanks to everyone who added insights and tips!
-
Thanks for the tip, Tony! We haven't tried this yet.
-
It depends on the site. If the site is Microsoft.com with a link from the home page, you can expect it to appear the same day.
If it's on boringoldsite.com, it could take a week or more.
But mostly it's a few days.
-
You can do two things in Google Webmaster Tools to estimate how long it will take for a page to be indexed, or even speed up the reindexation process:
1) Use Google's crawl rate and indexation reports
2) Use Fetch as Googlebot in Google Webmaster Tools
-
Hi Fabio,
Share the page in question on G+. Indexation of G+ posts (including links) can be as quick as half an hour. Also make sure the website is linked from the client's main G+ profile as a custom link.
-
We had a subdomain website (very small... four or five pages) that was blocked via the robots.txt file for two or three years. When we decided to have it indexed, I did just what you did: fetched via GWT and clicked the button to add it to the index. This worked, and then the next day... or maybe two days later, it was gone. I did this a couple of times...
It didn't hit the index and stick for two weeks, but since then everything has been fine.
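For context, a subdomain blocked the way ours was usually just has a blanket disallow in its robots.txt, along these lines (hypothetical example):

    User-agent: *
    Disallow: /

That rule has to be removed (or narrowed) before the fetch-and-submit approach can stick, since Googlebot otherwise can't recrawl the pages at all.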
-
One of my competitors had a designer put a new look on their website. As soon as they uploaded it, we went to the site to sniff the code. We saw that the developer had left the "noindex" on all of the files. We laughed and laughed about that. Within a few days their entire site dropped out of search, and it took them a couple of weeks to figure out what had happened while we enjoyed a big increase in sales. But when they uploaded the site with the noindex removed, the pages were mostly back in search within a few days, and two weeks later they were back to normal.
The amount of time required is influenced by the amount of spider activity the site receives. If your site has low PageRank and does not get a lot of spider activity, you can go much longer without being reindexed. Deep pages on a site without much spider activity can take weeks to come back. The site in the example above is a PR6 site with mostly PR3 and PR4 pages.