Page disappears from Google search results
-
Hi, I recently encountered a very strange problem.
One of the pages I published on my website ranked very well for a couple of days (top 5), then the page completely vanished. No matter how directly I search for it, it does not appear in the results. I checked GSC and everything seems normal, but when checking Google Analytics I find it strange that there has been no data for the page since it disappeared, and it also does not show up in the 'active pages' section, no matter how many different computers I keep it open on. I have checked as far as page 9 of the results and used a couple of keyword tools, and it appears nowhere!
It didn't have any backlinks, but it was unique and high quality. I have checked that the page still exists and is still readable. Has this happened to anyone before?
Any thoughts would be gratefully received.
-
@joelssonmedia said in Page disappears from Google search results:
I've had quite a lot of similar cases. The reason it happens is that Google crawls and indexes pages in several phases. The first phase is URL discovery, where it simply finds a new URL via links on your site or your sitemap.
The second phase is indexing and first-pass deduplication. If your page is of good quality (in Google's eyes) and is not a duplicate of another page on your site, it usually gets indexed. If your page does not pass this phase, it usually shows up in the "Crawled - currently not indexed" report.
The third and later phases involve further deduplication algorithms, and very few people know exactly what's "under the hood". One thing I have found is that if a page does not have enough incoming links from your own site, and Googlebot cannot discover it from your homepage, it usually won't get indexed. So my recommendations are:
1. Check whether your page has incoming links from your site and can be discovered from the homepage. If not, add a link to it from your homepage or from one of your first-level pages and see if that resolves the issue.
2. Check whether your page is accessible with JavaScript turned off (you can do this in your browser settings or with one of the Chrome plugins). In many cases you may find that the content appears only partially, or is even completely blank. If so, speak to your developers to make sure the page renders fully with JavaScript turned off. (There's a rough sketch of checks 1 and 2 after this list.)
3. If nothing works, try changing the top paragraphs of your content and your titles/headers, and change the URL slightly, while making sure there are no issues with points (1) and (2). It may be that the search bot received a server error during the crawl; in that case you can either wait for the next crawl (I don't know how often Googlebot crawls your website; it usually depends on how high your site's Domain Authority is) or resubmit the page, as I suggested.
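If it helps, here is a rough Python sketch of checks (1) and (2). The homepage URL, target URL, and key phrase are placeholders you would swap for your own, and it only looks at the raw HTML (no JavaScript rendering), so treat it as a starting point rather than a definitive test:

```python
import re
from urllib.parse import urljoin, urlparse

import requests  # pip install requests

HOMEPAGE = "https://www.example.com/"                      # placeholder: your homepage
TARGET_URL = "https://www.example.com/my-vanished-page/"   # placeholder: the missing page
KEY_PHRASE = "a sentence that only appears on that page"   # placeholder

def internal_links(url):
    """Fetch a page and return the same-host links found in its raw HTML."""
    html = requests.get(url, timeout=10).text
    host = urlparse(url).netloc
    links = set()
    for href in re.findall(r'href=["\'](.*?)["\']', html):
        absolute = urljoin(url, href).split("#")[0]
        if urlparse(absolute).netloc == host:
            links.add(absolute)
    return links

# Check 1: is the target URL reachable within two clicks of the homepage?
# (Exact string comparison, so mind trailing slashes.)
first_level = internal_links(HOMEPAGE)
reachable = TARGET_URL in first_level or any(
    TARGET_URL in internal_links(page) for page in list(first_level)[:50]
)
print("Reachable from homepage within two clicks:", reachable)

# Check 2: does the key content exist in the raw (non-JavaScript) HTML?
raw_html = requests.get(TARGET_URL, timeout=10).text
print("Key phrase present without JavaScript:", KEY_PHRASE.lower() in raw_html.lower())
```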
Wishing you luck! These indexing issues are always the trickiest to figure out. If nothing works, feel free to post some more details and we'll try to figure it out.
-
-
@joelssonmedia Hi Joe, I'd be happy to take a look at it, but I would need to see what you're doing; to be honest, without that I can only guess that it could be something like your /robots.txt, for example:
Robots.txt file URL: www.example.com/robots.txt
User-agent: *
Disallow: /
= Blocking all web crawlers from all content
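If you want to check this quickly, Python's standard library ships a robots.txt parser you can point at your own file; a small sketch (the URLs below are placeholders):

```python
# Quick robots.txt check using only the Python standard library.
# Both URLs are placeholders: swap in your own robots.txt and page URL.
from urllib.robotparser import RobotFileParser

robots_url = "https://www.example.com/robots.txt"
page_url = "https://www.example.com/my-vanished-page/"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # downloads and parses the robots.txt file

# can_fetch() returns False when the given user agent is disallowed from the URL.
print("Googlebot allowed:", parser.can_fetch("Googlebot", page_url))
print("All crawlers allowed:", parser.can_fetch("*", page_url))
```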
https://moz.com/learn/seo/robotstxt
Hope this helps,
Tom
Related Questions
-
Google page speed testing failures
When running our domain through Google's PageSpeed Insights I'm getting the error message 'An error occurred while fetching or analyzing the page' at around 65% load. I'm concerned it's affecting our organic rankings. The domain is https://www.scottscastles.com/ When testing in https://testmysite.withgoogle.com/ it also fails at around 70% with the message 'It's taking longer than expected. You can leave this tab open and check back in a little while. We'll have your results soon.' but the results never come. I've tried testing on a few different speed testing sites without failures (https://tools.pingdom.com, https://gtmetrix.com, https://www.webpagetest.org and a few others). We're stumped, as everything appears correct and was working but now isn't. Is this Google or us, or a combination of the two? Any help greatly appreciated!
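One way to see whether the failure reproduces outside the web UI is to call the PageSpeed Insights v5 API directly. A rough Python sketch (the endpoint is the public PSI API; the response field names are my best recollection, so treat the parsing as an assumption):

```python
# Rough sketch: query the PageSpeed Insights v5 API for the failing domain.
# Light usage generally works without an API key.
import requests  # pip install requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.scottscastles.com/", "strategy": "mobile"}

response = requests.get(PSI_ENDPOINT, params=params, timeout=120)
print("HTTP status:", response.status_code)

data = response.json()
if "error" in data:
    # Fetch/analysis failures come back in an "error" object with a message.
    print("PSI error:", data["error"].get("message"))
else:
    # Lighthouse performance score is reported on a 0-1 scale.
    score = data.get("lighthouseResult", {}).get("categories", {}) \
                .get("performance", {}).get("score")
    print("Performance score:", score)
```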
Technical SEO | imaterus
-
How do I "undo" or remove a Google Search Console change of address?
I have a client that set a change of address in Google Search Console, informing Google that their preferred domain was a subdomain, and now they want Google to also consider their base domain (without the change of address). How do I get the change of address removed in Google Search Console?
Technical SEO | KatherineWatierOng
-
How do I influence what page on my site Google shows for specific search phrases?
Hi People, My client has a site, www.activeadventures.com. They provide adventure tours of New Zealand, South America and the Himalayas. These destinations are split into three folders on the site (e.g. activeadventures.com/new-zealand, activeadventures.com/south-america, etc.). The actual root folder of the site is generic information for all of the destinations, whilst the destination-specific folders contain information specific to the destination in question. The problem: if you search for, say, "Active New Zealand" or "Adventure Tours South America", the result that comes up is the activeadventures.com homepage rather than the destination folder homepage (e.g. we would want activeadventures.com/new-zealand to be the landing page for people searching for "active new zealand"). Are there any ways to influence Google as to which page on our site it chooses to serve up? Many thanks in advance. Conrad
Technical SEO | activenz
-
Not showing the right results in Google.de
Hi Moz, I have a question concerning Vintykids.com. The site comes in four languages:
German on vintykids.com/de
Dutch on vintykids.com/nl
English on vintykids.com/en
French on vintykids.com/fr
The German language version gives us a problem. In Google.de (German Google) the site is completely indexed in German, but we also see results in Dutch. So when you do a search in Google.de on their brand name (Vintykids), we see Dutch results on vintykids.com. We think we have set the metas right in the German language version on vintykids.com/de, and we have also managed some links to Vintykids.com/de from good-quality and relevant German sites. What more can we do to get vintykids.com/de ranking in Google.de for the brand name? Thank you.
Technical SEO | B.Great
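For context, hreflang annotations are the standard way to tell Google which language version of a page targets which audience, and each version should reference all the others. A small sketch that just prints the link tags each language version's head section would carry (the URLs are taken from the question; choosing /en as the x-default is an assumption):

```python
# Sketch: print the hreflang link tags each Vintykids language version would carry.
# The x-default choice (/en) is an assumption, not something stated in the question.
LANGUAGE_VERSIONS = {
    "de": "https://vintykids.com/de",
    "nl": "https://vintykids.com/nl",
    "en": "https://vintykids.com/en",
    "fr": "https://vintykids.com/fr",
}

tags = [
    f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
    for lang, url in LANGUAGE_VERSIONS.items()
]
tags.append(
    f'<link rel="alternate" hreflang="x-default" href="{LANGUAGE_VERSIONS["en"]}" />'
)

# Every language version should include the full set of tags, including its own.
print("\n".join(tags))
```
-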
I have a custom 404 page and I'm getting so many 404 errors in Google Webmaster Tools, what should I do?
I have a custom 404 page with popular post and category links on it, but every day I see new 404 crawl errors in Webmaster Tools. What should I do?
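One basic thing worth verifying with any custom 404 page is that it really returns an HTTP 404 status code rather than a 200 (a "soft 404"), since that changes how Google reports it. A quick sketch, with a made-up URL as the placeholder:

```python
# Check that a clearly non-existent URL returns a real 404, not a 200 ("soft 404").
# The URL below is a made-up placeholder on your own domain.
import requests  # pip install requests

missing_url = "https://www.example.com/this-page-should-not-exist-xyz/"
response = requests.get(missing_url, allow_redirects=True, timeout=10)

print("Final URL:", response.url)
print("Status code:", response.status_code)  # should be 404 for a missing page
if response.status_code == 200:
    print("Warning: the server answers 200 for missing pages (soft 404).")
```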
Technical SEO | rimon5693
-
How narrowly geo targeted should your Google Places page be?
Hi Mozers, I'm still struggling with my London-based client, who has two locations and one business. Basically she has a location in W1W (Westminster) and a location in WD1 (Borehamwood). Does anyone have any good resources or input concerning geotargeting? I've done some searching but can't quite find the help I'm seeking. I'd like to make the Places pages cover a 5-mile radius and be highly specific to their locations. Is this the right way to proceed? Thanks
Technical SEO | catherine-279388
-
Google Search memory
Hi, we have had the following statement from a member of our Japan office with regard to how Google displays search results; would anyone be able to give us a definitive answer on this? "Google remembers previous non-mobile-related searches. For example, we already know that we come up on the first page if you search for 'kaigai keitai' (mobile phone for use abroad) and 'UK', whereas we don't for searches where you replace the UK with the US or other countries. This means that if a customer, for example, does a search just on the UK, e.g. using words like UK travel, London, millennium dome, etc., and then does a separate search just using the words 'kaigai keitai', Google could show us as a link on the first page. However, if an individual did a search on Paris, France, Eiffel Tower, and then did a search for 'kaigai keitai', our link might not appear on the page. I don't know if we have tested this already, but Google seems to have a very long 'memory', and I could see this aspect of Google resulting in us missing significant business from people going to the US, France, Italy, etc." Any thoughts?
Technical SEO | -Al-
-
How can I tell Google that a page has not changed?
Hello, we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating far too much traffic: half of our page views are generated by Googlebot. We would like to tell Googlebot to stop crawling pages that never change. This one, for instance: http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html As you can see, there is almost no content on the page and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back. The following header fields might be relevant. Currently our webserver answers with the following headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate and Pragma: no-cache, and set Expires to, e.g., 30 days in the future? I also read that a webpage that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us. Maybe Google would then also spend more time on pages that actually changed, instead of wasting it on unchanged pages. Do you have any other suggestions for how we can reduce Googlebot's traffic on irrelevant pages? Thanks for your help, Cord
Technical SEO | bimp
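For reference on the 304 idea: a server sends 304 Not Modified when the client includes If-Modified-Since (or If-None-Match) and the resource has not changed since then, so no body is transferred. A minimal, hedged sketch of that handshake using only the Python standard library (the last-modified date and page body are stand-ins):

```python
# Minimal sketch of a conditional GET: answer 304 Not Modified when the client's
# If-Modified-Since date is not older than the page's last change.
# PAGE_LAST_MODIFIED and PAGE_BODY are stand-ins; a real site would derive them
# from its content store.
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE_LAST_MODIFIED = datetime(2012, 1, 1, tzinfo=timezone.utc)
PAGE_BODY = b"<html><body>Party pictures that never change.</body></html>"

class ConditionalGetHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        since = self.headers.get("If-Modified-Since")
        if since:
            try:
                if parsedate_to_datetime(since) >= PAGE_LAST_MODIFIED:
                    # Client's copy is still current: send headers only, no body.
                    self.send_response(304)
                    self.end_headers()
                    return
            except (TypeError, ValueError):
                pass  # unparsable date header: fall through and send the full page
        self.send_response(200)
        self.send_header("Last-Modified", format_datetime(PAGE_LAST_MODIFIED, usegmt=True))
        self.send_header("Cache-Control", "public, max-age=2592000")  # 30 days
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE_BODY)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ConditionalGetHandler).serve_forever()
```

Whether Googlebot actually crawls less because of this is not guaranteed, but 304 responses at least keep the transferred bytes small for unchanged pages.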