Homepage is deindexed in Google
-
We recently noticed that our primary page was deindexed in Google. Looking in Google Search Console, there are no manual actions listed.
We did add a few new banners to the site, but I have no idea why that would have negatively affected it.
I also added a new page, https://enleaf.com/company/testimonials/, which contained some testimonials duplicated from the home page, but I have since removed it.
Not sure where to go from here.
-
Glad you found the culprit. I was going to mention that this was probably it. If Googlebot sees noindex anywhere on the page it can cause the whole page to be deindexed. Sometimes it's really tricky to find as well.
-
As much as I can
P.S. Please mark this question as answered.
-
I'm glad it was an easy fix. I was freaking out!! Thanks for the help.
-
Wow. Weird.
-
Nah, third-party scripts, especially Twitter ones, wouldn't affect that. Quite a strange situation.
Well, request the fetch again, and then submit to Google. That should work.
-
Ah ha. Looks like that was it. That widget was pulling from a CDN that had a noindex tag in it. Removed that widget and I'm back!
-
Googlebot type: Desktop (render requested) - Partial, on Sunday, January 29, 2017 at 9:56:00 PM PST.
I did notice a script that has a noindex. I wonder if that is part of it. I think it belongs to a social media widget.
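For anyone else hunting a stray noindex: here is a minimal first-pass check, sketched in Python (the function name and regexes are mine, not Moz's or Google's tooling), that scans fetched HTML and the response headers for the directive:

```python
import re

def find_noindex(html: str, headers: dict) -> list:
    """Return a list of the places a noindex directive was found."""
    found = []
    # The X-Robots-Tag response header can noindex the whole URL
    robots_header = headers.get("X-Robots-Tag", "")
    if "noindex" in robots_header.lower():
        found.append("X-Robots-Tag header")
    # <meta name="robots" content="... noindex ..."> anywhere in the HTML;
    # rough regex check - assumes name comes before content in the tag
    for match in re.finditer(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE,
    ):
        if "noindex" in match.group(1).lower():
            found.append("meta robots tag")
    return found

# Example: a page carrying a noindex meta tag
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(find_noindex(page, {}))  # ['meta robots tag']
```

Note that a directive injected by a third-party script, as happened in this thread, only appears in the rendered DOM, which is why Fetch and Render in GSC surfaced it here while a raw fetch of the static HTML might not.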
-
What happens when you fetch within GSC?
-
I just discovered this last night when checking our rankings. The home page changes were made late last week, so I'm not sure why they would only affect things now.
-
Howdy.
That is quite bizarre. I don't see anything wrong with the code, headers, or anything like that. When did you discover the problem? I wonder if you did something and fixed it without realizing, or Google Search Console hasn't updated the manual action yet.
Also, what happens if you do Fetch as Google on the index page?
Related Questions
-
Duplicate content homepage - Google canonical 'N/A'?
Hi, I redesigned a client's website and launched it two weeks ago. Since then, I have 301 redirected all old URLs in Google's search results to their counterparts on the new site. However, none of the new pages are appearing in the search results, and even the homepage has disappeared. Only old site links are appearing (even though the old website has been taken down), and GSC states: "Page is not indexed: Duplicate, Google chose different canonical than user." However, when I try to understand how to fix the issue and see which URL it is claiming to be a duplicate of, it says: "Google-selected canonical: N/A." It says the last crawl was only yesterday - how can I possibly fix it without knowing which page it says it's a duplicate of? Is this something that just takes time, or is it permanent? I would understand if it was just Google taking time to crawl and index the pages, but it seems adamant it's not going to show any of them at all.
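As a rough way to audit the 301s described above (a sketch assuming Python; the helper names and example URLs are mine, for illustration), one can request each old URL without following redirects and confirm it answers with a single 301 pointing at the expected new counterpart:

```python
import urllib.request
import urllib.error

def redirect_problem(status, location, expected):
    """Classify one redirect hop; None means a clean 301 to the expected URL."""
    if status != 301:
        return f"expected 301, got {status}"
    if location != expected:
        return f"301 points at {location!r}, not {expected!r}"
    return None

class _NoFollow(urllib.request.HTTPRedirectHandler):
    # Returning None stops urllib from following the redirect, so the
    # 3xx response surfaces as an HTTPError we can inspect directly.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def audit_redirects(url_map):
    """url_map: {old_url: expected_new_url}. Returns [(old_url, problem), ...]."""
    opener = urllib.request.build_opener(_NoFollow())
    problems = []
    for old, expected in url_map.items():
        try:
            resp = opener.open(urllib.request.Request(old, method="HEAD"))
            status, location = resp.status, ""
        except urllib.error.HTTPError as err:
            status, location = err.code, err.headers.get("Location", "")
        problem = redirect_problem(status, location, expected)
        if problem:
            problems.append((old, problem))
    return problems
```

A 302 instead of a 301, a redirect chain, or a hop to the wrong counterpart can all feed into Google choosing a different canonical than the one you intend.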
Technical SEO | goliath910
-
Canonical error from Google
Moz couldn't explain this properly, and I don't understand how to fix it. Google emailed this morning saying "Alternate page with proper canonical tag." Moz also complains that the main URL and the main URL/index.html are duplicates. Of course they are; the main URL doesn't work without the index.html page. What am I missing? How can I eliminate this duplicate problem, which to me isn't a problem?
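The usual fix for the / versus /index.html pair is to declare one canonical on both variants (or 301 /index.html to /). A small sketch, using hypothetical URLs and a deliberately rough regex, of confirming both variants declare the same canonical:

```python
import re

def canonical_of(html):
    """Pull the href out of a rel="canonical" link tag, or None if absent.
    Rough check - assumes rel comes before href inside the tag."""
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
        html, re.IGNORECASE,
    )
    return match.group(1) if match else None

# Both URL variants declaring the same canonical tells Google they are one page
home = '<head><link rel="canonical" href="https://example.com/"></head>'
index = '<head><link rel="canonical" href="https://example.com/"></head>'
print(canonical_of(home) == canonical_of(index))  # True
```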
Technical SEO | RVForce0
-
Google My Business Service Area Question
Hello Moz friends, I just wanted to make sure I'm doing things correctly. In Google My Business you're given the option to list your service area. I serve the entire state of Colorado with my internet marketing services, so I listed Colorado as my service area. But Moz friends, is this the wrong idea? Should I instead list the major cities and call it good? So rather than a service area of Colorado, I would put Denver, Colorado Springs, Pueblo, etc. Thank you for your friendly help. Chris
Technical SEO | asbchris0
-
Google Webmaster Tools Sitelinks Demotions
Does anyone know if the sitelinks demotion tool actually works? I went in and demoted a bunch of pages on my site, but the pages are still showing up as sitelinks in search results. I did this about two months ago, so I assume that is plenty of time for it to take effect. Any help with this would be great. Thanks!
Technical SEO | ZiaTG0
-
Google ranking my site abroad, how to stop?
Hi Mozzers, I have a UK-based ecommerce site that sells only to the UK. Over the last month Google has started ranking my site on foreign flavours of Google, so I keep getting traffic from Europe, America, and the Far East that we could never sell to; as a result, bounce is going up and engagement is going down. Visitors are definitely coming to the site from Google searches related to my product type, but in regions I do not service. Is there a way to stop Google doing this? I have the target set to the UK in WMT, but is there anything else I can do? I'm worried about my UK ranking being damaged by an increasing overall bounce rate. Thanks
Technical SEO | FDFPres0
-
Google having trouble accessing my site
Hi, Google is having a problem accessing my site. Each day it is bringing up access denied errors, and when I checked what this means I found the following:

Access denied errors
In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you're seeing unexpected Access Denied errors, it may be for the following reasons:
Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: You can get around this by removing this requirement for the user-agent Googlebot.)
Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories. Test that your robots.txt is working as expected. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot. (How to verify that a user-agent really is Googlebot.) The Fetch as Google tool helps you understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.
Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.

Now I have contacted my hosting company, who said there is not a problem, but told me to read the following page: http://www.tmdhosting.com/kb/technical-questions/other/robots-txt-file-to-improve-the-way-search-bots-crawl/ I have read it, and as far as I can see my file is set up right; it is listed below. They said if I still have problems then I need to contact Google. Can anyone please give me advice on what to do? The errors are response code 403.

User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
Technical SEO | ClaireH-184886
-
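Two notes on the question above. First, a robots.txt disallow would normally be reported as "blocked by robots.txt", not as a 403 response; a 403 usually means the server itself, or a security module at the host, is rejecting Googlebot's requests. Second, the file can be sanity-checked with Python's built-in parser; this sketch feeds it the question's own rules against an example hostname of my choosing:

```python
import urllib.robotparser

# The robots.txt from the question, verbatim
robots_txt = """\
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# The homepage is crawlable; only the listed directories are blocked
print(parser.can_fetch("Googlebot", "https://example.com/"))                # True
print(parser.can_fetch("Googlebot", "https://example.com/administrator/"))  # False
```

Since the file does not block the whole site, the 403s are worth raising with the host again.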
Where does Google pull the date stamp?
We're a news media site with content that has been live for a few years. All of a sudden, Google is showing our content (even though no one has touched the files) with a date stamp of '3 days ago', even for content that is years old. I checked the date it was last cached, and it doesn't even match: the URLs were last cached on January 16, but the date stamp says '3 days ago'. From where does Google pull the date stamp? Any ideas?
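Google infers article dates from several signals (the visible byline, structured data such as the article:published_time meta tag or a JSON-LD datePublished field, sitemap lastmod), and it can guess wrong when those are missing or conflict. A sketch, with rough regexes and an example page of my own, of listing the machine-readable dates a page actually declares:

```python
import json
import re

def declared_dates(html):
    """Collect the machine-readable publish dates a page declares."""
    dates = {}
    # Open Graph / article meta tag (assumes property comes before content)
    meta = re.search(
        r'<meta[^>]+property=["\']article:published_time["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE,
    )
    if meta:
        dates["article:published_time"] = meta.group(1)
    # JSON-LD structured data blocks
    for block in re.findall(
        r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        html, re.IGNORECASE | re.DOTALL,
    ):
        try:
            data = json.loads(block)
        except ValueError:
            continue
        if isinstance(data, dict) and "datePublished" in data:
            dates["json-ld datePublished"] = data["datePublished"]
    return dates

page = """<head>
<meta property="article:published_time" content="2014-06-02T09:00:00Z">
<script type="application/ld+json">{"@type": "NewsArticle", "datePublished": "2014-06-02"}</script>
</head>"""
print(declared_dates(page))
```

If a site declares no explicit dates, Google falls back to guessing from page text and crawl behaviour, which is one plausible source of a bogus "3 days ago" stamp.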
Technical SEO | Aggie0
-
About Google Spider
Hello, people! I have some questions about the Google spider. Many people say that "Google spiders only have US IP addresses." Is this really true? But I also saw a video from Google's official blog that said "Google spiders come from all around the world." At this point I am really confused.
Q1) I researched this, and it seems like Google spiders have only US IP addresses. Then what exactly does "Google spiders come from all around the world" mean?
Q2) If Google spiders have only US IP addresses, what happens to a site that uses IP delivery? Does this mean the Google spider is always served the US site, since they only have US IPs? Can anyone help me understand?
One more question! When Google analyzes for cloaking issues, do you think Google analyzes when the spider crawls the site, or after it has crawled the site?
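Related to Q1: for verifying whether a visitor claiming to be Googlebot really is one, Google's documented method is a reverse-DNS plus forward-confirm check rather than a fixed IP list. A sketch of that check in Python (function names are mine):

```python
import socket

def hostname_is_google(host: str) -> bool:
    """True if the reverse-DNS hostname sits under Google's crawler domains."""
    return host.endswith((".googlebot.com", ".google.com"))

def is_real_googlebot(ip: str) -> bool:
    """Google's documented two-step check: reverse-DNS the IP, make sure the
    hostname is under googlebot.com / google.com, then forward-resolve that
    hostname and confirm it maps back to the original IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)
    except OSError:
        return False
    if not hostname_is_google(host):
        return False
    try:
        _, _, forward_ips = socket.gethostbyname_ex(host)
    except OSError:
        return False
    return ip in forward_ips
```

The forward-confirm step matters because the hostname suffix alone can be spoofed in a log line; only the round trip (IP to hostname to the same IP) proves the crawler is genuine.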
Technical SEO | Artience0