Homepage deindexed by Google
-
I just noticed that my homepage was deindexed by Google.
Any thoughts would be appreciated.
-
No problem, let me know how it goes.
-
It does make sense. Thank you for taking the time to reply, Dmitrii. Much appreciated.
-
Hi there.
I'm not seeing any messed-up code or headers or anything like that. As Gaston said, it might be the after-effects of a broken plugin or something.
At this point, I'd recommend this: do a manual Fetch as Google request and see if the page comes up. If not, disable all the plugins, fetch again, and see if it comes back; then re-enable the plugins, fetch once more, and see what happens. That should help you isolate the cause.
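Before disabling anything, it's also worth confirming whether the homepage is actually sending a noindex signal, either in an X-Robots-Tag response header or a robots meta tag. A minimal sketch of that check (a simple regex scan, not a full HTML parser; fetch the page and its headers however you prefer):

```python
import re

def has_noindex(html: str, headers: dict) -> bool:
    """Return True if the page signals noindex via header or meta tag."""
    # Check the X-Robots-Tag response header (value compared case-insensitively)
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Check for <meta name="robots" content="...noindex...">
    pattern = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        re.IGNORECASE,
    )
    return bool(pattern.search(html))

# Example: a page that blocks indexing via a meta tag
page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(has_noindex(page, {}))  # True
```

If this returns True for your homepage, you've found the culprit before touching a single plugin.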
I recommend doing these steps on a development server: replicate your website on something like a subdomain URL, so you can do all the plugin disabling and run all the fetches without affecting the currently ranking site. Also, if you replicate to the dev server and fetch right away, you'll know whether Google is simply still holding a cached, noindexed copy of your homepage from the earlier mess.
Finally, make sure the lastmod values in your sitemap are as recent as possible; otherwise Google may not refresh the page's index entry, since a page that hasn't been modified since the last crawl is treated as unchanged.
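For reference, a sitemap entry with a fresh lastmod might look like this (the URL and date are purely illustrative):

```xml
<url>
  <loc>https://www.example.com/</loc>
  <lastmod>2018-01-15</lastmod>
</url>
```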
Hope this makes sense.
-
Thank you so much for the reply, Gaston. Much appreciated. At this point, I'm more concerned about the rankings than the SSL. I didn't realize (obviously) that all that redirection was required beyond just the main page, which is how I got myself into this situation. Never straightforward, it seems. Thanks again.
-
Hello floretweddings,
Don't panic: what you are seeing is the result of Google trying to understand what your new site is about (the HTTPS version is a completely new site to Google).
Sadly, you've sent mixed signals to Google: first redirecting to HTTPS, then undoing that redirect.
At the time of writing this answer, the page is indexed; check it yourself using the site: operator. Keep in mind that right now both versions are live, the HTTP one and the HTTPS one. Set up the redirection (via a plugin, a hosting rule, or the .htaccess file), or use a canonical tag to point every page to its HTTPS version.
Take a look at these three articles; they might help you:
The Big List of SEO Tips and Tricks for Using HTTPS on Your Website - Moz Blog
The HTTP to HTTPs Migration Checklist in Google Docs to Share, Copy & Download - AleydaSolis
Google SEO HTTPS Migration Checklist - SERoundtable
Hope I've helped.
Best of luck.
GR.
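To make the .htaccess option above concrete: a typical force-HTTPS rule set (assuming Apache with mod_rewrite enabled; try it on a staging copy first) looks something like this:

```apache
# Redirect every HTTP request to its HTTPS equivalent with a permanent 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

Alternatively (or additionally), a self-referencing canonical on each page, e.g. `<link rel="canonical" href="https://www.example.com/page/">` with a hypothetical URL, tells Google which version to index.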
Related Questions
-
Google indexing is slowing down?
I have up to 20 million unique pages, and so far I've only submitted about 30k of them in my sitemap. We had a few load-related errors during Google's initial visits, and it thought some pages were duplicates, but we fixed all that. We haven't gotten a crawl-related error for two weeks now. Google appears to be indexing fewer and fewer URLs every time it visits. Any ideas why? I'm not sure how to get all our pages indexed if it's going to operate like this... I'd love some help, thanks!
Technical SEO | RyanTheMoz
-
Get List Of All Indexed Google Pages
I know how to run site:domain.com, but I am looking for software that will put these results into a list and return server status codes (200, 404, etc.). Anyone have any tips?
Technical SEO | InfinityTechnologySolutions
-
How do I deindex url parameters
Google indexed a bunch of our URL parameters. I'm worried about duplicate content. I used the URL parameter tool in Webmaster Tools to set it so future parameters don't get indexed. What can I do to remove the ones that have already been indexed? For example, site.com/products and site.com/products?campaign=email have both been indexed as separate pages even though they are the same page. If I use a noindex, I'm worried about deindexing the product page. What can I do to deindex just the URL parameter version? Thank you!
Technical SEO | BT2009
-
My Google Author Pic Disappeared
My Google author picture, which had been in place for a couple of years, recently disappeared from all SERP results. I checked, and rel=author attribution is valid on every post, as is the link to the Google+ authorship page (which contains a link back to the web site). When I test URLs in the structured data testing tool, the picture appears. I'm out of troubleshooting ideas. Any suggestions welcome.
Technical SEO | waynekolenchuk
-
Closed Address Google Local
While there are some older conversations pertaining to Google Local/Plus, I am not sure if this issue is a bit different. The company I work for at one time had two locations. Both are brick-and-mortar, physical locations. The factory closed several years ago. To my surprise, the old location is coming up in a few Google searches as a Google Plus page (I actually just located it toward the end of last week). It is currently unclaimed. There are a handful of citations out on the web as well. To remove the factory listing (the one we don't want, which I am pretty sure is confusing Google), what is the best approach? Remove/update citations for the old listing, and then claim it and suspend it using our Google Places account? It took a while to claim the listing we actually want, and I just want to be sure we handle removing the old one correctly. Any insight or advice is appreciated!
Technical SEO | SEOSponge
-
Google Schema Code for Organisation
I've created the Google Schema code for an organisation. Should this go in the template HTML so it appears on all pages, or just on the home page?
Technical SEO | CharlBritton
-
CDN Being Crawled and Indexed by Google
I'm doing an SEO site audit, and I've discovered that the site uses a Content Delivery Network (CDN) that's being crawled and indexed by Google. Two sub-domains from the CDN are being crawled and indexed, and a small number of organic search visitors have come through them. So in a small number of cases, the CDN-based content is out-ranking the root domain. It's a huge duplicate content issue (tens of thousands of URLs being crawled). What's the best way to prevent the crawling and indexing of a CDN like this? Exclude it via robots.txt? Additionally, the use of relative canonical tags (instead of absolute ones) appears to be contributing to the problem: as I understand it, these canonicals tell the search engines that each sub-domain is the "home" of the content/URL. Thanks! Scott
Technical SEO | Scott-Thomas
-
Google Alerts News Images
I have a Google Alert set up which is pulling information from a blog, and I am receiving images as part of the alert. The issue is that the images have nothing to do with the blog post. Is there a way to control what images are received in the alert? From what I have gathered, if it grabs an image, it should be part of the blog post.
Technical SEO | ricknakao