De-indexed homepage in Google - very confusing.
-
A website I provide content for has just had its homepage de-indexed in Google (not in any of the other search engines) - all the other pages remained indexed as usual. The client asked me what the problem might be and I just couldn't figure it out - no link building has ever been carried out, so it has a clean backlink profile, etc. I resubmitted it and it's back in its usual place, and has maintained the rankings (and PageRank) it had before it disappeared a few days ago. I checked WMT and there are no warnings or issues there. Any idea why this might have happened?
-
Having disappeared, the homepage reappeared immediately when I resubmitted it, then fell back in the SERPs a little, and is now achieving better positions than before it disappeared. I can't see any problems with the code, the sitemap, or anything else. All in all rather confusing, but (thankfully) resolved in a couple of minutes. Thanks for your brilliant feedback, Thomas, and for your help as well, Marcus. Thanks to you, I'm well prepared should this problem hit one of my clients again.
-
Hi Marcus,
That is great news to have. Not that it's great that it happened, but great that other people have seen this happen and recovered. Considering all the things going on right now, I would say a glitch is entirely possible.
I know Google makes a change to the PageRank algorithm approximately every three months, and weird things happen around those updates. Plus, if you consider the way people actually check their PageRank, it's normally with some toolbar that's out of date.
Here is what I found - this appears to be extremely common, with no issues that would actually affect things like your SERP rank for certain keywords:
http://productforums.google.com/forum/#!topic/webmasters/ffzgDbpHh14
http://www.search9.co.uk/why-has-my-pagerank-disappeared
http://www.webworkshop.net/google_faq.html
http://www.webmasterworld.com/google/4550136-3-30.
http://www.zemanta.com/blog/find-google-algorithm-updates-make-website-search-rank-drop/
The third link down, just like Marcus explained, tells a very similar story. A very valuable contribution - thank you, Marcus; your comment inspired me to look for more data.
All the best I hope this is of help,
Thomas -
Hey, Thomas's response really nails most of the obvious causes here, but as a simple n=1 opinion: this happened to one of my clients last week. The homepage disappeared for around 48 hours. There were no problems I could detect, but it was not indexed. Weird, but just as I was scratching my chin it popped back up.
Check the obvious things, make sure there are no problems, and then leave it for 48 hours and see if it pops back up.
Not hugely scientific, but after such a big update we often see little aberrations, and this could be one of those if everything else is as it should be.
Hope that helps
Marcus -
I have heard a lot of people complain about PageRank loss during the recent Penguin 2 update. Unfortunately, that could just be Google's way of changing their own PageRank. If the page itself is unaffected rank-wise, I would not be worried. The other thing I would want your client to know is that those PageRank toolbars are notoriously wrong, so you might want to check it through a more powerful or trustworthy source. I don't really put a lot of stock in Google's PageRank, because frankly they have told us not to, and I would look more at MozRank.
If there is a problem with the robots.txt, it is very possible that Google is slow to reinstate the page after what happened.
I would still check the site with everything below; however, this is something I've already heard of from two other people, so it could just be a glitch, or Google has made changes to its PageRank algorithm.
It sounds to me as if there's either a problem with the robots.txt, or it's a simple matter of a plug-in (or whatever is being used) telling search engines not to index or follow the homepage.
You could use this tool and simply type in the URL plus /robots.txt:
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/
Then post the results or send me a private message with them included.
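For reference, the pattern that would knock a homepage out of the index is a blanket disallow. This is a hypothetical example, not necessarily what is on the client's site:

# A robots.txt like this blocks all compliant crawlers
# from the entire site, homepage included:
User-agent: *
Disallow: /

If the file only disallows specific paths (e.g. Disallow: /login/), the homepage itself is unaffected.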
In addition, view the homepage source and search (Ctrl+F if you're a PC user, Cmd+F if you're a Mac user) for "nofollow" and "noindex".
If you find either one, it will be part of a meta robots tag in the page's <head> - see http://www.robotstxt.org/meta.html for reference. A tag like the one shown below would explain the homepage dropping out of the index.
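As a hypothetical illustration of what to look for near the <title>...</title> in the source (the real tag, if present, may target a specific crawler such as "googlebot" instead of "robots"):

<head>
<title>...</title>
<!-- This tells all crawlers not to index the page or follow its links: -->
<meta name="robots" content="noindex, nofollow">
</head>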
You can then use
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/
to change them and make sure they're not on the page. You do not need them there, as Google by default indexes and follows every page unless you tell it not to. The other thing you have to look for is use of the X-Robots-Tag HTTP header, which can do the same thing, as shown below.
Example uses of the X-Robots-Tag
If you want to prevent search engines from showing files you’ve generated with PHP, add the following in the header file:
header("X-Robots-Tag: noindex", true);
This would not prevent search engines from following the links on those pages. If you want to do that as well, use the following:
header("X-Robots-Tag: noindex, nofollow", true);
But doing it in PHP is probably not the easiest way to handle this kind of thing. I myself greatly prefer setting headers in Apache when possible. Consider, for instance, preventing search engines from caching or showing a preview for all .doc files on your domain; you would only have to do something like the following:
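The Apache snippet that passage describes goes along these lines - a sketch assuming mod_headers is enabled, placed in .htaccess or the vhost config (the linked article below has the canonical examples):

<FilesMatch "\.doc$">
# Allow indexing, but forbid the cached copy and the snippet preview:
Header set X-Robots-Tag "index, noarchive, nosnippet"
</FilesMatch>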
Take a look at this to see whether your pages are being affected by one or the other:
http://yoast.com/x-robots-tag-play/#examples
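If you want to check the homepage directly, here is a small PHP sketch that fetches the response headers and looks for an X-Robots-Tag (example.com is a placeholder - substitute the client's real URL):

<?php
// Fetch the homepage's HTTP response headers as an associative array.
$headers = get_headers("http://www.example.com/", 1);
// Header names vary in case between servers, so normalize to lowercase.
$headers = array_change_key_case($headers, CASE_LOWER);
if (isset($headers["x-robots-tag"])) {
    // print_r handles the case where the header appears more than once.
    echo "X-Robots-Tag found: " . print_r($headers["x-robots-tag"], true) . "\n";
} else {
    echo "No X-Robots-Tag header on this URL.\n";
}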
I hope this is of help,
let me know if I can be of more assistance.
Thomas