Home Page Deindexed Only at Google after Recovering from Hack Attack
-
Hello, I'm facing a strange issue. My WordPress blog hghscience[dot]com was hacked. When I checked, I found the index.php file had been changed and was showing a page with a hacked message, and an index.html file had also been added to the cPanel account. All pages were showing the same message. When I found it, I replaced index.php with the default WordPress index.php file and deleted index.html. I could not find any other file that looked suspicious. The site started working fine and was indexed again, but the cached version was still the hacked page. I used Webmaster Tools to fetch & render it as Googlebot and submitted it for indexing. After that I noticed the home page got deindexed by Google. All other pages are indexing like before. The site was hacked around 30th July and I fixed it on 1st Aug. Since then the home page is not getting indexed; I have tried to fetch & index it multiple times via Google Webmaster Tools, but no luck so far. One more thing I noticed: when I use info:mysite.com on Google, it shows some other hacked site (www.whatsmyreferer.com/) when searching from India, but the same info:mysite.com searched from the US shows a different hacked site (sigaretamogilev.by). However, when I search "mysite.com", my home page does appear in Google search, but its cached URL shows the hacked sites mentioned above. As far as I can tell, I have checked all SEO plugins and the homepage code, and I can't find anything that would keep the homepage from being indexed. PS: Webmaster Tools has received no warnings for a penalty or malware.
I also noticed I had disallowed the index.php file via robots.txt earlier, but I have now removed that. (Screenshots attached: 7Dj1Q0w.png, 3krfp9K.png)
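Replacing index.php and deleting index.html may not be enough on its own; hacks like this often leave extra PHP backdoors behind. A rough sweep for common obfuscation signatures (a sketch only; the patterns are illustrative, not exhaustive) could look like:

```python
import os
import re

# Common obfuscation signatures found in PHP backdoors (illustrative, not exhaustive).
SUSPICIOUS = re.compile(rb"eval\s*\(\s*base64_decode|gzinflate\s*\(|str_rot13\s*\(")

def scan_for_backdoors(root):
    """Return paths of .php files whose contents match a suspicious pattern."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".php"):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    data = f.read()
            except OSError:
                continue
            if SUSPICIOUS.search(data):
                hits.append(path)
    return sorted(hits)
```

A clean sweep here doesn't prove the site is clean, but any hit is worth inspecting by hand, along with recently modified files and unfamiliar admin users.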
-
.htaccess file has nothing but
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
Installed Plugins
Yoast SEO, Google XML Sitemaps, Akismet, Udinra All Image Sitemap, Social Share Bar (Digg Digg Alternative), Jetpack by WordPress.com, AuthorH Review.
Apart from Yoast, it seems nothing could block the site, and the Yoast settings are fine: I have only disabled tag indexing, subpages, and the author archive.
I guess the problem is something else.
-
Hi Ankit,
Though I have checked the pages you're serving to bots, could you please have a look at your .htaccess file once more? Does it contain something like:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (google|yahoo) [OR]
RewriteCond %{HTTP_REFERER} (google|aol|yahoo)
Do you have a copy of your code in GitHub, Bitbucket, or any other source-control tool? If yes, please scan the last few commits thoroughly.
You can also make a list of recently installed plugins. Remove them one by one and submit your home page URL to GWT to fetch a fresh copy each time. I'm not sure what the issue is here; let's do some trial and error to dig a bit deeper.
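If rewrite rules like those were present, the server would cloak: serve one page to bots and another to visitors. One quick way to check (a sketch, not anyone's actual setup here) is to fetch the homepage twice with different User-Agent headers and compare the responses:

```python
import hashlib

def fingerprint(html: str) -> str:
    """Hash the page body so two fetches can be compared cheaply."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def looks_cloaked(browser_html: str, bot_html: str) -> bool:
    """True when the copy served to the bot differs from the browser copy."""
    return fingerprint(browser_html) != fingerprint(bot_html)

# Canned responses for illustration; a real check would fetch the URL twice,
# e.g. with curl -A "Googlebot/2.1" versus a normal browser User-Agent string.
clean = "<html><body>normal blog homepage</body></html>"
hacked = "<html><body>spam page injected for bots</body></html>"
print(looks_cloaked(clean, clean))   # False: both fetches see the same page
print(looks_cloaked(clean, hacked))  # True: the bot sees different content
```

Dynamic pages (rotating ads, nonces) will differ on every fetch, so in practice you would diff the meaningful content rather than raw hashes, but identical hashes do rule cloaking out.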
-
Hey Alan,
Do let me know if you find some solution or identify the problem.
-
That's just it. I'm not able to find any good information on what the next step should be, but I'm still checking random things in the hope of finding something.
-
The DomainTools domain report shows no further info that could be helpful, leaving me at a complete loss as to what else to check.
-
More info.
Because Nitin was able to run a ping and traceroute without problems, I went to DomainTools.com, the world's leading resource for forensic digital investigative research. I use it whenever I am doing investigations for my expert witness work.
When I ran the domain there, it had a screen capture of the home page from June. So I submitted a refresh, and it came back as unable to provide a screenshot of the home page.
While not a smoking gun, this further clouds my trust in whether the domain is actually functioning properly in the hosting environment, as I originally suspected it might not be.
I will run a deeper test to see if I can get more information; however, I wanted to post this update because I believe it is relevant.
-
Well, this is probably one of the most interesting issues an SEO can come across. Google is showing different cached versions in different countries. To me, that's strange too. Is that usual?
-
Nitin
Thanks for doing that. Now I'm stumped: I've never had Pingdom fail before with both ping and traceroute. I now wonder if it's a non-issue, or somehow part of the confused mess that Ankit described.
-
That's right, it's showing different cached versions in different countries. I just checked for the US here; screenshot attached.
-
I don't think disallowing index.php was the issue. I took the suggestion and removed it, but many sites disallow index.php via robots.txt to avoid a duplicate-content issue between site.com and site.com/index.php.
here is an example - http://www.shoutmeloud.com/robots.txt
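For what it's worth, a Disallow on /index.php blocks only that path, not the homepage itself. Python's standard robotparser can sanity-check this (example.com stands in for the real domain):

```python
from urllib import robotparser

# A minimal robots.txt like the one described, with index.php disallowed.
rules = [
    "User-agent: *",
    "Disallow: /index.php",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The homepage itself is still crawlable; only /index.php is blocked.
print(rp.can_fetch("Googlebot", "http://example.com/"))          # True
print(rp.can_fetch("Googlebot", "http://example.com/index.php")) # False
```

So that old robots.txt rule could explain a stale cache of /index.php, but not the homepage URL dropping out of the index.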
Still, I removed it about 10-12 days ago, fetched and submitted the page for indexing, and also put in a rendering request.
Attaching a current screenshot of the last rendering request.
I think it's some other issue. What's your view on info:site.com showing other hacked sites? How is this happening, and why do the sites change? It's different in India and different in the US.
-
Ping and traceroute worked for me when I tried from my terminal (screenshot attached).
Well, I agree that the problem is actually bigger. If you look at its cached version on Google, it was last cached on 16th Aug, i.e. after the index.php/index.html issue was fixed by the admin (another screenshot attached).
I tried to view this page as Googlebot as well and couldn't find the issue (I wanted to check it for cloaking too).
-
UPDATE TO MY ORIGINAL COMMENT
I initially found a problem running ping and traceroute tests via Pingdom.com: both returned an "invalid host name" error, something I have not previously seen for ping and traceroute simultaneously.
Nitin (see his comment below) ran a similar test locally and found both to be okay, though he has other thoughts.
I just want to clarify that my original finding may not be key to this issue, though I would like to understand why my test came back that way.
-
You said you removed index.php from the robots.txt. I just wanted to know when that happened, because after removal it usually takes some time to get back into the index (the crawler needs to recrawl the website).
My advice is to resubmit your robots.txt and an updated sitemap.xml to the Webmaster console and wait for the next crawl; this should be fixed.
Hope this helps!
-
Just sent the screenshot. Nothing has helped so far. It's quite strange that info:domain.com is now showing some other hacked URL. Screenshot attached.
-
It was quite strange for me as well. I just attached a screenshot after fetching one more time.
One more thing I noticed: info:mysite.com is now showing some other hacked domain. Not sure how or why it's happening.
Sorry for the delay in replying; I was not getting email updates, so I thought no one had answered my question.
-
Hi Ankit! Did Nitin's suggestions help at all? And are you able to share the screenshot he asked for?
-
Check the following; maybe they'll help you resolve the issue:
https://moz.com/community/q/de-indexed-homepage-in-google-very-confusing
https://moz.com/community/q/site-de-indexed-except-for-homepage
-
That's really strange. Could you please share a screenshot of what you see when you fetch it as Google in GWT?