Home Page Deindexed Only at Google after Recovering from Hack Attack
-
Hello, I'm facing a strange issue. My WordPress blog hghscience[dot]com was hacked. When I checked, I found the index.php file had been changed so that every page showed a "hacked" message, and an index.html file had been added to the cPanel account. Once I found it, I replaced index.php with the default WordPress index.php and deleted index.html. I could not find any other file that looked suspicious.
The site started working fine and was still indexed, but the cached version was the hacked page. I used Webmaster Tools to fetch and render it as Googlebot and submitted it for indexing. After that, I noticed the home page got deindexed by Google; all other pages are indexing like before. The site was hacked around 30th July and I fixed it on 1st Aug. Since then the home page is not getting indexed; I have tried fetching and submitting it multiple times via Google Webmaster Tools, but no luck so far.
One more thing I noticed: when I use info:mysite.com on Google from India, it shows some other hacked site (www.whatsmyreferer.com/), but the same info:mysite.com searched from the US shows a different hacked site (sigaretamogilev.by). However, when I search "mysite.com", my home page appears in Google search, yet its cached URL shows the hacked sites mentioned above.
As far as I can tell, I have checked all SEO plugins and the home page code and can't find anything that would stop the home page from being indexed. PS: Webmaster Tools has received no warnings about a penalty or malware.
I also noticed I had disallowed the index.php file via robots.txt earlier, but I have since removed that rule.
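For reference, the kind of sweep I did for leftover injected files can be sketched in Python. This is an illustrative heuristic, not a real malware scanner; the path and signature list are assumptions, and legitimate plugin code can also match these patterns, so expect false positives:

```python
# Illustrative post-hack sweep (not a real malware scanner): flag .php files
# that contain obfuscation primitives often seen in injected code.
# WP_ROOT and SIGNATURES are assumptions; legitimate code can match too.
import os
import re

WP_ROOT = "."  # point this at the WordPress install root
SIGNATURES = re.compile(rb"eval\s*\(|base64_decode\s*\(|gzinflate\s*\(")

def suspicious_php_files(root):
    """Yield paths of .php files under `root` matching a known-bad signature."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(".php"):
                path = os.path.join(dirpath, name)
                with open(path, "rb") as fh:
                    if SIGNATURES.search(fh.read()):
                        yield path

for path in suspicious_php_files(WP_ROOT):
    print("check:", path)
```

Anything it flags still needs a manual look, since core WordPress and many plugins legitimately use these functions.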
-
The .htaccess file has nothing but:
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
Installed Plugins
Yoast SEO, Google XML Sitemaps, Akismet, Udinra All Image Sitemap, Social Share Bar (Digg Digg Alternative), Jetpack by WordPress.com, Author hReview.
Apart from Yoast, none of these seems able to block the site from being indexed, and the Yoast settings are fine: I have only disabled indexing of tags, subpages, and the author archive.
The problem is something else, I guess.
-
Hi Ankit,
Though I have already checked the pages you're serving to bots, could you please have a look at your .htaccess file once more? Does it contain something like:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (google|yahoo) [OR]
RewriteCond %{HTTP_REFERER} (google|aol|yahoo)
Do you have a copy of your code in GitHub, Bitbucket, or any other source code management tool? If yes, please scan the last few commits thoroughly.
You can make a list of recently installed plugins, remove them one by one, and submit your home page URL to GWT to fetch a fresh copy each time. I'm not sure what the issue is here; let's do some trial and error to dig a bit deeper.
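To test for the kind of user-agent cloaking those rewrite rules would produce, one rough approach is to request the page with two different User-Agent headers and compare the responses. A minimal Python sketch, assuming the URL is a placeholder (the real fetches are commented out because they need network access):

```python
# Rough user-agent cloaking check: fetch the same URL as a normal browser
# and as Googlebot, then compare the bodies. The URL below is a placeholder.
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    """Return the raw response body for `url` using the given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def looks_cloaked(browser_body, bot_body):
    """Crude signal: the server returned different content to the bot."""
    return browser_body != bot_body

# Example run (requires network; substitute the real home page URL):
# print(looks_cloaked(fetch("https://example.com/", BROWSER_UA),
#                     fetch("https://example.com/", GOOGLEBOT_UA)))
```

Note that some cloaking keys off the requesting IP rather than the User-Agent, so a clean result here doesn't fully rule it out; GWT's fetch-and-render is the stronger test.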
-
Hey Alan,
Do let me know if you find some solution or identify the problem.
-
That's just it: I'm not able to find any good information on what the next step should be. But I'm still checking random things in the hope of finding something.
-
The DomainTools domain report shows no further info that could be helpful, leaving me at a complete loss as to what else to check.
-
More info.
Because Nitin was able to run a ping and traceroute without problems, I went to DomainTools.com, the world's leading resource for forensic digital investigative research. I use it whenever I am doing investigations for my expert witness work.
When I ran the domain there, it showed a screen capture of the home page from June. So I submitted a refresh, and it came back unable to provide a screenshot of the home page.
While not a smoking gun, it further clouds my trust in whether the domain is actually functioning properly in the hosting environment, as I originally suspected it might not be.
I will run a deeper test to see if I can get more information, however I wanted to post this update because I believe it relevant.
-
Well, this is probably one of the most interesting issues an SEO can come across. Google is showing different cached versions in different countries. That's strange to me too. Is that usual?
-
Nitin
Thanks for doing that. Now I'm stumped: I've never had Pingdom fail before with both ping and traceroute. And I now wonder if it's a non-issue, or somehow part of the confused mess that Ankit referenced.
-
That's right, it's showing different cached versions in different countries. I just checked from the US here. Screenshot attached.
-
I don't think disallowing index.php was the issue. I took the suggestion and removed it, but many sites disallow index.php via robots.txt to avoid a duplicate-content issue between site.com and site.com/index.php.
here is an example - http://www.shoutmeloud.com/robots.txt
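For completeness, whether a cleaned-up robots.txt still blocks the home page or index.php can be sanity-checked with Python's built-in robots.txt parser. The rules string below is an assumed minimal example in that style, not my actual file:

```python
# Sanity check with the stdlib parser that a robots.txt without the old
# "Disallow: /index.php" line no longer blocks the front controller.
# The rules string is an assumed example, not the site's actual file.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /wp-admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/"))           # home page
print(rp.can_fetch("Googlebot", "https://example.com/index.php"))  # front controller
```

Both calls should come back allowed once the Disallow line is gone, which matches what GWT's robots.txt tester reports.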
Still, I removed it about 10-12 days ago, fetched and submitted the page for indexing, and also put in a rendering request.
Attaching a current screenshot of the last rendering request.
I think it's some other issue. What's your view on info:site.com showing other hacked sites? How is this happening, and why do the sites keep changing? It's different in India and different in the US.
-
Ping and traceroute worked for me when I tried using my terminal (screenshot is attached).
Well, I agree that the problem is actually bigger. If you look at its cached version on Google, it was last cached on 16th Aug, i.e. after the index.php/index.html issue was fixed by the admin (another screenshot attached).
I tried to view this page as Googlebot as well and couldn't find the issue (I wanted to check it for cloaking, too).
-
UPDATE TO MY ORIGINAL COMMENT
I initially found a problem running ping and traceroute tests via Pingdom.com: both returned an "invalid host name" error, something I had not previously seen for ping and traceroute simultaneously.
Nitin (see his comment below) ran a similar test locally and found both to be okay, though he has other thoughts.
I just want to clarify that my original finding may not be a key to this issue, although I would still like to understand why my test came back that way...
-
You said you removed index.php from the robots.txt. When did that happen? After a removal, it usually takes some time for the page to get back into the index (the crawler needs to recrawl the website accordingly).
My advice is to resubmit your robots.txt and an updated sitemap.xml to the Webmaster console and wait for the next crawl; this should fix it.
Hope this helps!
-
Just sent the screenshot. Nothing has helped so far. It's quite strange that info:domain.com is now showing some other hacked URL. Screenshot attached.
-
It was quite strange for me as well. I've just attached a screenshot taken after fetching one more time.
One more thing I noticed: info:mysite.com is now showing some other hacked domain. Not sure how or why this is happening.
Sorry for the delay in replying; I was not getting email updates, so I thought no one had answered my question.
-
Hi Ankit! Did Nitin's suggestions help at all? And are you able to share the screenshot he asked for?
-
Check the following; maybe they'll help you resolve the issue:
https://moz.com/community/q/de-indexed-homepage-in-google-very-confusing
https://moz.com/community/q/site-de-indexed-except-for-homepage
-
That's really strange. Could you please share a screenshot of what you see when you fetch it as Google in GWT?
Related Questions
-
My WP website got attack by malware & now my website site:www.example.ca shows about 43000 indexed page in google.
Hi All My wordpress website got attack by malware last week. It affected my index page in google badly. my typical site:example.ca shows about 130 indexed pages on google. Now it shows about 43000 indexed pages. I had my server company tech support scan my site and clean the malware yesterday. But it still shows the same number of indexed page on google. Does anybody had ever experience such situation and how did you fixed it. Looking for help. Thanks FILE HIT LIST:
Technical SEO | | Chophel
{YARA}Spam_PHP_WPVCD_ContentInjection : /home/example/public_html/wp-includes/wp-tmp.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-includes/wp-vcd.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-content/themes/oceanwp.zip
{YARA}webshell_webshell_cnseay02_1 : /home/example2/public_html/content.php
{YARA}eval_post : /home/example2/public_html/wp-includes/63292236.php
{YARA}webshell_webshell_cnseay02_1 : /home/example3/public_html/content.php
{YARA}eval_post : /home/example4/public_html/wp-admin/28855846.php
{HEX}php.generic.malware.442 : /home/example5/public_html/wp-22.php
{HEX}php.generic.cav7.421 : /home/example5/public_html/SEUN.php
{HEX}php.generic.malware.442 : /home/example5/public_html/Webhook.php0 -
Should you use google url remover if older indexed pages are still being kept?
Hello, A client recently did a redesign a few months ago, resulting in 700 pages being reduced to 60, mostly due to panda penalty and just low interest in products on those pages. Now google is still indexing a good number of them ( around 650 ) when we only have 70 on our sitemap. Thing is google indexes our site on average now for 115 urls when we only have 60 urls that need indexing and only 70 on our sitemap. I would of thought these urls would be crawled and not found, but is taking a very long period of time. Our rankings haven't recovered as much as we'd hope, and we believe that the indexed older pages are causes this. Would you agree and also would you think removing those old urls via the remover tool would be best option? It would mean using the url remover tool for 650 pages. Thank you in advance
Technical SEO | | Deacyde0 -
Will Google still ignore the second instance of anchor text on a page if it has an H2 tag on it?
We have a page set up that has anchor text with header tags. There is an instance where the same anchor text is on the page twice linking to the same page, and I know that Google will ignore the second instance. But in the second instance it also had an H2 tag (which I removed and put it on the first instance of anchor text even though it's smaller). Is this good practice?
Technical SEO | | AliMac260 -
How come only 2 pages of my 16 page infographic are being crawled by Moz?
Our Infographic titled "What Is Coaching" was officially launched 5 weeks ago. http://whatiscoaching.erickson.edu/ We set up campaigns in Moz & Google Analytics to track its performance. Moz is reporting No organic traffic and is only crawling 2 of the 16 pages we created. (see first and third attachments) Google Analytics is seeing hundreds of some very strange random pages (see second attachment) Both campaigns are tracking the url above. We have no idea where we've gone wrong. Please help!! 16_pages_seen_in_wordpress.png how_google_analytics_sees_pages.png what_moz_sees.png
Technical SEO | | EricksonCoaching0 -
Duplicate Version of Home Page Causing Problems?
Hello, I have a .php based site and i'm curious if how we split traffic is negatively affecting our rankings. Currently, if you visit Lipozene.com you are split 50/50 between two pages, indexa.php and indexb.php. These have identical content right now, and i'm curious if this has negatively affected our rankings. We've dropped off the SERPs for our brand term "lipozene" even though we are the official site and own www.lipozene.com . Any thoughts are greatly appreciated.
Technical SEO | | lipoweb0 -
What to do when you want the category page and landing page to be the same thing?
I'm working on structuring some of my content better and I have a dilemma. I'm using wordpress and I have a main category called "Therapy." Under therapy I want to have a few sub categories such as "physical therapy" "speech therapy" "occupational therapy" to separate the content. The url would end up being mysite/speech-therapy. However, those are also phrases I want to create a landing page for. So I'd like to have a page like mysite.com/speech-therapy that I could optimize and help people looking for those terms find some of the most helpful content on our site for those certain words. I know I can't have 2 urls that are the same, but I'm hoping someone can give me some feedback on the best way to about this. Thanks.
Technical SEO | | NoahsDad0 -
NoIndex/NoFollow pages showing up when doing a Google search using "Site:" parameter
We recently launched a beta version of our new website in a subdomain of our existing site. The existing site is www.fonts.com with the beta living at new.fonts.com. We do not want Google to crawl the new site until it's out of beta so we have added the following on all pages: However, one of our team members noticed that google is displaying results from new.fonts.com when doing an "site:new.fonts.com" search (see attached screenshot). Is it possible that Google is indexing the content despite the noindex, nofollow tags? We have double checked the syntax and it seems correct except the trailing "/". I know Google still crawls noindexed pages, however, the fact that they're showing up in search results using the site search syntax is unsettling. Any thoughts would be appreciated! DyWRP.png
Technical SEO | | ChrisRoberts-MTI0 -
Could Having Blog Posts as Home Page Cause Keyword Dilution?
Something I've never been a fan of is having a blog as the home page of a site. I've always thought that it's a bit like walking into someone's house through the kitchen out back.
Technical SEO | | WilliamBay
If it's a vistors first time, it can be a little disconcerting or ackward even if they are not familiar with the writers style. But something just dawned on me, and I'd love a second opinion on this. For websites that focus on multiple keywords (in my most of my client's case it's usually a mix of Wedding Photography, Engagement Photography, Portrait Photography, Family Photography, etc). A lot of these clients will include the photos in a blog post along with a snippet of text that may talk about the people they're photographing and maybe a bit about where they photographed. But they're usually optimizing for the overarching keyword (Wedding... Portrait..., etc as per above). Now I'm wondering if having three or 5 posts on the home page, where most of them are focusing on a specific keyword like New York Wedding Photographer, is actually diluting the keyword they are trying to rank for. My theory is that if I have them move their blog to a domain.com/blog, and solely focus on the desired keyword on the home page, that they would do substantially better in the SERPs. Can anyone subtantiate this? Thanks!0