Manual action due to hack
-
We have had some issues with one of our websites getting hacked. The first time it happened, we noticed it the next morning and cleaned it up before Google even realised. However, the same thing happened again over the weekend, and I came into the office to an email from Google:
Google has detected that your site has been hacked by a third party who created malicious content on some of your pages. This critical issue utilizes your site’s reputation to show potential visitors unexpected or harmful content on your site or in search results. It also lowers the quality of results for Google Search users. Therefore, we have applied a manual action to your site that will warn users of hacked content when your site appears in search results. To remove this warning, clean up the hacked content, and file a reconsideration request. After we determine that your site no longer has hacked content, we will remove this manual action.
Following are one or more example URLs where we found pages that have been compromised. Review them to gain a better sense of where this hacked content appears. The list is not exhaustive.
We have again cleaned up the website. My problem is that even though we received this email, I cannot find any evidence that the manual action was actually applied: it doesn't show in Search Console, and I am not seeing a warning in the search results when searching for our own website or clicking on its result. That means I cannot submit a reconsideration request - yet based on my test searches I am not sure a manual action was ever applied at all.
Has anyone here experienced the same issue? What do you suggest doing in this case?
Thank you very much in advance for any ideas.
-
You're welcome!
-
Thanks Joe. I will do that. Very helpful, I appreciate it!
-
I would keep an eye on organic performance for the next week or two (regularly checking the security issues/manual action reports). If you do not see a downward trend or receive another message from Google, you should be all set here.
To review organic performance, I suggest monitoring the following (there is a small API sketch after this list if you want to automate the Search Console check):
- Organic traffic (GA)
- Organic visibility trends/rankings (SEMrush, Moz rank tracker)
- Google Search Console clicks and impressions (particularly for non-branded queries)
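A minimal sketch of that Search Console check, using the Search Console API via google-api-python-client. The property URL, brand term, dates, and the `creds` object are placeholders you would supply yourself; it assumes OAuth or service-account credentials are already authorized for the property.

```python
# Minimal sketch: pull clicks/impressions for non-branded queries from the
# Search Console API. Assumes google-api-python-client is installed and that
# `creds` holds credentials authorized for the Search Console property.
from googleapiclient.discovery import build

def non_branded_performance(creds, site_url, brand_term, start_date, end_date):
    service = build("searchconsole", "v1", credentials=creds)
    body = {
        "startDate": start_date,   # e.g. "2024-05-01"
        "endDate": end_date,       # e.g. "2024-05-14"
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "query",
                "operator": "notContains",   # exclude branded queries
                "expression": brand_term,
            }]
        }],
        "rowLimit": 250,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    # Each row carries the query plus clicks, impressions, ctr, and position.
    return response.get("rows", [])
```

Run it for the weeks before and after the hack; if non-branded clicks and impressions stay flat, the warning most likely never took effect (or has already been lifted).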
Hope this all helps!
-
It must have been removed, then, although I could also not see anything in Search Console before we cleaned up the hack.
I haven't seen it affect organic performance at all, although it's hard to say: we are a B2B business and don't see as much traffic on weekends, and it's our corporate website, which doesn't get much traffic to begin with.
-
If you are not seeing anything in the manual action report, security issues report or in the SERPs, I would say that Google has detected that the hack was addressed and has removed your manual action. Is organic performance still being impacted?
-
Hi Joe,
The report just says: "Currently, we haven't detected any security issues with your site's content." That's the problem: I received the email, but in Search Console there is no evidence of any hack (although we were definitely hacked, and it has now been cleaned up).
Thanks!
-
Hello,
Did you review the Security Issues report in Google Search Console? If you have a security issue or have been hacked, this is where you will submit a review once the issue has been cleaned up. This Google Webmasters post on hacked sites and requesting a review should help.
Malware or Spam
- Open the Security Issues report in Search Console. The report will probably still show the warnings and sample infected URLs you saw before.
- If you believe that the sample URLs listed are all clean, select Request a review. In order to submit a review, we ask that you provide more information showing that the site has been cleaned of the hacker's damage. For example, for each category within Security Issues, you can write a sentence explaining how the site was cleaned (for example, "For Content injection hacked URLs, I removed the spammy content and corrected the vulnerability by updating an out-of-date plugin.").
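One extra check before hitting Request a review: injected content is frequently cloaked so that it is only served to Googlebot, which would explain why you see nothing in your own browser. Below is a minimal sketch (the sample URLs and spam keywords are placeholders, and it assumes Python with the requests library) that fetches each flagged URL with a normal browser user-agent and with a Googlebot user-agent and flags any remaining suspicious terms.

```python
# Minimal sketch: fetch each sample URL with a normal browser UA and with a
# Googlebot UA, then flag any response that still contains suspicious terms.
# The URL list and spam keywords below are placeholders for illustration.
import requests

SAMPLE_URLS = ["https://www.example.com/hacked-page/"]    # from the Security Issues report
SPAM_KEYWORDS = ["cheap pills", "casino", "payday loan"]  # terms the hack injected

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def check_urls():
    for url in SAMPLE_URLS:
        for label, ua in USER_AGENTS.items():
            resp = requests.get(url, headers={"User-Agent": ua}, timeout=10)
            hits = [kw for kw in SPAM_KEYWORDS if kw.lower() in resp.text.lower()]
            status = "clean" if not hits else "suspicious: " + ", ".join(hits)
            print(f"{url} [{label}, HTTP {resp.status_code}] -> {status}")

if __name__ == "__main__":
    check_urls()
```

If the Googlebot fetch returns different content from the browser fetch, the cleanup has likely missed a cloaking rule (often in .htaccess or a compromised template).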