Articles marked with "This site may be hacked," but I have no security issues in Search Console. What do I do?
-
There are a number of blog articles on my site that have started receiving the "This site may be hacked" warning in the SERP.
I went hunting for security issues in the Search Console, but it indicated that my site is clean. In fact, the average position of some of the articles has increased over the last few weeks while the warning has been in place.
The problem sounds very similar to this thread: https://productforums.google.com/forum/#!category-topic/webmasters/malware--hacked-sites/wmG4vEcr_l0 but that thread hasn't been touched since February. I'm fearful that the Google Form is no longer monitored.
What other steps should I take?
One query where I see the warning is "Brand Saturation" and this is the page that has the warning: http://brolik.com/blog/should-you-strive-for-brand-saturation-in-your-marketing-plan/
-
Thanks, Paul. We started resubmitting the cleaned pages yesterday. I passed your comments about the Apache install and the old version of PHP to the devs as well.
At the very least, this is a great learning experience for us. It's great to have such a helpful community.
-
It looks like the devs have cleaned up most of the obvious stuff, Matthew, so I'd get to work resubmitting the pages that were marked as hacked but no longer show that issue.
Do make sure the devs keep working on finding and cleaning up attack vectors (or just bite the bullet and pay for a year of Sucuri cleanup and protection) but it's important to get those marked pages discovered as clean before too much longer.
Also of note - your server's Apache install is quite a bit out of date, and you're running a very old version of PHP that hasn't been getting even security updates for over a year. Those potential attack vectors need to be addressed right away too.
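(In case it's useful, here's a rough Python sketch of how you can see the versions a server advertises in its response headers. It assumes the requests library is installed, and it only reports whatever the server chooses to expose in its Server and X-Powered-By headers - many servers hide these - so treat it as a quick check rather than a definitive audit.)

```python
# Quick check of the version strings a server advertises in its HTTP headers.
# Assumes the `requests` library; headers may be suppressed on hardened servers.
import requests

response = requests.get("http://brolik.com/", timeout=10)

print("Server:      ", response.headers.get("Server", "(not disclosed)"))
print("X-Powered-By:", response.headers.get("X-Powered-By", "(not disclosed)"))
```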
Good luck getting back into Big G's good graces!
Paul
P.S. An easy way to find the pages marked as hacked for checking/resubmission is a "site:" search, e.g. enter **site:brolik.com** into a Google search.
P.P.S. Also noted that you have many pages from brolik-temp.com still indexed. The domain name just expired yesterday, but the indexed pages showed a 302 redirect to the main domain, according to the Wayback Machine. These should be 301s in order to help get the pages to eventually drop out of the SERPs. (And with 301s in place, you could either submit a "Change of Address" for that domain in Webmaster Tools/GSC or do a full removal request. Either way, I wouldn't want those test-domain pages to remain in the indexes.)
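(If you want to double-check what those test-domain URLs are actually returning before deciding between a Change of Address and a removal request, here's a rough Python sketch using the requests library. The URL below is just a hypothetical example page on brolik-temp.com - swap in real indexed URLs from the site: search - and it will of course only work if the expired domain still resolves.)

```python
# Rough check of the status code an old/test URL returns and where it points.
# Assumes the `requests` library; the URL is a hypothetical example page.
import requests

url = "http://brolik-temp.com/blog/"  # example only - use real indexed URLs
response = requests.get(url, allow_redirects=False, timeout=10)

print("Status code:", response.status_code)                      # want 301, not 302
print("Location:   ", response.headers.get("Location", "(none)"))
```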
-
Thank you, Paul. That was going to be my next question: what to do when the blog is clean.
Unfortunately, the devs are still frantically poring through the code hunting for the problem. Hopefully they find it soon.
-
Just a heads-up that you'll want to get this cleaned up as quickly as possible, Matthew. Time really is of the essence here.
Once this issue is recognised by the crawler as being widespread enough to trigger a warning in GSC, it can take MONTHS to get the hacked warning removed from the SERPS after cleanup.
Get the hack cleaned up, then immediately start submitting the main pages of the site to the Fetch as Google tool to get them recrawled and detected as clean.
I recently went through a very similar situation with a client and was able to get the hacked notification removed for most URLs within 3 to 4 days of cleanup.
Paul
-
Passed it on to the dev. Thanks for the response.
I'll let you know if they run into any trouble cleaning it up.
-
It is hacked - you just have to look at the page as Googlebot. Sadly, I have seen this before.
If you set your user agent to Googlebot, you will see a different page (see attached images below). Note that the title, H1 tags and content are updated to show info on how to Buy Zithromax. This is a JS insertion hack: when the user agent appears to be Googlebot, the script overwrites your content and inserts links to the spammers' pages to pass them link equity. This is very black hat, very bad and, yes, scary.
I use "User Agent Switcher" on FF to set my user agent - there are lots of other tools for FF and Chrome to do this. You can also run a spider on your site such as screaming frog and set the user agent to Googlebot and you will see all the changed H1s and title tags,
It is clever because "humans" will not see this, but the bots will, so it is hard to detect. Also, if you have multiple servers, only one of them may be compromised, so you may not see the changed content every time, depending on which server your load balancer sends you to. You may want to use Fetch as Google in Search Console to see exactly what Google sees.
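If it helps, below is a rough Python sketch of the same check run from a script (it assumes the requests library is available). The hard-coded URL is just the article from the question - run it against several pages, and several times, because of the load-balancer point above. Note it only catches cloaking keyed off the User-Agent string; if the hack also checks Googlebot's IP ranges, Fetch as Google is the only way to see it.

```python
# Rough sketch: fetch the same URL with a normal browser user agent and with a
# Googlebot user agent, then compare the <title> of each response. A mismatch
# is a strong hint of user-agent cloaking. Assumes the `requests` library.
import re
import requests

URL = "http://brolik.com/blog/should-you-strive-for-brand-saturation-in-your-marketing-plan/"

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def get_title(html):
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else "(no title found)"

titles = {}
for label, ua in USER_AGENTS.items():
    response = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    titles[label] = get_title(response.text)
    print(f"{label:>10}: {titles[label]}")

if titles["browser"] != titles["googlebot"]:
    print("Titles differ - the page is serving different content to Googlebot.")
```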
This is very serious, so show this to your dev and get it fixed ASAP. You can PM me if you need more information.
Good luck!