Articles marked with "This site may be hacked," but I have no security issues in Search Console. What do I do?
-
There are a number of blog articles on my site that have started receiving the "This site may be hacked" warning in the SERP.
I went hunting for security issues in the Search Console, but it indicated that my site is clean. In fact, the average position of some of the articles has increased over the last few weeks while the warning has been in place.
The problem sounds very similar to this thread: https://productforums.google.com/forum/#!category-topic/webmasters/malware--hacked-sites/wmG4vEcr_l0 but that thread hasn't been touched since February. I'm fearful that the Google Form is no longer monitored.
What other steps should I take?
One query where I see the warning is "Brand Saturation" and this is the page that has the warning: http://brolik.com/blog/should-you-strive-for-brand-saturation-in-your-marketing-plan/
-
Thanks, Paul. We started resubmitting the cleaned pages yesterday. I passed your comments about the Apache install and the old version of PHP to the devs as well.
At the very least, this is a great learning experience for us. It's great to have such a helpful community.
-
It looks like the devs have cleaned up most of the obvious stuff, Matthew, so I'd get to work resubmitting the pages that were marked as hacked but no longer show that issue.
Do make sure the devs keep working on finding and closing the attack vectors (or just bite the bullet and pay for a year of Sucuri cleanup and protection), but it's important to get those marked pages discovered as clean before too much longer.
Also of note: your site's server's Apache install is quite a bit out of date, and you're running a very old version of PHP that hasn't received even security updates in over a year. Those potential attack vectors need to be addressed right away too.
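If you want to double-check that yourselves, here's a minimal sketch (Python standard library only) of the kind of outside check I mean: many servers advertise their versions in the Server and X-Powered-By response headers. Those headers are often suppressed, so treat whatever comes back as a hint rather than proof.

```python
# Minimal sketch: see what server software a site advertises externally.
# Many servers expose versions in the Server / X-Powered-By headers;
# they can also be suppressed, so an empty result proves nothing.
import urllib.request

req = urllib.request.Request(
    "http://brolik.com/",
    headers={"User-Agent": "Mozilla/5.0"},
)
with urllib.request.urlopen(req, timeout=15) as resp:
    for header in ("Server", "X-Powered-By"):
        print(header, ":", resp.headers.get(header, "(not exposed)"))
```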
Good luck getting back into Big G's good graces!
Paul
P.S. An easy way to find the pages marked as hacked for checking/resubmission is a "site:" search, e.g. enter **site:brolik.com** into a Google search.
P.P.S. I also noticed that you have many pages from brolik-temp.com still indexed. The domain name just expired yesterday, but the indexed pages showed a 302 redirect to the main domain, according to the Wayback Machine. These should be 301s in order to help the pages eventually drop out of the SERPs. (And with 301s in place, you could either submit a "Change of Address" for that domain in Webmaster Tools/GSC or do a full removal request.) Either way, I wouldn't want those test domain pages to remain in the indexes.
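If it helps, here's a rough sketch (again, just the Python standard library) for checking what redirect status a URL actually returns without following it. The test URL below is only a placeholder, and since that domain has just expired the request may simply fail.

```python
# Rough sketch: report the redirect status (301 vs 302) a URL returns,
# without following the redirect. http.client never follows redirects,
# so the raw status code and Location header are easy to see.
# The URL below is only a placeholder -- substitute any still-indexed page.
import http.client
from urllib.parse import urlparse

def redirect_status(url):
    parts = urlparse(url)
    conn_class = (http.client.HTTPSConnection
                  if parts.scheme == "https"
                  else http.client.HTTPConnection)
    conn = conn_class(parts.netloc, timeout=15)
    conn.request("GET", parts.path or "/", headers={"User-Agent": "Mozilla/5.0"})
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

print(redirect_status("http://brolik-temp.com/some-old-page/"))
```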
-
Thank you, Paul. That was going to be my next question: what to do when the blog is clean.
Unfortunately, the devs are still frantically poring over the code hunting for the problem. Hopefully they find it soon.
-
Just a heads-up that you'll want to get this cleaned up as quickly as possible, Matthew. Time really is of the essence here.
Once this issue is recognised by the crawler as being widespread enough to trigger a warning in GSC, it can take MONTHS to get the hacked warning removed from the SERPS after cleanup.
Get the hack cleaned up, then immediately start submitting the main pages of the site through the Fetch as Google tool to get them recrawled and detected as clean.
I recently went through a very similar situation with a client and was able to get the hacked notification removed for most URLs within 3 to 4 days of cleanup.
Paul
-
Passed it on to the dev. Thanks for the response.
I'll let you know if they run into any trouble cleaning it up.
-
It is hacked; you just have to look at the page as Googlebot. Sadly, I have seen this before.
If you set your user agent to Googlebot, you will see a different page (see attached images). Note that the title, H1 tags, and content are changed to show information on how to buy Zithromax. This is a JS insertion hack: when the user agent appears to be Googlebot, the script overwrites your content and inserts links to their pages to help them gain links. This is very black hat, and yes, scary.
I use "User Agent Switcher" in Firefox to set my user agent; there are lots of other tools for Firefox and Chrome that do this. You can also run a spider such as Screaming Frog on your site with the user agent set to Googlebot, and you will see all the changed H1s and title tags.
It is clever because "humans" will not see this but the bots will, so it is hard to detect. Also, if you have multiple servers, only one of them may be affected, so you may not see the hacked version on every request, depending on which server your load balancer sends you to. You may want to use Fetch as Google in Search Console to see exactly what Google sees.
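If you'd rather script that check than switch user agents by hand, here's a rough sketch using only the Python standard library: it requests the page once with a normal browser user agent and once as Googlebot, then compares the title and H1. Because only one server behind the load balancer may be affected, run it a few times; and it only catches cloaking keyed off the User-Agent header, so Fetch as Google remains the definitive check.

```python
# Rough sketch: fetch the same URL as a normal browser and as Googlebot,
# then compare <title> and <h1>. A mismatch suggests user-agent cloaking.
# Only one server behind a load balancer may be affected, so run it a
# few times. (Crude regex parsing -- fine for a spot check.)
import re
import urllib.request

URL = ("http://brolik.com/blog/"
       "should-you-strive-for-brand-saturation-in-your-marketing-plan/")

USER_AGENTS = {
    "browser":   "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

def title_and_h1(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    html = urllib.request.urlopen(req, timeout=15).read().decode("utf-8", "ignore")
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)
    return (title.group(1).strip() if title else None,
            h1.group(1).strip() if h1 else None)

for name, ua in USER_AGENTS.items():
    print(name, title_and_h1(URL, ua))
```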
This is very serious; show it to your dev and get it fixed ASAP. You can PM me if you need more information.
Good luck!