Cloaking/Malicious Code
-
Does anybody have any experience with software for identifying this sort of thing?
I was informed by a team we are working with that our website may have been compromised and I wanted to know what programs people have used to identify cloaking attempts and/or bad code.
Thanks everybody!
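One quick self-serve check, before reaching for any product: fetch the page once with a Googlebot-style User-Agent and once with a normal browser User-Agent, and compare the two responses. A big divergence is the classic cloaking symptom. This is only a rough sketch; the user-agent strings and the 0.9 similarity threshold are illustrative assumptions, not anything official.

```python
import difflib
import urllib.request

def fetch_as(url, user_agent):
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def similarity(html_a, html_b):
    """Return a 0..1 ratio of how similar two HTML responses are."""
    return difflib.SequenceMatcher(None, html_a, html_b).ratio()

def looks_cloaked(url, threshold=0.9):
    """Flag the page if the bot view and the browser view diverge sharply."""
    bot_ua = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
              "+http://www.google.com/bot.html)")
    browser_ua = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
    return similarity(fetch_as(url, bot_ua), fetch_as(url, browser_ua)) < threshold
```

Dynamic pages (rotating ads, timestamps) will never be 100% identical between two fetches, so tune the threshold before trusting the flag.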
-
Damn... that is a HOT idea.
I feel like a detective!
-
Great, good luck with things. You might be able to use the time stamps on the files in conjunction with the server logs to determine when the modifications were made and how they were made.
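The timestamp idea can be scripted. A minimal sketch (Python, assuming you have shell access to the web root) that lists recently modified PHP files, newest first, so you can cross-check the timestamps against the access logs:

```python
import os
import time

def recently_modified(web_root, days=7, extensions=(".php",)):
    """List files under web_root modified within the last `days` days.

    Cross-check the hits against the server access logs: a POST to an
    odd URL at the same timestamp often reveals the upload vector.
    """
    cutoff = time.time() - days * 86400
    hits = []
    for dirpath, _dirnames, filenames in os.walk(web_root):
        for name in filenames:
            if name.lower().endswith(extensions):
                path = os.path.join(dirpath, name)
                mtime = os.path.getmtime(path)
                if mtime >= cutoff:
                    hits.append((path, mtime))
    return sorted(hits, key=lambda item: item[1], reverse=True)
```

One caveat: attackers sometimes reset mtimes to blend in, so treat a clean result as inconclusive rather than proof.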
-
Thanks!
I actually came across sucuri.net the other day in my own search. I wasn't sure what people's opinions were.
According to the team we are working with, the malicious files are Index.php and Hello.php.
Thanks again! I'm looking into it now!
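While waiting on a scan, a rough first pass is to sweep the web root for constructs that injected PHP backdoors commonly use (eval on base64_decode, gzinflate, requests fed straight into assert, and so on). The pattern list below is a sketch and will throw false positives, since legitimate code can use these too; treat matches as leads, not verdicts.

```python
import os
import re

# Constructs that often appear in injected PHP backdoors. Illustrative,
# not exhaustive, and legitimate code can trip them.
SUSPICIOUS = re.compile(
    r"eval\s*\(\s*(base64_decode|gzinflate|str_rot13)"
    r"|preg_replace\s*\([^)]*/e"
    r"|assert\s*\(\s*\$_(GET|POST|REQUEST)",
    re.IGNORECASE,
)

def scan_file(path):
    """Return the list of suspicious snippets found in one file."""
    with open(path, encoding="utf-8", errors="replace") as handle:
        return [m.group(0) for m in SUSPICIOUS.finditer(handle.read())]

def scan_tree(web_root):
    """Map each .php file under web_root to its suspicious matches."""
    report = {}
    for dirpath, _dirs, files in os.walk(web_root):
        for name in files:
            if name.lower().endswith(".php"):
                path = os.path.join(dirpath, name)
                matches = scan_file(path)
                if matches:
                    report[path] = matches
    return report
```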
-
If you think your site has been compromised, what I always use to check a site is https://sucuri.net/. I would also advise you to change all logins and passwords, and to update any CMS you are using to the latest stable version.
Related Questions
-
Hiding ad code from bots
Hi. I have a client who is about to deploy ads on their site. To avoid bots clicking on those ads and skewing data, the company would like to prevent any bots from seeing any ads, and, of course, that includes Googlebot. This seems like it could be cloaking, and I'd rather not have a different version of the site for bots. However, knowing that this will likely happen, I'm wondering how big of a problem it could be if they do this. This change isn't being made to manipulate Googlebot's understanding of the page (ads don't affect rankings, etc.), and it will have only a very minimal impact on the page overall. So, if they go down this road and hide ads from bots, I'm trying to determine how big of a risk this could be. I found some old articles discussing this, with some suggesting it was a problem and others saying it might be okay in some cases (links below), but I couldn't find any recent articles about this. Has anybody seen anything new, or does anyone have a new perspective to share on this issue? Is it a problem if all bots (including Googlebot) are unable to see ads? https://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful
White Hat / Black Hat SEO | Matthew_Edgar
https://www.webmasterworld.com/google/4535445.htm
https://www.youtube.com/watch?v=wBO-1ETf_dY
-
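For what it's worth, what's being proposed usually boils down to a server-side User-Agent check like the sketch below (Python, with an illustrative and deliberately incomplete bot pattern list). This is shown only to make the risk concrete: serving different markup based on a crawler User-Agent is precisely the "different version for bots" behavior the cloaking guidelines warn about.

```python
import re

# Common crawler User-Agent fragments. Illustrative only; real bot
# detection is harder, and spoofed User-Agents slip straight through.
BOT_PATTERN = re.compile(
    r"googlebot|bingbot|slurp|duckduckbot|baiduspider|yandex"
    r"|crawler|spider|bot/",
    re.IGNORECASE,
)

def should_serve_ads(user_agent):
    """Serve ads only to clients that do not look like crawlers.

    A missing User-Agent is treated as a bot, since well-behaved
    browsers always send one.
    """
    if not user_agent:
        return False
    return not BOT_PATTERN.search(user_agent)
```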
How does Google handle product detail page links hidden in a <noscript> tag?
Hello. During my research of our website I uncovered that our visible links to our product detail pages (PDPs) from grid/list-view category-nav/search pages are nofollowed and sent through a click-tracking redirect, with the PDP appended as a URL query string. But included with each PDP link is a <noscript> tag containing the actual PDP link. When I confronted our 3rd-party e-commerce category-nav/search provider about this approach, here is the response I received:
"The purpose of these links is to firstly allow us to reliably log the click and then secondly redirect the visitor to the target PDP. In addition to the visible links there is also an 'invisible link' inside the noscript tag. The noscript tag prevents showing of the a tag by normal browsers but is found and executed by bots during crawling of the page. Here is a link to a blog post where an SEO proved this year that the noscript tag is not ignored by bots: http://www.theseotailor.com.au/blog/hiding-keywords-noscript-seo-experiment/ So the visible links are not obfuscating the PDP URL; they have it encoded, as it otherwise cannot be passed along as a URL query string. The plain PDP URL is part of the noscript tag, ensuring discoverability of PDPs by bots."
Does anyone have anything, in addition to this one blog post, to substantiate the claim that hiding our links in a <noscript> tag is in fact within the SEO best-practice standards set by Google, Bing, etc.? Do you think this method skirts the fine line of grey-hat tactics? Will Google/Bing eventually penalize us for this? Does anyone have a better suggestion for how our 3rd-party provider could track those clicks without using a URL redirect and hiding the actual PDP link? All insights are welcome... Thanks! Jordan K.
White Hat / Black Hat SEO | eImprovement-SEO
-
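One way to audit a setup like this is to extract exactly which links sit inside <noscript> blocks and compare them with the visible ones. A small sketch using Python's built-in html.parser (the markup in the usage example is hypothetical):

```python
from html.parser import HTMLParser

class NoscriptLinkAudit(HTMLParser):
    """Collect href values of <a> tags that sit inside <noscript> blocks,
    so you can compare what crawlers may discover against the visible links."""

    def __init__(self):
        super().__init__()
        self.depth = 0            # nesting level of open <noscript> tags
        self.noscript_links = []  # hrefs found inside <noscript>

    def handle_starttag(self, tag, attrs):
        if tag == "noscript":
            self.depth += 1
        elif tag == "a" and self.depth:
            href = dict(attrs).get("href")
            if href:
                self.noscript_links.append(href)

    def handle_endtag(self, tag):
        if tag == "noscript" and self.depth:
            self.depth -= 1

def audit_noscript_links(html):
    """Return every href found inside <noscript> blocks in the given HTML."""
    parser = NoscriptLinkAudit()
    parser.feed(html)
    return parser.noscript_links
```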
Malicious bots
I was looking at some recommended keywords and felt sick to my stomach when I saw ilovevitaly.com search shell, resellerclub scam, and a few more showing up in my analytics referral data. I believe I have found the multiple IP addresses they're coming from, and when I say many I mean I found 200 or so. They're from different C blocks, so they're very difficult to block without also blocking legitimate traffic. I'm using a couple of different web application firewalls with the ability to block pretty much anything. Does anyone have any advice on doing this in a manner that might be more efficient than what I'm doing? I definitely do not want Google to think this is something that I did and penalize the site; that would be horrible. The site is going through Sucuri.net to be cleaned of any possible infection right now. I do not know how this happened, but zero-day attacks are unfortunately a very real reality, and it could've been a million things. Thanks a million guys, I appreciate your help.
Tom
White Hat / Black Hat SEO | BlueprintMarketing
-
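On the "200 addresses from different C blocks" problem: one way to shrink a firewall rule list is to collapse the addresses into the smallest set of covering CIDR blocks. A minimal sketch using Python's standard ipaddress module; be careful, since an overly broad block can catch legitimate traffic, which is exactly the concern above.

```python
import ipaddress

def collapse_ips(ip_strings):
    """Collapse individual IPs into the smallest set of CIDR blocks.

    Adjacent addresses merge into one range, so a rule list of 200
    entries can shrink to a handful. Review each resulting block
    before deploying it, so legitimate neighbors aren't swept in.
    """
    networks = [ipaddress.ip_network(ip) for ip in ip_strings]
    return [str(net) for net in ipaddress.collapse_addresses(networks)]
```

For example, four consecutive attacker IPs collapse into a single /30 rule.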
I need de-spam help/advice
For one of the sites I am working on, I outsourced SEO about 3 years ago. One of the "tricks" the SEO used at the time was to pay for several blog posts to be "sponsored" by this website using exact-match keywords for the domain. 1) Where do I look to determine the spammy links pointing to this site? 2) Have you had success getting rid of these bad links?
White Hat / Black Hat SEO | kadesmith
-
Mobile SEO best practices : Should my mobile website be located at m.domain.com or domain.com/mobile?
I'd like to know if there's any difference between using m.domain.com/pages or domain.com/mobile/pages for a mobile website? Which one is better? Why? Does Google treat the two differently? As you can see, I'm new to this! This is my first time working on a mobile website, so any links/resources would be highly appreciated. Thanks!
White Hat / Black Hat SEO | GroupeDSI
-
Showing pre-loaded content cloaking?
Hi everyone, another quick question. We have a number of different resources available for our users that load dynamically as the user scrolls down the page (like Facebook's Timeline) with the aim of improving page load time. Would it be considered cloaking if we had Google bot index a version of the page with all available content that would load for the user if he/she scrolled down to the bottom?
White Hat / Black Hat SEO | CuriosityMedia
-
Disqus integration and cloaking
Hey everyone, I have a fairly specific question on cloaking and whether our integration with Disqus might be viewed as cloaking. Here is the setup. We have a site that runs on Drupal, and we would like to convert the comment handling to Disqus for ease of our users. However, when JavaScript is disabled, the nice comment system and all of the comments from Disqus disappear. This obviously isn't good for SEO; however, the user experience with Disqus is way better than the native comment system. So here is how we are addressing the problem. With Drupal we can sync comments between the native comment system and Disqus. When a user has JavaScript enabled, the containing div for the native comment system is set to display:none, hiding the submission form and all of the content and instead displaying it through the Disqus interface. However, when JavaScript is not enabled, the native comment form and the comments will be available to the user. Could this be considered cloaking by Google? I know they do not like hidden divs, but it should be almost exactly the same content being displayed to the user (depending on when the last sync was run). Thanks for your thoughts, and if anyone has familiarity with a better way to integrate Drupal and Disqus, I am all ears. Josh
White Hat / Black Hat SEO | prima-253509