Cloaking/Malicious Code
-
Does anybody have any experience with software for identifying this sort of thing?
I was informed by a team we are working with that our website may have been compromised, and I wanted to know what programs people have used to identify cloaking attempts and/or malicious code.
Thanks everybody!
-
Damn... that is a HOT idea.
I feel like a detective!
-
Great, good luck with things. You might be able to use the timestamps on the files in conjunction with the server logs to determine when and how the modifications were made.
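A minimal sketch of that timestamp check: walk the web root and list files modified in the last few days, so the paths and times can be cross-referenced against the server's access logs. The `"."` path is a placeholder; point it at your site's actual document root.

```python
import os
import time

def recently_modified(root, days=7):
    """Return (mtime, path) pairs for files changed in the last `days` days,
    newest first."""
    cutoff = time.time() - days * 86400
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mtime = os.path.getmtime(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if mtime >= cutoff:
                hits.append((mtime, path))
    return sorted(hits, reverse=True)

# List everything under the current directory touched this week.
for mtime, path in recently_modified(".", days=7):
    print(time.ctime(mtime), path)
```

Note that attackers sometimes reset file timestamps to match their neighbors, so an unmodified-looking date doesn't prove a file is clean; the server logs are the more reliable record.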
-
Thanks!
I actually came across sucuri.net the other day in my own search. I wasn't sure what people's opinions were.
According to the team we are working with, the malicious files are Index.php and Hello.php.
Thanks again! I'm looking into it now!
-
If you think your site has been compromised, what I always use to check a site is https://sucuri.net/. I would also advise you to change all logins and passwords, and to update any CMS you are using to the latest stable version.
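Alongside a hosted scanner like Sucuri, you can do a quick local sweep yourself. This is a minimal sketch of a signature scan: the patterns below (`eval`, `base64_decode`, `gzinflate`, `str_rot13`) are common indicators of obfuscated injected PHP, not proof on their own, since legitimate code occasionally uses them too. The `"."` web-root path is a placeholder.

```python
import os
import re

# Common obfuscation functions seen in injected PHP malware.
SUSPICIOUS = re.compile(rb"(eval|base64_decode|gzinflate|str_rot13)\s*\(")

def scan_php(root):
    """Yield (path, line_number) for lines in .php files matching the patterns."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith(".php"):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as fh:
                    for lineno, line in enumerate(fh, start=1):
                        if SUSPICIOUS.search(line):
                            yield path, lineno
            except OSError:
                continue  # unreadable file; skip it

for path, lineno in scan_php("."):
    print(f"{path}:{lineno} matches a suspicious pattern")
```

Review each hit by hand before deleting anything; a match only tells you where to look.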
Related Questions
-
Whitehat site suffering from drastic & negative Keyword/Phrase Shifts out of the blue!
I am the developer for a fairly active website in the education sector that offers around 30 courses, publishes to its blog a few times a week, and maintains social profiles. The blog doesn't have comments enabled, and the typical visitor is looking for lessons or a course. Over the past year we have actively developed the site to keep it up to date, fast, and following modern best practices: SSL certificates, quality content, relevant and high-powered backlinks, etc. Around a month ago we were hit by quite a large drop in our ranked keywords/phrases, which shocked us somewhat. We attributed it to Google's algorithm change muddying the waters, as it did settle a couple of weeks later. However, this week we have been hit again by another large change, dropping almost 100 keywords, some by very large positions. My question is quite simple (I wish): what gives? I don't expect to see drops this large without doing anything negative, and I'm unsure it's an algorithm change, as my other clients on Moz don't seem to have suffered. So it's either isolated to this target area, or it's an issue with something occurring to or on the site?
White Hat / Black Hat SEO | snowflake740
-
On the use of Disavow tool / Have I done it correctly, or what's wrong with my perception?
On a site I used GSA Search Engine Ranker. I got good links out of it, but I also got 4,900 links from one domain. According to Ahrefs, as I understood it, 4,900 links from a single domain are worth about the same as one link from that domain. So I downloaded those 4,900 links and added 4,899 of them to the disavow tool, to keep my site's rankings stable and safe from any future penalty. Is that a correct way to use the disavow tool? The site's rankings are unchanged so far.
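For reference, Google's disavow file is plain text: one URL per line, or a `domain:example.com` line to disavow every link from that domain, with `#` lines treated as comments. A small sketch of building one (the domains and URLs below are hypothetical examples):

```python
def build_disavow(urls=(), domains=()):
    """Return the text of a disavow file covering the given URLs and domains."""
    lines = ["# Disavow file for upload to Google Search Console"]
    for domain in sorted(set(domains)):
        # A single domain: line covers all links from that domain at once,
        # so there is no need to list 4,899 individual URLs.
        lines.append(f"domain:{domain}")
    for url in sorted(set(urls)):
        lines.append(url)
    return "\n".join(lines) + "\n"

text = build_disavow(
    urls=["http://spammy.example/page1"],
    domains=["link-farm.example"],
)
print(text)
```

In the scenario above, one `domain:` line would be simpler and safer than enumerating nearly all of the individual URLs.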
White Hat / Black Hat SEO | AMTrends0
-
Is http://ezinearticles.com/ good or bad for backlinks?
Hi Everyone, Is http://ezinearticles.com/ any good to use? Thanks
White Hat / Black Hat SEO | vanplus0
-
Is there such thing as white hat cloaking?
We are near the end of a site redesign, and come to find out it's built in JavaScript and not search-engine friendly. Our IT team's fix is to show crawlable content to Googlebot and other crawlers via the user agent. I told them this is cloaking, and I'm not comfortable with it. They said that, after doing research, if the content is pretty much the same, it is an acceptable way to cloak. About 90% of the content will be the same between the regular user and the content served to Googlebot. Does anyone have any experience with this? Are there any recent articles or best practices on this? Thanks!
White Hat / Black Hat SEO | CHECOM0
-
EXPERT CHALLENGE: What link building strategies do YOU think will work after the latest 3/29/2012 Google algorithm change?
FOR ALL SEO THOUGHT LEADERS... What link building strategies do YOU think will work after the latest 3/29/2012 Google algorithm change? NOTE: My hope is that the responses left on this thread will ultimately benefit all members of the community and give recognition to the true thought leaders within the SEO space. That being said, my challenge is a two-part question:
1. With the 80/20 rule in mind, and in light of recent algorithm changes, what would YOU focus most of your SEO budget on if you had to choose? Let's assume you're in a competitive market (i.e. positions 1-5 on page 1 are competitors with 20,000+ backlinks, ranging from AC Rank 7 to 1). How would you split your total monthly SEO budget as a general rule? Ex) 60% link building / 10% onsite SEO / 10% social media / 20% content creation? I realize there are many "it depends" factors, but please humor us anyway.
2. Link building appears to have become harder and harder as Google releases more and more algorithm changes. For link building, the only truly white hat ways of proactively generating links (that I know of) are creating high-quality content that adds value to customers (i.e. infographics, videos, etc.), guest blogging, and press releases. The con to these tactics is that you are waiting for others to find and pick up your content, which can take a VERY long time, so ROI is difficult to measure and justify to clients or C-level management. That being said, how are YOU allocating your link building budget? Are all of these proactive link building tactics a waste of time now? I've heard it couldn't hurt to still do some of these, but what are your thoughts, and what is / isn't working for you? Here they are:
A. Using spun articles edited by US-based writers for guest blog content
B. 301 redirects
C. Social bookmarking
D. Signature links from blog commenting
E. Directory submissions
F. Video submissions
G. Article directory submissions
H. Press release directory submissions
I. Forum profile submissions
J. Forum signature links
K. RSS feed submissions
L. Link wheels
M. Building links (using ScrapeBox, SEnukeX, etc.) to pages linked to your money site
N. Links from privately owned networks (I spoke to an SEO company that claims to have over 4,000 unique domains, which he uses to boost rankings for his clients)
O. Buying contextual text links
All expert opinions are welcomed and appreciated 🙂
White Hat / Black Hat SEO | seoeric2
-
Attracta.com / "weekly submissions to top 100 search engines"
I recently received an offer from Attracta.com because I have a HostGator account. They are offering different levels of service for submitting XML sitemaps on a weekly basis. Is this a good idea? Thanks for your feedback! Will
White Hat / Black Hat SEO | WillWatrous0
-
Improve CTR with Special Characters in Meta-Description / Title Tags
I've seen this question asked a few times, but I haven't found a definitive answer. I'm quite surprised no one from Google has addressed the question specifically. I ran across this post the other day and it piqued my interest: http://www.datadial.net/blog/index.php/2011/04/13/special-characters-in-meta-descriptions-the-beboisation-of-google/ If you're able to make your result stand out by using stars, smiley faces, TM symbols, etc it would be a big advantage. This is in use currently if you search for a popular mattress keyword in Google. It really is amazing how the special characters draw your attention to the title. You can also see the TM and Copyright symbols if you search for "Logitech Revue" Radioshack is using these characters in their adwords also. Has anyone found any definitive answers to this? Has anyone tracked CTR and long-term results with special characters in title or description tags? Any chance of getting penalized for using this? As a follow-up, it looks like you could also put check symbols into your meta-description tags. That has all kinds of interesting possibilities. http://www.seosmarty.com/special-symbols-wingdings-for-social-media-branding-twitter-linkedin-google-plus/
White Hat / Black Hat SEO | inhouseninja0