Cloaking/Malicious Code
-
Does anybody have any experience with software for identifying this sort of thing?
I was informed by a team we are working with that our website may have been compromised, and I wanted to know what programs people have used to identify cloaking attempts and/or bad code.
Thanks everybody!
-
Damn... that is a hot idea.
I feel like a detective!
-
Great, good luck with things. You might be able to use the time stamps on the files in conjunction with the server logs to determine when the modifications were made and how they were made.
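A minimal sketch of the timestamp idea above: walk the web root, collect files modified after a given cutoff, then check your server's access logs around those times to see which requests touched them. The web-root path and cutoff in any real run are yours to fill in; nothing here is specific to a particular server setup.

```python
import os

def files_modified_since(root, cutoff_epoch):
    """Walk `root` and return paths whose modification time is newer
    than `cutoff_epoch` (seconds since the epoch)."""
    hits = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > cutoff_epoch:
                hits.append(path)
    return sorted(hits)
```

Once you have the suspicious paths and their mtimes, grep the access log for requests (especially POSTs) in that window; attackers' upload or injection requests often line up with the file timestamps, unless they reset the mtimes.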
-
Thanks!
I actually came across sucuri.net the other day in my own search. I wasn't sure what people's opinions were.
According to the team we are working with, the malicious files are Index.php and Hello.php.
Thanks again! I'm looking into it now!
-
If you think your site has been compromised, what I always use to check a site is https://sucuri.net/. I would also advise you to change all logins and passwords, and to update any CMS you are using to the latest stable version.
Related Questions
-
Paid Link/Doorway Disavow - disavowing the links between 2 sites in the same company.
Hello, Three of our client's sites are having difficulty because of past doorway/paid link activity, which we're doing the final cleanup on with a disavow. There are links between the sites. Should we disavow all the links between the sites? Thank you.
White Hat / Black Hat SEO | BobGW0
-
On the use of Disavow tool / Have I done it correctly, or what's wrong with my perception?
On one site I used GSA Search Engine Ranker. I got good links out of it, but I also got 4,900 links from a single domain. According to Ahrefs, one link from a domain counts for about as much as 4,900 links from that same domain. So I downloaded those 4,900 links and added 4,899 of them to the disavow tool, to keep my site's rankings stable and safe from any future penalty. Is that a correct way to use the disavow tool? The site's rankings are unchanged.
White Hat / Black Hat SEO | AMTrends0
-
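For reference, a disavow file is a plain-text list with one entry per line, and a single `domain:` line covers every link from that domain, so there is no need to list 4,899 URLs individually. A minimal sketch (the domain and URL below are placeholders, not real sites):

```text
# Disavow every link from the spam domain in one line
domain:spammydirectory.example

# Individual URLs can also be listed, one per line
http://spammydirectory.example/some-page.html
```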
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others.

The problem is that I want a solution that 1) is centrally managed for all sites (per-site administration takes too much time), 2) takes total server load into account instead of only one site's traffic, and 3) controls overall bot traffic instead of traffic from one bot. IMO, user traffic should always be prioritized above bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is still read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. So no solution covers all three of my problems.

Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a portion of the bot traffic. The portion, and which bots it applies to, can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other rule I invent), some of its requests will be answered with a 503 while others will get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? It will delay indexing, but slow server response times also hurt rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | internetwerkNU1
-
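A minimal sketch of the dynamic-503 idea described in the question above: a per-request decision that returns a 503 with a Retry-After header to known bots whenever the server's load average exceeds a threshold, and serves users normally. The bot tokens and the threshold are illustrative assumptions, not a recommendation.

```python
# Hypothetical bot user-agent tokens; extend to match the bots you see in logs.
BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot")

def throttle_decision(user_agent, load_avg, max_load=4.0):
    """Return (status_code, extra_headers) for one request.

    Bots get a 503 with Retry-After when the load average exceeds
    `max_load`; human traffic is always served normally (200)."""
    ua = (user_agent or "").lower()
    is_bot = any(token in ua for token in BOT_TOKENS)
    if is_bot and load_avg > max_load:
        return 503, {"Retry-After": "3600"}
    return 200, {}
```

In production the load figure could come from `os.getloadavg()` and the tuple be translated into your framework's response object. The key design point is that 503 (rather than 403 or 404) tells well-behaved crawlers the condition is temporary, so they back off and retry instead of dropping the URL.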
The purpose of these algo updates: to push eCommerce sites more harshly toward PPC, while letting normal blogs/forums reclaim organic search positions?
Hi everyone, This is my first post here, and I'm absolutely loving the site and the services. A quick background: I have dabbled in SEO in the past, have been reading up over the last few months, and am amazed at the speed at which things are changing. I currently do SEO work for a couple of clients, and a medium-sized oak furniture eCommerce site has enquired about SEO services.

All the major changes, the devaluing of spam links and link networks, the penalization of overused exact-match anchor text, the overall encouragement of earned links (often via content marketing) over built links, plus the (not provided) section in Google Analytics and the increasing screen real estate PPC gets over organic search, all point me toward one major conclusion: the search engine is trying to push eCommerce sites, and sites that sell things, harder toward PPC and paid advertising, while allowing blogs/forums and informational sites to more easily reclaim the organic part of the search results. This is elaborated on a bit more below.

POINT 1 Firstly, as built links (article submissions, press releases, infographic submissions, web 2.0 link building, etc.) rapidly lose their effectiveness, Google is placing more emphasis on sites earning links instead, by producing amazing, interesting, unique content that people want to link to. Surely Google is aware that it is much harder for eCommerce sites to produce a constant stream of link-worthy content around their niche (especially if it's a niche that not a lot can be written about). Although earning links is not impossible for eCommerce sites, for many of them it is more difficult, because creating link-worthy content is not what eCommerce sites were originally intended for, whereas blogs and forums were built for exactly that purpose.
Therefore the search engines must know that it is a lot easier for normal blogs/forums to "earn" links through content, leading to them reclaiming more of the organic rankings for transactional and non-transactional terms, and forcing eCommerce sites to adopt PPC more heavily. POINT 2 Add to the mix the fact that, for the terms most relevant to eCommerce sites, the results page allocates more space above the fold to PPC ads than to organic results, and that Google has limited the data sites can see about which keywords people used to arrive, which affects eCommerce sites more, since it makes it harder for them to see which keywords result in sales. This is further evidence that Google is backing eCommerce sites into a corner by making organic sales harder to track and make sense of, compared with PPC, where data is still plentiful. Conclusion Are the above just exaggerations? Can most eCommerce sites still achieve a good percentage of sales from organic search despite all this? If so, what do the more niche eCommerce sites do to "earn" links when content topics are thin and unique outreach destinations are exhausted quickly? Do they accept that they are in the business of selling things, and so should pay for their traffic, as opposed to normal blogs/forums, which are not? Or is there still a place for them to get even more creative with content and acquire earned links? And finally, is the emphasis on earned links overplayed? I'd really appreciate your thoughts on this.
White Hat / Black Hat SEO | sanj50500
-
Mobile SEO best practices : Should my mobile website be located at m.domain.com or domain.com/mobile?
I'd like to know if there's any difference between using m.domain.com/pages or domain.com/mobile/pages for a mobile website? Which one is better? Why? Does Google treat the two differently? As you can see, I'm new to this! This is my first time working on a mobile website, so any links/resources would be highly appreciated. Thanks!
White Hat / Black Hat SEO | GroupeDSI0
-
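Either URL pattern can work; what matters, per Google's guidance for separate mobile URLs, is annotating the relationship between the desktop and mobile versions so the crawler treats them as one page. A minimal sketch, with example.com as a placeholder domain:

```html
<!-- On the desktop page (http://www.example.com/page): -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the corresponding mobile page (http://m.example.com/page): -->
<link rel="canonical" href="http://www.example.com/page">
```

With these annotations in place, Google consolidates signals onto the desktop URL and can serve the mobile URL to mobile searchers, regardless of whether you chose a subdomain or a subdirectory.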
Attracta.com / "weekly submissions to top 100 search engines"
I recently received an offer from Attracta.com because I have a HostGator account. They are offering different levels of service for submitting XML sitemaps on a weekly basis. Is this a good idea? Thanks for your feedback! Will
White Hat / Black Hat SEO | WillWatrous0
-
Are paid reviews gray/black hat?
Are sites like ReviewMe or PayPerPost white hat? Are follow links allowed within the post? Should I use the aforementioned services, or cold-contact high-authority sites within my niche?
White Hat / Black Hat SEO | 10JQKAs0
-
Which of these elements are good / bad link building practices?
Hi, I need some help. I recently got some help with an SEO project from a contractor. He did 50 directory submissions and 50 article submissions. I got good results, going up about 20 places (still a long way from the first page!) on google.co.uk for a tough keyword. Since this project I have learned that article marketing is not cool, so I am wondering what I should do next. The contractor has proposed a new, bigger project consisting of the elements listed below. I don't know which of these elements are OK and which aren't. If they are not OK, are they: 1) a waste of time, or 2) something I could get penalized for? Let me know what you think. Thanks, Andrew
100 ARTICLE SUBMISSIONS [APPROVED ARTICLES] -> 1 article submitted to 100 article directories
50 PRESS RELEASE SUBMISSIONS [APPROVED & SCREENSHOTS] -> 1 PR written & submitted to the top 50 PR distribution sites
150 PRIVATE BLOG SUBMISSIONS [APPROVED ARTICLES] -> 1 article submitted to 150 private blogs
100 WEBSITE DIRECTORY SUBMISSIONS -> 1 URL (home page) submitted to the top 100 free web directories
50 SOCIAL BOOKMARKING [CONFIRMED LINKS] -> 1 URL of the site submitted to the top 50 social bookmarking websites
40 PROFILE BACK-LINKS [CONFIRMED LINKS] -> 1-3 URLs of the site submitted to create 40 profile websites
50 SEARCH ENGINES -> submission to all the major search engines
20 NEWS WEBSITES -> ping all links from reports to news websites
White Hat / Black Hat SEO | fleurya0