Familiar with the malware reinclusion process?
-
One of our sites was hacked, and at the moment I'm thinking the way in was an out-of-date paid WP plugin using the old version of timthumb.
While not important to my question, the hack dropped .htaccess files into all the /uploads/ folders to redirect visitors to a site (tonycar dot com), which I assume installed some sort of malware or spyware.
I've changed all FTP and admin logins, updated the timthumb files, and deleted all the rogue .htaccess files; for good measure I've also made the upload folders read-only for now.
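In case it helps anyone cleaning up the same kind of infection, that .htaccess sweep and permission change can be scripted. Here's a minimal Python sketch, assuming a standard wp-content/uploads path (adjust for your own install); it runs as a dry run first so you can review what it would delete:

```python
import os
import stat

# Assumed location of the WordPress uploads directory; adjust to your install.
UPLOADS_DIR = "/var/www/example.com/wp-content/uploads"

def clean_uploads(root=UPLOADS_DIR, dry_run=True):
    """Report (and optionally delete) rogue .htaccess files, then make folders read-only."""
    for dirpath, _dirnames, filenames in os.walk(root):
        if ".htaccess" in filenames:
            target = os.path.join(dirpath, ".htaccess")
            print("Would remove:" if dry_run else "Removing:", target)
            if not dry_run:
                os.remove(target)
        if not dry_run:
            # r-x for everyone: the web server can still read and serve files,
            # but nothing new can be written into the folder.
            os.chmod(dirpath, stat.S_IRUSR | stat.S_IXUSR |
                              stat.S_IRGRP | stat.S_IXGRP |
                              stat.S_IROTH | stat.S_IXOTH)

if __name__ == "__main__":
    clean_uploads(dry_run=True)  # flip to False once the report looks right
```

Bear in mind that a read-only uploads folder also blocks legitimate media uploads, so the write permissions need to come back once you're confident the hole is closed.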
I've requested a review through Webmaster Tools, and the image WMT originally flagged is no longer listed as an issue. That is, when I first clicked on the malware warning in WMT it told me imagex.jpg was a problem; now it doesn't list anything as a problem, yet the malware warning itself still persists.
As I no longer have any indication of what (if anything) is wrong, I tried going through some contacts at AdWords, to no avail, though they did say there's a note stating there's currently no malware on the site (I'm hoping that note came from them and not just from my reinclusion request).
Assuming the almighty G is now satisfied there's no malware on the site (or being served through it), does anyone have any idea how to get rid of the warning?
Alternatively, if the warning is accurate, how can I find out what's being affected?
-
It's a waiting game at this point. If they don't report new problems, ask for reinclusion again. Wait 24 hours between asking for reinclusion and checking whether Google reports new problems.
-
If Google's stopped telling me what the problem files are, any idea how to find out what they are seeing?
I think I've plugged the problem and removed the suspicious files, but I can't really be sure.
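In the meantime, one rough way to approximate what Google might be seeing is to fetch my own pages with a Googlebot user-agent and grep the HTML for suspicious markup, since some infections only serve the injected code to crawlers. A hedged Python sketch; the URLs and patterns are just placeholders, and it relies on the third-party requests library:

```python
import re
import requests  # third-party: pip install requests

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Placeholder URLs; swap in the pages you actually care about.
PAGES = ["http://www.example.com/", "http://www.example.com/blog/"]

# Crude signatures often found in injected markup; extend as needed.
SUSPICIOUS = [
    r"<iframe[^>]+display\s*:\s*none",   # hidden iframes
    r"eval\(base64_decode",              # obfuscated PHP echoed into the page
    r"document\.write\(unescape",        # obfuscated JavaScript droppers
]

for url in PAGES:
    html = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=15).text
    hits = [p for p in SUSPICIOUS if re.search(p, html, re.IGNORECASE)]
    print(url, "->", hits if hits else "nothing obvious")
```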
-
I ran into an issue with malware once and Google was very responsive during the process. Each time I asked for reinclusion the request was responded to within 24 hours.
I say "each time" because this particular piece of malware infected random files across an entire dedicated server hosting a great deal of websites. After I became aware that the problem was impossible to solve manually, I wrote a script to detect and remove all traces of the malware. At this point it was my 5th request I believe, and there was no problem with Google approving my request.
There are scanners you can use, but when I looked into them I didn't find any reliable free ones. Hopefully you got it all and won't need to pay for anything.
Wonderful people, these malware creators. Best of luck.
-
It should go away on its own once you've removed all the offending malware code from your site.
Call your hosting company and they will scan your site and remove the malware for you. A lot of people don't know that their hosting company will be more than happy to assist in removing hacks or viruses from their sites at no charge. If you're still getting the message days later, the malware is probably still on your site.
Related Questions
-
Client suffered a malware attack. Removed links not being crawled by Google!
Hi all, My client suffered a malware attack a few weeks ago where an external site somehow created 700-plus links on my client's site with their content. I removed all of the content and redirected the pages to the home page. I then created a new temporary XML sitemap with those 700 links and submitted the sitemap to Google 9 days ago. Google has crawled the sitemap a few times but not the individual links. When I click on the crawl report for the sitemap in GSC, I see that the individual links still show the last crawled date from before they were removed. So in Google's eyes, that old malicious content still exists. What do I do to ensure Google knows the content is gone and redirected? Thanks!
Technical SEO | | sk19900 -
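A quick sanity check for a setup like the one above is to confirm that every URL in the temporary sitemap really does return the redirect you expect before waiting on Google. A rough Python sketch; the sitemap URL is a placeholder and it uses the third-party requests library:

```python
import xml.etree.ElementTree as ET
import requests  # third-party: pip install requests

SITEMAP_URL = "http://www.example.com/removed-urls.xml"  # placeholder sitemap location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=15).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    resp = requests.get(url, allow_redirects=False, timeout=15)
    # Expect a 301 pointing at the home page for each removed URL.
    print(url, resp.status_code, resp.headers.get("Location", ""))
```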
GSC: Change of Domain Not Processed, Despite Saying "Approved"?
Hi folks, I've just completed a straightforward olddomain -> newdomain migration. All the redirects were done on 7th Feb, and I submitted the change of domain request on 7th Feb. All seemed fine, as can be seen in the attached. It's now 19th March and our pals at GSC are still saying that the domain migration is ongoing. I've never had this take so long before; 2-3 days tops. Their results are tanking as I can't geo-target, and more features in GSC are out of action as it's 'locked' due to this migration (I just get a screen as per the attached). Thoughts? Shall I risk withdrawing the request and starting anew? The old "turn it off and on again"? Thanks!
Technical SEO | | tonyatfat0 -
What is the process for allowing someone to publish a blog post on another site? (duplicate content issue?)
I have a client who allowed a related business to use a blog post from my client's site and repost it on the related business's site. The problem is the post was copied word for word. There is an introduction and a link back to the website, but not to the post itself. I now manage the related business as well, so I have creative control over both websites as well as SEO duties. What is the best practice for this type of blog post syndication? Can the content appear on both sites?
Technical SEO | | donsilvernail0 -
Best SEO service/process to harness the power of quality backlinks?
What/who would you recommend for those looking for a strategy around realizing the benefits of high-quality backlinks? We have tons of earned links from DA 90+ sites, but don't think we are realizing the full benefit due to on-site issues. We have scraper sites outranking us. Would it be a technical on-page audit? Any guidance appreciated.
Technical SEO | | loveit0 -
404 or 503 Malware Content ?
Hi folks, When it comes to malware: if I have a site that uses iframes to show content from 3rd-party sites, which at times gets infected, would you recommend 404ing or 503ing the pages with the iframe until the issue is resolved? (I'm inclined to use 503.) Then, do I take the 404/503 off and ask for a reindex (from the GWT malware section), OR ask for a reindex as soon as the 404/503 goes up? (I do understand we'd be asking Google to index a non-existent page, but the malware warning gets removed.) PS: it makes sense for this business to showcase content using iframes on these special pages; I do understand this isn't the best way to go about SEO.
Technical SEO | | Saijo.George0 -
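If the 503 route is chosen, the detail that matters is sending a Retry-After header so crawlers treat the outage as temporary rather than permanent. A minimal sketch in Python using Flask; the route and retry window are placeholders:

```python
from flask import Flask  # third-party: pip install flask

app = Flask(__name__)

# Placeholder route for the pages that embed the third-party iframe content.
@app.route("/widgets/<page>")
def widget_page(page):
    # 503 plus Retry-After signals a temporary outage, so the URL isn't
    # treated as gone while the malware issue is being cleaned up.
    return "Temporarily unavailable", 503, {"Retry-After": "86400"}

if __name__ == "__main__":
    app.run()
```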
Has anyone seen direct improvement after April 23 by requesting reinclusion?
Using Open Site Explorer, I have figured out that my former SEO agency was buying name spam (mostly Asian sites) for my main keywords, and did the same in a private network of blogs. I don't speak any Eastern languages and the SEO Super Dude has left the planet, so... I don't really have much to report to the Google Webmaster folks. How much time, effort, and cash do I invest in removal requests vs. redoing the whole darn site and hoping for the best? All the best, Tom
Technical SEO | | tvw1300 -
Automate process for naming page titles?
Hi everyone, I'm new to the Moz community, but really loving it. I'm hoping some of you more experienced experts may be able to answer what is probably a pretty basic question. I'm working for a non-profit client and I used the SEOMOZ tool to run a report on their site errors. I learned that the website has 5,432 duplicate page titles. They had their website redesigned last year before I started working with them, and it appears that the developers didn't take into account the need to give each page a unique title. While it may make sense to go in and add custom page titles to a handful of the pages on the site, the vast majority of the pages are allocated to products (e-commerce). Is there a script that the developers could add that would automatically generate unique page titles based on, say, the title of the product? Here are a few URLs to help you get a sense of what we're dealing with. homepage: www.creativityexplored.org example level two page: http://www.creativityexplored.org/artists/douglas-sheran example product page: http://www.creativityexplored.org/shop/original-art/prints/2776/profile-of-a-lady Thank you so much for any advice you can offer. Best, Linda
Technical SEO | | LindaSchumacher0 -
How to automate the process of checking search operators
How can I automate the process of checking search operator queries such as:
inurl:?lang=ru site:tochka.net
inurl:print site:tochka.net
inurl:print site:tochka.net/*
inurl:nomobile=1 site:tochka.net/*
inurl:comments site:tochka.net
inurl:comments site:tochka.net/*
inurl:?a_aid site:tochka.net
...along with the number of pages each query returns in the search results. Is there a program that can do this?
Technical SEO | | meteorr0