The Website Ahead Contains Malware! - How do I deal with this?
-
Hello,
I got this warning in Webmaster Tools:
The Website Ahead Contains Malware!
When I go to the website I can't see the site; I get a page that says the same thing, a malware problem.
I searched Google for information about this, and it looks like someone hacked our website and installed something in the code. Does anyone have experience with this?
How can I fix this? It's a very big website... Need help!
Thank you
-
Something like that happened to me, and I installed the OSE Firewall: http://wordpress.org/plugins/ose-firewall/
It worked better than Sucuri, which is paid, although Sucuri does give you support.
Also try talking to your hosting provider; they can help you find strange files and do a checkup on the server and database.
-
There's another service, provided by McAfee, that scans your site for possible vulnerabilities and provides a badge ONLY if your site is proven to be safe and you fix any issues they find. I can't remember the price, but I used it in the past, they scan the entire site daily, and I totally recommend it.
-
Edmond - No, I didn't use their service, as it was quite expensive and I figured I'd be able to fix it myself in 2-3 hours of cleaning (which I was able to do). Let us know how the process goes...
-- Jeff
-
I noticed that http://sitecheck.sucuri.net/scanner/ has a service that can clean the site.
Did you use it?
-
Hi Edmond,
First, I'd suggest re-uploading all your files from a clean copy.
Then you can do some damage assessment and find out where the hack is and how they got in.
Let's start with that: re-upload.
If your site uses a CMS, update the CMS right away. If it's WordPress, I'd also suggest using a service like CloudFlare with a PRO account and enabling the WAF (web application firewall) to help prevent another infection while you're still fixing things. The WAF offers dozens of fixes for WordPress loopholes that an attacker may use to gain access.
If your site doesn't use a CMS, you need to find out how they accessed your site; the usual causes are SQL injection or a brute-force attack to get FTP/DB access.
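On the SQL injection point: the application-side fix is parameterized queries instead of string concatenation. A minimal illustration in Python with sqlite3 (the table and column names here are made up, purely to show the difference):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable: the payload becomes part of the SQL and the OR clause matches every row.
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '%s'" % user_input
).fetchall()

# Safe: the driver treats the whole payload as a literal string value, so nothing matches.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
```

The same principle applies whatever language your site is written in: never splice user input into SQL text yourself.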
Cheers! See also the Google Webmaster Help pages on hacked sites.
-
Edmond -
I've had this happen in the past, and it can take a while to resolve.
First, go through the site and find out where all of the malware has been uploaded. It may be in new files or added to existing files, and it could also be in the database if it's a database-driven site. I'd recommend going through EVERY file tree via FTP and looking for odd files. Don't rely only on timestamps, though, as some malware can be uploaded without leaving a "recent" timestamp.
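To give a flavor of that file sweep, here's a hypothetical sketch in Python (the `./public_html` path is a placeholder, and the pattern list is just a handful of obfuscation markers commonly seen in injected PHP, not a real signature database):

```python
import os
import re

# Obfuscation markers often found in injected PHP malware (illustrative, not exhaustive).
SUSPICIOUS = re.compile(rb"eval\s*\(|base64_decode\s*\(|gzinflate\s*\(|str_rot13\s*\(")

def scan_tree(root):
    """Yield paths of files whose contents match a suspicious pattern."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    if SUSPICIOUS.search(f.read()):
                        yield path
            except OSError:
                pass  # unreadable file; skip it

if __name__ == "__main__":
    for hit in scan_tree("./public_html"):
        print(hit)
```

Anything this flags still needs a human look, since legitimate plugins sometimes use these functions too.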
If you haven't changed anything recently, you may be able to just restore the site from an older backup. That said, sometimes the malware has been sitting on the server, and it hasn't been discovered until recently.
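If you do have a clean backup, comparing file hashes between the backup and the live copy quickly narrows down exactly what was added, removed, or modified. A minimal sketch (the two directory paths are whatever you downloaded the backup and live site to):

```python
import hashlib
import os

def hash_tree(root):
    """Map each relative file path under root to its SHA-256 digest."""
    digests = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            with open(path, "rb") as f:
                digests[rel] = hashlib.sha256(f.read()).hexdigest()
    return digests

def diff_trees(backup_root, live_root):
    """Return (added, removed, modified) relative paths between two trees."""
    backup, live = hash_tree(backup_root), hash_tree(live_root)
    added = sorted(set(live) - set(backup))
    removed = sorted(set(backup) - set(live))
    modified = sorted(p for p in set(backup) & set(live) if backup[p] != live[p])
    return added, removed, modified
```

Files in "added" and "modified" are the first places to look for injected code; just remember the backup itself may already contain the malware if the compromise is old.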
Your hosting company may be able to help.
Once the site is cleaned up, use a third party scanner to make sure it's clean. I've used this one in the past: http://sitecheck.sucuri.net/scanner/
Then, through Google Webmaster Tools, request a new scan. This can take several long days.
While the site is down/inaccessible, consider posting to social media accounts like Facebook, Twitter, LinkedIn, Google+, etc., to let people know you're working on the issue and that it should be resolved quickly.
And then wait... until the restriction is lifted...