How to Clean Up the Google Index After a Website Has Been Hacked
-
We have a client whose website was hacked, and some troll created thousands of Viagra pages, which were all indexed by Google. See the screenshot for an example. The site has been cleaned up completely, but I wanted to know if anyone can weigh in on how we can clean up the Google index. Are there extra steps we should take? So far we have gone into Webmaster Tools and submitted a new sitemap.
-
As has been suggested, you can request the removal of pages in GWMT, and you should keep any WordPress site and its plugins up to date.
To add to this, you might want to look at something like Cloudflare as an extra layer to protect your client's site. We've been using it for a year now, and it's made a massive difference to both performance and security.
-
It doesn't have to be so tedious. If you have a list of URLs, you can use the bulk removal extension for Chrome found here: https://github.com/noitcudni/google-webmaster-tools-bulk-url-removal
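For context on assembling that list: the extension consumes a plain list of URLs (check its README for the exact file format it expects). A minimal Python sketch that filters a crawl export down to the spam URLs and writes them out, one per line — the file name and the spam pattern here are assumptions for illustration, not part of the extension:

```python
import csv
import re

def filter_spam_urls(urls, pattern):
    """Return only the URLs matching the (assumed) spam pattern."""
    rx = re.compile(pattern, re.IGNORECASE)
    return [u for u in urls if rx.search(u)]

def write_removal_csv(urls, path):
    """Write one URL per row, a common single-column CSV layout."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for u in urls:
            writer.writerow([u])

# Hypothetical crawl export; in practice this would come from
# Screaming Frog, server logs, or a site: search scrape.
urls = [
    "https://example.com/about",
    "https://example.com/cheap-viagra-online",
    "https://example.com/buy-viagra-now",
]
spam = filter_spam_urls(urls, r"viagra")
write_removal_csv(spam, "removals.csv")
```

You would then load the resulting file into the extension and let it submit the removal requests for you.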
-
If you submitted a new sitemap, then you should be fine, but "in case of emergency" you can try listing the individual URLs in Google Webmaster Tools > Google Index > Remove URLs. It's a tedious process, but something else you can try.
Sucks, doesn't it? Wordfence or another WP security plugin is well worth the time to install and monitor. Good luck.
-
Hi,
Just to let you know, it's quite easy to find the domain of the site in question from the screenshot you provided.
As you've created a new sitemap, submitted it to Google, and removed all the pages so they return 404s, you should be fine; the pages will just drop out of the index in time. You might also want to add the address of your XML sitemap to your robots.txt file, as it's not there at present.
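On that last point, the sitemap location goes in robots.txt via a Sitemap directive. A minimal sketch — the domain and sitemap path here are placeholders, not the actual site's:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap line can appear anywhere in the file and must use the full absolute URL.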
I crawled the site using Screaming Frog and couldn't see any dodgy-looking pages remaining. If you'd like to take an extra step, you could create removal requests for the pages in Webmaster Tools, under Google Index > Remove URLs. To prevent this from happening again, make sure you keep your WordPress installation and all plugins up to date.
Related Questions
-
Website blog is hacked. What's the best practice to remove bad URLs?
Hello. Our site was hacked, which created a few thousand spam URLs on our domain. We fixed the issue, and all the spam URLs now return 404. The Google index still shows a couple of thousand bad URLs. My questions are:
What's the fastest way to remove the URLs from the Google index? I created a sitemap with some of the bad URLs and submitted it to Google. I am hoping Google will recrawl them (as they are in the sitemap) and remove them from the index, since they return 404.
Any tools to get a full list of the Google index? (Search Console downloads are limited to 1,000 URLs.) A Moz site crawl gives a larger list, but it includes URLs that are not in Google's index. I'm looking for a tool that can download results from a site: search.
Any way to remove the URLs from the index in bulk? Removing them one by one will take forever.
Any help or insight would be very appreciated.
Technical SEO | ajiabs1
-
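The "dead-URL sitemap" approach described above (submitting a sitemap containing only the removed URLs, so Google recrawls them and sees the 404s) just needs a standard sitemap file. A rough sketch of generating one with Python's standard library — the URLs here are placeholders:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml string from a list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        # <loc> is the only required child element per the sitemap protocol.
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# Placeholder spam URLs; in practice, the list extracted from logs or a crawl.
spam_urls = [
    "https://example.com/spam-page-1",
    "https://example.com/spam-page-2",
]
sitemap_xml = build_sitemap(spam_urls)
```

The resulting XML would be saved somewhere crawlable and submitted in Search Console as a temporary, separate sitemap, then deleted once the URLs have dropped out.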
Can a redirecting URL also show up in Google rankings, even higher than the original website?
For example, if I create URL B, which redirects to website A, and do good SEO on URL B, can URL B rank higher than my original website A?
Technical SEO | HealthmateForever0
-
Website Down
Hello guys, my website hasn't been reachable for a couple of hours today, and I can't really understand why: no links have been built, and all the best practices have been followed regarding on-page optimization. I also checked Google Webmaster Tools, and there are no warning messages, crawl problems, or anything, so I don't understand why this happened. Now, for some reason, the website is up and running again.
Technical SEO | PremioOscar1
-
Best way to fix a whole bunch of 500 server errors that Google has indexed?
I got a notification from Google Webmaster Tools saying that they've found a whole bunch of server errors. It looks like it is because an earlier version of the site I'm doing some work for had those URLs, but the new site does not. In any case, there are now thousands of these pages in their index that error out. If I wanted to simply remove them all from the index, which is my best option?
Disallow all 1,000 or so pages in the robots.txt?
Put the meta noindex in the headers of each of those pages?
Rel canonical to a relevant page?
Redirect to a relevant page?
Wait for Google to just figure it out and remove them naturally?
Submit each URL to the GWT removal tool?
Something else?
Thanks a lot for the help...
Technical SEO | jim_shook0
-
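One subtlety worth noting about the options listed above: a robots.txt Disallow stops Googlebot from recrawling those URLs at all, so it will never see a meta noindex placed on them — pick one mechanism or the other, not both. If the pages can still render, the noindex route is a single tag in the page head (a generic snippet, not specific to this site):

```html
<meta name="robots" content="noindex">
```

The same directive can also be sent as an `X-Robots-Tag: noindex` HTTP response header, which works for non-HTML files too. And where the old URLs simply no longer exist, letting them return 404 or 410 and waiting is also a valid path: Google drops persistently dead URLs from the index on its own.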
Website Migration - Very Technical Google "Index" Question
This is my understanding of how Google's search works, and I am unsure about one thing specifically:
Google continuously crawls websites and stores each page it finds (let's call it the "page directory").
Google's "page directory" is a cache, so it isn't the "live" version of the page.
Google has separate storage called "the index," which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords.
When someone searches a keyword, that keyword is accessed in the "index" and returns all relevant pages in the "page directory."
These returned pages are given ranks based on the algorithm.
The one part I'm unsure of is how Google's "index" connects to the "page directory." I'm thinking each page has a URL in the "page directory," and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as the live website? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that URL in Google's cache?
The reason I ask is that I am starting to work with a client who has a newly developed website. The old website's domain and files were located in a GoDaddy account. The new website's files have completely changed location and are now hosted in a separate GoDaddy account, but the domain has remained in the same account. The client has set up domain forwarding/masking to access the files in the separate account. From what I've researched, domain masking and SEO don't get along very well. Not only can you not link to specific pages, but if my above assumption is true, wouldn't Google have a hard time crawling and storing each page in the cache?
Technical SEO | reidsteven750
-
Lots of Pages Dropped Out of Google's Index?
Until yesterday, my website had about 1,200 pages indexed in Google. I made lots of changes: removed low-quality content, rewrote passable content to make it better, wrote high-quality content, got lots of likes and shares on social networks, etc. Now this morning I see that out of 1,252 pages submitted, only 691 are indexed. Is that a temporary situation related to the recent updates? Is anyone else seeing this? How should I interpret this?
Technical SEO | sbrault740
-
Website is not indexed in Google
Hi guys, I have a problem with a customer's website. His website is not indexed in Google (except for the homepage). I could not find anything that could possibly be the cause. I already checked the robots.txt, the sitemap, and the plugins on the website. In the HTML code, I also couldn't find anything that would make indexing harder than usual. This is the website I am talking about: http://www.xxxx.nl/ (Dutch). The only thing I can guess now is the Google sandbox, but even that is quite unlikely. I hope you guys discover something I could not find! Thanks in advance 🙂
Technical SEO | B.Great0
-
How to attach alt text to images that other websites use from my website
I often have other websites link to my website. They will do this with an image that they pull off of my website (actually, my website continues to serve the image). These inbound links are great, but they don't have alt text. Is there a way for me to attach alt text to the images, or is this something the other websites need to code themselves?
Technical SEO | EugeneF0