Site was infected with spam; Webmaster Tools is still reporting it
-
I have recently been working with a site that was hacked: it suffered a pharma injection into Joomla. The site has been clean for several months, but WMT is still reporting "pharmacy" as occurring 421 times. The URL it cites returns a 500 error, and I have also removed it in Google. Can this still be hurting the site? How can I clean this up?
-
Hi John,
Question: when you say WMT reports 421 instances of "pharmacy," where does it report this? In your inbound anchor text, your content keywords, or somewhere else?
Also, does Google report any malware on your site? I couldn't tell from the question.
Regardless, if you're unsure about any lingering hack, I'd use a tool like Sucuri to run a site check. If your site is indeed clean, I wouldn't worry too much about the latent data in Webmaster Tools. But if your site was infected for a long time, you can try filing a reconsideration request, which might get a set of human Google eyeballs on the site. If the site is clean, you can also perform a "Fetch as Google" and submit all linked pages to the index.
Hope this helps! Best of luck.
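One thing worth ruling out before trusting any manual check: pharma hacks often cloak, serving the injected spam only to search-engine crawlers so the site looks clean in a normal browser. A minimal sketch of a cloaking check, using only the Python standard library (the URL is a placeholder for the page WMT flagged), fetches the same page with a Googlebot and a regular browser User-Agent and compares how often the flagged term appears:

```python
import urllib.request

def count_term(html: str, term: str) -> int:
    """Count case-insensitive occurrences of a term in a page's HTML."""
    return html.lower().count(term.lower())

def fetch_as(url: str, user_agent: str) -> str:
    """Fetch a URL while presenting the given User-Agent string."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

GOOGLEBOT = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

if __name__ == "__main__":
    url = "http://www.example.com/"  # placeholder: the cleaned page WMT flagged
    for ua in (GOOGLEBOT, BROWSER):
        html = fetch_as(url, ua)
        print(ua[:20], "->", count_term(html, "pharmacy"), "hits")
```

If the Googlebot fetch shows hits that the browser fetch doesn't, the hack is still live and cloaked, and that would explain WMT's numbers.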
-
Hi John,
I agree with Martijn. A client's website suffered from a malware infection a couple of weeks ago. As a result, GWT reported a malware infection on the site, and all of its results on Google were marked with a note that the website is infected.
After I cleaned the website up, I went back to GWT, clicked on the error message, and marked all errors as fixed. This tells Google to re-check your website ASAP. Just FYI: "ASAP" here can't be pinned down precisely, as the timing is up to Google.
Anyway, I think the reason it is taking so long could be one of these:
- You still have infected files. Make sure your website is really clean!
- Your website is clean, but the crawl cycle is very long, so it takes ages until the error message disappears. Best solution here: mark your errors as fixed.
Hope this helps.
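"Make sure your website is really clean" in practice means sweeping the whole docroot, since Joomla pharma injections typically hide eval'd, base64-encoded payloads inside otherwise normal PHP files. A rough scanner sketch follows; the patterns are illustrative only, and a real audit should use a maintained signature set from a scanner service:

```python
import os
import re

# Illustrative patterns commonly seen in injected Joomla/PHP malware;
# not an exhaustive signature list.
SUSPICIOUS = re.compile(
    r"eval\s*\(\s*base64_decode|gzinflate\s*\(|preg_replace\s*\(.*/e|pharmacy|viagra",
    re.IGNORECASE,
)

def scan_tree(root: str) -> list:
    """Return (path, line_number) pairs for lines matching a suspicious pattern."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith((".php", ".html", ".js", ".htaccess")):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="replace") as fh:
                    for lineno, line in enumerate(fh, start=1):
                        if SUSPICIOUS.search(line):
                            hits.append((path, lineno))
            except OSError:
                pass  # unreadable file: skip rather than crash mid-scan
    return hits

if __name__ == "__main__":
    for path, lineno in scan_tree("."):
        print(f"{path}:{lineno}")
```

Run it from the Joomla root and review every hit by hand; a clean scan plus a clean external check is much stronger evidence than either alone.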
-
Hi John,
Overall, my experience is that Google Webmaster Tools is sometimes really slow to refresh its data, especially when your problems occurred a couple of months ago. You could try marking the items as fixed in GWT; this sometimes makes it easier to spot new errors.
Hope this helps!
Related Questions
-
Tools/Software that can crawl all image URLs in a site
Excluding Screaming Frog, what other tools/software can crawl all image URLs on a site? Screaming Frog doesn't crawl image URLs that are not under the site's domain. Example of an image URL outside the client site: http://cdn.shopify.com/images/this-is-just-a-sample.png. If the client is http://www.example.com, Screaming Frog only crawls images under it, like http://www.example.com/images/this-is-just-a-sample.png.
Technical SEO | | jayoliverwright0 -
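For a one-off audit, the extraction step itself is simple enough to script: parse each page and collect every img src, whether or not it lives on the client's domain. A minimal standard-library sketch (single page; a real crawl would also need to fetch pages and follow internal links):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class ImageCollector(HTMLParser):
    """Collect absolute <img> src URLs, including off-domain ones (e.g. a CDN)."""

    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                # urljoin keeps absolute URLs as-is and resolves relative ones
                self.images.append(urljoin(self.base_url, src))

def extract_image_urls(html: str, base_url: str) -> list:
    parser = ImageCollector(base_url)
    parser.feed(html)
    return parser.images
```

Feeding it the page HTML and the page's own URL returns both the on-domain images and the CDN-hosted ones that Screaming Frog skips.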
Why are these blackhat sites so successful?
Here's an interesting conundrum. Here are three sites with their respective rankings for "dental implants [city]": http://dentalimplantsvaughan.ca - 9 (on google.ca); http://dentalimplantsinhonoluluhi.com - 2 (on google.com); http://dentalimplantssurreybc.ca - 7 (on google.ca). These markets are not particularly competitive; however, all of these sites suffer from:
- Duplicate content, both internally and across sites (all of this company's implant sites have the exact same content, minus the bio pages and the local modifier)
- An average speed score
- No structured data
- No links
And yet these sites are ranking relatively quickly. The Vaughan site went live 3 months ago. What's boggling my mind is that they rank on the first page at all. It seems they're doing the exact opposite of what you're supposed to do, yet they rank relatively well.
Technical SEO | | nowmedia10 -
Site (Subdomain) Removal from Webmaster Tools
We have two subdomains that have been verified in Google Webmaster Tools. These subdomains were used by 3rd parties which we no longer have an affiliation with (the subdomains no longer serve a purpose). We have been receiving an error message from Google: "Googlebot can't access your site. Over the last 24 hours, Googlebot encountered 1 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 100.00%". I originally investigated using Webmaster Tools' URL Removal Tool to remove the subdomain, but there are no indexed pages. Is this a case of simply 'deleting' the site from the Manage Site tab in the Webmaster Tools interface?
Technical SEO | | Cary_PCC0 -
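Before deleting anything, it helps to confirm what Googlebot is actually hitting: if the subdomains' DNS records are gone, a 100% DNS error rate is exactly what you'd expect, and removing the sites from the Manage Sites tab just stops the reports. A quick resolution check, as a sketch (the hostnames below are placeholders, not the real subdomains):

```python
import socket

def resolves(hostname: str):
    """Return the resolved IPv4 address, or None if the DNS lookup fails."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

if __name__ == "__main__":
    # placeholders for the two verified subdomains from the question
    for host in ("old-partner.example.com", "other-partner.example.com"):
        ip = resolves(host)
        print(host, "->", ip if ip else "does not resolve (matches the GWT DNS error)")
```

If neither hostname resolves, there is nothing left for Googlebot to crawl or for the URL Removal Tool to act on.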
Webmaster Tools finding phantom 404s?
We recently (three months now!) switched a site over from .co.uk to .com, and all old URLs redirect to the new site. However, Google Webmaster Tools is flagging up hundreds of 404s from the old site, yet doesn't report where the links were found, i.e. the 'Linked From' tab has no data and the old links are not in the sitemap. SEOmoz crawls do not report any 404s. Any ideas?
Technical SEO | | Switch_Digital0 -
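With no 'Linked From' data to go on, the first thing to verify is what the flagged old URLs actually return today, since GWT can keep replaying 404s long after a fix. One way to sketch this check is to request each URL without following redirects, so a 301 is reported as a 301 rather than as the destination's 200 (the verdict strings are just illustrative labels):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we see the raw status code."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def raw_status(url: str) -> int:
    """Return the raw HTTP status of a URL, without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as err:  # urllib raises for 3xx/4xx/5xx
        return err.code

def verdict(status: int) -> str:
    if status in (301, 308):
        return "permanent redirect: fine, GWT data is just stale"
    if status in (302, 307):
        return "temporary redirect: consider switching to a 301"
    if status == 404:
        return "still 404: fix the redirect"
    return "check manually"
```

Run raw_status over the old .co.uk URLs from the GWT report; if everything comes back 301, the 404s are stale crawl data and can be marked as fixed.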
Site maintenance and crawling
Hey all, Rarely, but sometimes, we need to take down our site for server maintenance, upgrades, or various other system/network reasons. More often than not we can redirect around or eliminate the client-side downtime. We have a 'down for maintenance - be back soon' page that is client-facing, and outages are often no more than an hour tops. My question is: if the site is crawled by Bing/Google while it is down, what is the best way to ensure the indexed pages are not refreshed with this maintenance content (i.e. "this is what the pages look like now, so this is what the SE will index")? I was thinking of adding a no-crawl rule to robots.txt for the period of downtime and removing it once we're back up, but could this potentially affect results as well?
Technical SEO | | Daylan1 -
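One commonly recommended alternative to a robots.txt block (offered here as a hedged suggestion, since the thread itself contains no answer): serve the maintenance page with HTTP 503 plus a Retry-After header. A 503 signals a temporary outage, so crawlers keep the existing index entries and come back later, whereas a temporary robots.txt block can itself cause crawl errors. A minimal sketch with Python's built-in server:

```python
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

MAINTENANCE_HTML = b"<html><body><h1>Down for maintenance - be back soon</h1></body></html>"

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 = service temporarily unavailable; crawlers keep the old index entry
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # hint: retry in one hour
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(MAINTENANCE_HTML)))
        self.end_headers()
        self.wfile.write(MAINTENANCE_HTML)

    def log_message(self, *args):
        pass  # keep the console quiet during maintenance

if __name__ == "__main__":
    ThreadingHTTPServer(("0.0.0.0", 8080), MaintenanceHandler).serve_forever()
```

In a real deployment the same 503 + Retry-After behavior would normally be configured in the web server or load balancer rather than a standalone Python process; the sketch just shows the response shape.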
Will 301 redirecting a site multiple times still preserve the original site value?
Hi, All! If site www.abc.com was already 301 redirected to site www.def.com, and now the site owner wants to redirect www.def.com to www.ghi.com - is there any concern that it's not going to work, and some of the original linkjuice, rank, trust, etc. is going to vanish? Or as long as the 301s are set up right, should you be able to 301 indefinitely? Does anyone have any experience with actually doing this and seeing good/bad/neutral results? Thanks in advance! -Aviva B
Technical SEO | | debi_zyx0 -
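As a hedged aside on the mechanics: properly configured 301s in a chain do keep passing signals, but each extra hop adds risk (crawlers only follow a limited number of hops), so the usual advice is to flatten the chain so every legacy URL redirects straight to the final destination. The hop-following logic can be illustrated with a plain dict standing in for live HTTP responses; this is purely a sketch, and checking a real site needs actual requests:

```python
def follow_chain(url: str, redirect_map: dict, max_hops: int = 5) -> list:
    """Walk a url -> destination map and return the full list of hops."""
    chain = [url]
    while url in redirect_map and len(chain) <= max_hops:
        url = redirect_map[url]
        chain.append(url)
    return chain

# Hypothetical setup from the question: abc -> def -> ghi
redirects = {
    "http://www.abc.com": "http://www.def.com",
    "http://www.def.com": "http://www.ghi.com",
}
```

Here follow_chain("http://www.abc.com", redirects) shows three hops; flattening means updating the abc entry to point directly at ghi, leaving every chain one hop long.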
Why do I see dramatic differences in impressions between Google Webmaster Tools and Google Insights for Search?
Has anyone else noticed discrepancies between these tools? Take keyword A and keyword B. I've literally seen situations where A has 3 or 4 times the traffic as B in Google Webmaster Tools, but half the traffic of B in Google Insights for Search. What might be the reason for this discrepancy?
Technical SEO | | ir-seo-account0 -
Index forum sites
Hi Moz Team, somehow the last question I raised a few days ago not only wasn't answered, it was also completely deleted and the credit was not "refunded" - obviously there was some data loss involved with your restructuring. Can you check whether you can still find that question and answer it quickly? I need the answer 🙂 Here is one more question: I bought a website that has a huge forum with loads of pages of user-generated content - around 500,000 threads with 9 million comments in total. The complete forum was noindex/nofollow when I bought the site, and now I am thinking about the best way to unleash its potential. The current system is vBulletin 3.6.10. a) Shall I first update vBulletin to version 4 and use the vBSEO tool to make the URLs clean and more user- and search-engine-friendly before I switch to index/follow? b) Would you recommend keeping the forum in the folder structure or on a subdomain? As far as I know, a subdomain draws less strength from the TLD; however, it is safer because the subdomain is seen as a separate entity from the regular TLD. Having it in the folder makes it easier to pass strength from the TLD to the forum, but it puts my TLD at risk. c) Would you release all forum pages at once or section by section? I think section by section looks rather unnatural, not only to search engines but also to users; however, I am afraid of blasting more than a million pages into the index at once. d) Would you index only the first page of a thread or all pages of a thread? I fear duplicate content, as the different pages of a thread contain different body content but the same title and possibly the same h1. Looking forward to hearing from you soon! Best Fabian
Technical SEO | | fabiank0