How do you deal with comment spam on WordPress?
-
I have Akismet installed on my WordPress blog, and it does a great job of filtering spam comments, but my site (and server) still gets slammed by the sheer volume of spam Akismet blocks. If I check my spam folder there will be over 100 spam comments in an hour, which in turn puts a load on my server.
Does anyone have any thoughts on how to put a stop to this? (Or at least slow it down?) I know I could use a captcha, but I really don't want to put any barriers in the way of people commenting, and I don't even like using those captchas myself.
Thoughts?
By the way, does anyone know how spam like this works? This has been going on for some time now. Are spammers just using automated software to do this?
-
By far the best site availability monitoring tool I can recommend is Pingdom.
Signing up for an account is free to monitor one website. You can have it email you or send a text message/tweet when your site goes down. You can also configure how long your site must be out before you get alerted, and how often to be alerted while your site is still down.
Indispensable for understanding what's actually going on with your site.
Paul
P.S. Use the customizations when setting up the monitor so it's actually checking for the existence of a particular word on your page - that way you're testing whether your site is actually rendering, as opposed to just responding to a ping.
-
Thanks for the reply, very helpful info.
As far as server monitoring I don't think I have anything in place. Any suggestions?
-
Sorry - guess I should have made that clearer, Rick. There will be a definite reduction in server resources used. The comment still gets partially processed in order to send it to Akismet, but that new setting tells your system to just discard it if it comes back marked as spam. That way, no database writes occur for that spam, which will definitely reduce server load (database reads and writes are fairly "expensive" in terms of added server processing needed).
Without that setting, spam comments that come back from Akismet get written to your database, flagged as spam. That's a lot of extra processing for something you were going to throw out anyway.
This won't save as many resources as actually blocking the spam before it even starts to get processed (as the other suggested plugin would do), but you should notice lowered demand on your server resources with this setting. Not to mention a whole lot less crap to clean out every day, as you point out.
Paul
P.S. One side effect of that setting is that you won't be quite as aware of just how much spam you're actually getting, since you won't see most of it. This means a spam run against some older posts could start hitting server resources hard without you realizing it. (Remember, this setting doesn't eliminate the processing demands completely.)
So keep an eye on the stat that shows how many spams Akismet has handled. If you see a prolonged surge, and/or have further server load problems, it will be a signal that more drastic protection methods have become necessary.
Do you have a server monitoring/alerting system in place?
-
Thanks for your very helpful post! It was great!
I never thought of selecting the option to auto-delete spam comments on posts older than a month. Once I did that, it cut out 80% of the spam I was getting! So thanks!
Quick question on that. Does enabling that option cut down on server resources? In other words, say it auto-deletes 200 spam comments a day - do those 200 spam comments still get entered as comments and therefore use server resources, so this just saves me the step of going through and clearing the spam folder? Or does this save a huge amount of server resources? Either way it's a huge win!
-
Just wondering if these responses helped answer your question, Rick?
If not, what else might you need clarified that we may be able to help with?
Paul
-
Ahh... comment spam - the bane of every successful website with an active blog. It's actually a signal of your success that you're getting that much spam.
I fully agree though - captcha is NEVER the answer if you want to maintain high visitor engagement. You shouldn't be offloading your spam problem onto your visitors to solve. There are better options.
So let's dive in.
How the Spam Gets Generated
There are two types of comment spam: bot-generated and manual. The first is created by software "bots" programmed to crawl the web looking for the scripts on a website that allow content submission, e.g. comment forms, contact forms, etc. The software then accesses the script directly and submits its crapload. WordPress (and other CMSs) are especially vulnerable because these scripts have the same names on every single install - the bot only has to look for a few very specific filenames in a few standard places.
Because this is two pieces of software talking directly to each other, hundreds or even thousands of submissions per hour can be generated. The bots generally have no limits on them, so eventually they'll consume so many server resources that they degrade or even exhaust the server's ability to do the rest of its job. (Bot-generated spam is considered to be at least 65% of all spam.)
With manual spam, an actual human in a very cheap labour market is paid to go through the posts on a website and manually enter the crapload, entering whatever info into the fields is necessary to make the comment system think it's a legit human-generated comment.
Filtering vs Blocking
The problem with Akismet is that it is a spam filtering tool, not a spam blocking tool. Each comment is allowed to enter the blog system where it is then sent to Akismet's server to be assessed. Akismet then sends it back to your site flagged to go into your spam, moderation, or publication queue. This means each spam message receives the same processing as legit comments, so the system is still using processing and database resources for every single message received. (Even spam gets written to the database and stays there until you decide it should be deleted.)
All very processing-intensive - which is why having Akismet doesn't do anything to reduce the server load of a spam run, and may even increase it slightly.
Optimize Akismet's Settings
So what to do? First, there's a simple checkbox in Akismet's settings that can make a huge difference. You can tell Akismet that if it flags a comment as spam on a post that's more than a month old, it should just automatically discard it instead of adding it to the spam queue and writing it to the database. This greatly reduces the database activity created by the spam, and also keeps your spam queue clearer so it's easier to recognize legit comments from more recent posts that might have been caught. (Spammers tend to focus on older posts for a number of reasons - mostly because they're easier to find.) The clear disadvantage is that the (very) few comments falsely identified as spam will be irretrievably gone. I know this could be an issue for you, as many of your posts continue to get comments for months afterward, but if you're clearing 100s of comments an hour, chances are that some legit comments are accidentally getting deleted already.
To enable the automatic discard function, simply go to the Akismet Configuration page under Plugins (where your WordPress.com API key is entered). At the bottom of the page, check the box for Automatically discard spam comments on posts older than a month. Remember to click the Update options button when done.
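The logic of that setting boils down to one extra branch in the comment-handling path. Here's a language-agnostic sketch (this is not Akismet's actual code - the function and return values are made up to illustrate the flow; the 30-day cutoff mirrors the "older than a month" option):

```python
from datetime import datetime, timedelta

# Mirrors the "older than a month" checkbox
DISCARD_AGE = timedelta(days=30)

def handle_comment(is_spam, post_date, now):
    """Spam on old posts is dropped outright, so it never touches the
    database; everything else follows the normal storage path."""
    if is_spam:
        if now - post_date > DISCARD_AGE:
            return "discard"      # no DB write at all
        return "spam_queue"       # stored and reviewable, but still a DB write
    return "moderate_or_publish"  # normal comment handling
```

The resource savings come entirely from the "discard" branch: the comment is still partially processed (it has to be sent to Akismet to be classified at all), but the expensive database write never happens.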
Stronger Protection
If you need more protection, you're going to need to install a plugin that intercepts the comments before they get into the system and automatically discards the ones that show the characteristics of bot-submission behaviour. Essentially the plugin analyzes how the comment was posted, rather than its content.
The best-known of these is Bad Behavior, but it's a pretty heavy-handed solution that has been known to block even Googlebot, causing many pages to be deindexed. I'd call it a last-ditch solution.
I'd suggest you try WP Captcha-Free, a small, very lightweight plugin that invisibly creates a "hash" when the comment form is rendered, which must also be present when the comment is submitted. Most spam bots fail this check since they're submitting directly to the comment script, and so are blocked before the comment really starts processing. Note that commenters must have JavaScript enabled for this system to work. Since only 2-5% of web users don't have JS enabled, this is a reasonable tradeoff (and much better than pissing off 100% of your commenters by enforcing a captcha).
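The "hash" trick is easy to picture: the page embeds a token computed by script at render time, and the comment handler rejects any submission that lacks a valid one. A bot POSTing straight to the comment script never runs the page's JavaScript, so it never obtains the token. A rough sketch of the server side of such a scheme (the secret, the 5-minute window, and the function names are all made up for illustration - this is not WP Captcha-Free's actual implementation):

```python
import hmac
import hashlib
import time

SECRET = b"example-secret"  # hypothetical server-side secret key

def issue_token(now=None):
    """Token embedded in the page by script; tied to a 5-minute time bucket."""
    now = int(now if now is not None else time.time())
    bucket = now // 300
    return hmac.new(SECRET, str(bucket).encode(), hashlib.sha256).hexdigest()

def verify_token(token, now=None):
    """Accept the current and previous bucket, to allow for slow form fills."""
    now = int(now if now is not None else time.time())
    for bucket in (now // 300, now // 300 - 1):
        expected = hmac.new(SECRET, str(bucket).encode(), hashlib.sha256).hexdigest()
        if hmac.compare_digest(token, expected):
            return True
    return False
```

A bot that submits without the token (or with a stale one) is rejected before any spam-filtering or database work happens - which is exactly the blocking-not-filtering behaviour you want.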
You will still want Akismet active behind this protection to catch the manually-submitted spam.
So to recap - in order to reduce your server load from spam, you need a system that BLOCKS the spam before it starts to get processed in the first place. Just doing more/better FILTERING won't help as the filtering process actually uses up even more server power. You want your server only processing what is likely to be real comments.
Sorry for the loooong reply but comment spam is a big/complicated issue and if it's approached incorrectly, you can make your problem much worse instead of better.
Fire away with the questions
Paul
-
Your blog will have a 'hook' where scripts can automatically insert comments into your site. Check your server logs - you'll probably see one form or another getting hit, a lot, or a script like xmlrpc.php.
If it's a form, add CAPTCHA, and that'll stop the scripts from auto-submitting.
If it's something else, consider changing permissions so the whole world can't hit it.
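Finding that hook in your access logs only takes a few lines of scripting: count which paths are receiving POSTs and the spam entry point usually jumps out. A sketch (the log lines here are fabricated common-log-format examples; a real log may need a slightly different pattern):

```python
import re
from collections import Counter

# Fabricated access-log excerpt in common log format
SAMPLE_LOG = """\
1.2.3.4 - - [10/May/2011:13:55:36 +0000] "POST /wp-comments-post.php HTTP/1.1" 200 512
1.2.3.4 - - [10/May/2011:13:55:41 +0000] "POST /wp-comments-post.php HTTP/1.1" 200 512
5.6.7.8 - - [10/May/2011:13:56:02 +0000] "POST /xmlrpc.php HTTP/1.1" 200 370
9.9.9.9 - - [10/May/2011:13:57:19 +0000] "GET /about/ HTTP/1.1" 200 8140
"""

def top_post_targets(log_text):
    """Count which paths receive POST requests; heavy hitters are
    usually the scripts the spam bots have found."""
    pattern = re.compile(r'"POST (\S+) HTTP')
    return Counter(pattern.findall(log_text)).most_common()

print(top_post_targets(SAMPLE_LOG))
# [('/wp-comments-post.php', 2), ('/xmlrpc.php', 1)]
```

Run the same count against a full day's log and compare the POST volume to your actual comment volume - a huge gap tells you bots are hammering the script directly.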
-
You may want to check your Google Analytics to see if this started all at once. Your site might be under a DDoS attack that your server is (so far) holding up against - that sounds like an awful lot of blocking in one hour. Sometimes the ISP has to step in if you host the server in-house.
-
Is your blog self-hosted or hosted by WordPress.com? If it's the one in your profile, then it's hosted by WordPress.com, which would limit your options but then probably shouldn't have server "stress" issues. If it's self-hosted, then you can try some of these plugins and see if they help (http://wordpress.org/extend/plugins/search.php?q=spam). Depending on the software being used to spam your blog, at least one of these should be able to reduce the problem. So yes, it's automated tools that do all this comment spam.