750,000 pv/month due to webspam. What to do?
-
Let's say your user-generated content strategy is wildly successful, in a slightly twisted sense: webspammers fill it with online streaming sports teasers and the promise of "Weeds season 7 episode 11." Thanks to the hard SEO work done to build the domain's profile, these webspam pages rank well in Google, delivering nearly 750,000 pageviews (and many, many unique visitors) to the site every month.
The ad-sales team loves the traffic boost. Overall traffic, uniques, and search numbers look rosy.
What do you do?
a) let it ride
b) throw away roughly half your search traffic overnight by deleting all the spam and tightening the controls to prevent spammers from continuing to abuse the site
There are middle-ground solutions, like applying NOINDEX more liberally to UGC pages, but the end result is the same as option (b), even if it takes longer to get there.
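For what it's worth, here is a minimal sketch of that middle-ground approach; it assumes a Python/Flask app and a hypothetical is_user_generated() check based on URL path, and a "meta name=robots content=noindex" tag in the UGC page template achieves the same thing:

```python
# A minimal sketch, not a prescription: assumes a Flask app and a
# hypothetical is_user_generated() convention based on the URL path.
from flask import Flask, request

app = Flask(__name__)

def is_user_generated(path):
    # Hypothetical convention: all UGC lives under /ugc/.
    return path.startswith("/ugc/")

@app.after_request
def noindex_ugc(response):
    # "X-Robots-Tag: noindex" asks search engines to drop the page from
    # their index while leaving it fully available to human visitors.
    if is_user_generated(request.path):
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```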
-
You seem to have a clear understanding of the situation, and you are making a conscious choice to continue with your current business practices. That makes sense.
You have a monetary incentive to capture as much traffic as possible because of advertising revenue. As EGOL suggested, I believe the best-paying advertisers will recognize your traffic as low quality and either choose not to advertise on your site or pay substantially less than they would for a similar ad on a better site.
You also run the risk of losing many users: humans don't like spam sites and will leave them for better ones. Additionally, Panda updates will surely make it harder for your site to rank on its legitimate content.
Feel free to disregard this advice, but I predict that at some point in the not-too-distant future you will lose your advertisers or your traffic. The amount of effort you spend trying to win either back will ensure you never travel down this path again.
-
Ryan - not half the site's traffic, but half the site's search traffic. And even that is an exaggeration: webspam search traffic accounts for 28% of overall search traffic.
EGOL - I would say no to the question of robot visitors, because in the instances we checked (where spammers used a bit.ly URL for their outbound link) we measured an astounding 47% clickthrough rate from our site to the spam destination. I would not expect bots to click through.
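The arithmetic behind that figure is simple; a sketch with hypothetical stand-in counts (not our real numbers) looks like this:

```python
# Illustrative only: the counts below are hypothetical stand-ins.
# bit.ly reports total clicks on the shortened link; dividing by the
# pageviews of the spam pages hosting it gives the clickthrough rate.
spam_pageviews = 10_000  # pageviews on the spam pages carrying the link
bitly_clicks = 4_700     # clicks reported by bit.ly for that link

ctr = bitly_clicks / spam_pageviews
print(f"Measured clickthrough rate: {ctr:.0%}")  # -> 47%
```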
Also, we use nofollow on all outbound links in user-generated content. I guess that is not a guarantee that we would not be penalized for hosting a linkfarm, but shouldn't it be?
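For anyone curious how that enforcement might look in code, here is a minimal sketch using BeautifulSoup; SITE_HOST and the "different host means outbound" test are illustrative assumptions, not our actual implementation:

```python
# A minimal sketch of forcing rel="nofollow" onto outbound links in
# user-generated HTML before it is stored or rendered.
from urllib.parse import urlparse
from bs4 import BeautifulSoup

SITE_HOST = "example.com"  # hypothetical: your own domain

def nofollow_outbound(html):
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        # Treat any link pointing at a different host as outbound.
        if host and host != SITE_HOST:
            # Preserve any rel values the author already set.
            rel = set(a.get("rel") or [])
            rel.add("nofollow")
            a["rel"] = sorted(rel)
    return str(soup)

# Example: the link below gains rel="nofollow".
print(nofollow_outbound('<a href="http://spam.example/x">watch free</a>'))
```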
If it were up to me, I'd wipe out the webspam entirely, but it's not an easy sell. This content delivers ~750,000 pageviews, ~150k ad views, and probably 100k unique visitors per month, against only the small risk that one day Google might penalize us for it. It's not pills, porn, gambling, or mortgages, and all the links are nofollowed. The people making this decision don't see a smoking gun.
-
I have two concerns...
Are you getting a lot of robot visitors instead of human visitors? If you are getting lots of robots, those visits will not be valuable to your advertisers, and they will eventually stop paying to appear on your site. The best advertisers are really smart about this.
Are these sports teaser posts accompanied by links to other websites? If so, I would cut them off right away, because they are probably making you a linkfarm for spammy websites.
-
The problem you face is that by allowing spam, you make your real users unhappy. Your main site visitors may leave for another, spam-free site. It is likely you have already permanently lost some traffic due to the spam.
Presently you describe your site as 50% spam traffic, 50% real traffic. Two things will likely happen over time. First, Google will recognize your site as spammy and penalize it in some form. Second, your users will become unhappy with your site and the ratio of visitors will shift further toward spam traffic. Once that happens, I anticipate a fast decline.
I suggest option (b) as being in your best interest for the long-term benefit of your site.