750,000 pv/month due to webspam. What to do?
-
Let's say your user-generated content strategy is wildly successful, in a slightly twisted sense: webspammers fill it with streaming-sports teasers and promises of "Weeds season 7 episode 11." Thanks to the hard SEO work done to build the domain's profile, these webspam pages rank well in Google and deliver nearly 750k pageviews, and many unique visitors, to the site every month.
The ad-sales team loves the traffic boost. Overall traffic, uniques, and search numbers look rosy.
What do you do?
a) let it ride
b) throw away roughly half your search traffic overnight by deleting all the spam and tightening the controls to prevent spammers from continuing to abuse the site
There are middle-ground solutions, like using NOINDEX more liberally on UGC pages, but the end result is the same as option (b) even if it takes longer to get there.
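As a rough illustration of that middle-ground approach, a UGC page template could emit a noindex robots meta tag whenever the content trips simple spam heuristics. The patterns and function name below are hypothetical, a minimal sketch of the idea rather than production-grade spam detection:

```python
import re

# Hypothetical spam signals; real heuristics would be tuned to the site's
# actual spam corpus, not two hard-coded patterns.
SPAM_PATTERNS = [
    re.compile(r"\bwatch .* online (free|stream)", re.I),
    re.compile(r"\bseason \d+ episode \d+\b", re.I),
]

def robots_meta_for(page_text: str) -> str:
    """Return the robots meta tag a UGC page template should render."""
    if any(p.search(page_text) for p in SPAM_PATTERNS):
        # Keep the page usable for visitors but drop it from search indexes.
        return '<meta name="robots" content="noindex, nofollow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_for("Weeds season 7 episode 11 watch online free"))
```

The appeal of this route is that nothing is deleted; the spam pages simply stop earning search traffic, which removes the spammers' incentive over time.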
-
You seem to have a clear understanding of the situation. You are making the conscious choice to continue with your current business practices. It makes sense.
You have a monetary incentive to capture as much traffic as possible because of the advertising revenue. As EGOL suggested, I believe the best-paying advertisers will recognize your traffic as low quality and either choose not to advertise on your site or pay substantially less than they would for a similar ad on a better site.
You also run the risk of losing many users. Humans don't like spam sites and will leave them for better ones. Additionally, Panda updates will surely make it harder for your site to rank on its legitimate content.
Feel free to disregard this advice, but I predict that at some point in the not-too-distant future you will lose either your advertisers or your traffic. The amount of effort you spend trying to win them back will ensure you never travel down this path again.
-
Ryan - not half the site's traffic, but half the site's search traffic. And even that is an exaggeration. Webspam search traffic accounts for 28% of overall search traffic.
EGOL - I would say no to the question of robot visitors, because on the instances we checked -- in which spammers used a bit.ly URL for their outbound link -- we were able to measure an astounding 47% clickthrough rate from our site to the spam destination. I would not expect bots to click through.
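The 47% figure above is just outbound clicks divided by pageviews on the spam pages; bit.ly publishes click counts per short link, so they can be compared against the host page's own analytics. A sketch with illustrative numbers (the function name and figures are mine, not from the thread):

```python
def clickthrough_rate(outbound_clicks: int, pageviews: int) -> float:
    """Fraction of pageviews that resulted in a click on the outbound link."""
    if pageviews == 0:
        return 0.0
    return outbound_clicks / pageviews

# Illustrative numbers only: compare the bit.ly click count for the
# spammer's short link against pageviews for the page hosting it.
rate = clickthrough_rate(4700, 10000)
print(f"{rate:.0%}")  # 47%
```

A rate this high is strong evidence of human visitors, since crawlers and scrapers rarely follow outbound links at anything like that rate.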
Also, we use nofollow on all outbound links in user-generated content. I guess that is not a guarantee that we would not be penalized for hosting a linkfarm, but shouldn't it be?
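Enforcing nofollow on every outbound UGC link is typically done at render or sanitization time. A minimal sketch of the idea (the helper name is mine, and real UGC pipelines should use a proper HTML sanitizer rather than regex, but the policy it implements is the same):

```python
import re

ANCHOR = re.compile(r'<a\s+([^>]*?)>', re.I)

def nofollow_links(html: str) -> str:
    """Add rel="nofollow" to anchors that don't already carry a rel attribute.

    A deliberate simplification: regex-based HTML rewriting is fragile,
    but it shows where the attribute gets injected.
    """
    def fix(match: re.Match) -> str:
        attrs = match.group(1)
        if re.search(r'\brel\s*=', attrs, re.I):
            return match.group(0)  # leave existing rel values alone
        return f'<a {attrs} rel="nofollow">'
    return ANCHOR.sub(fix, html)

print(nofollow_links('<a href="http://spam.example/">free stream</a>'))
```

Applying this at output time, rather than trusting stored content, guarantees the policy holds even for posts that predate it.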
If it were up to me, I'd wipe out the webspam entirely. But it's not an easy sell. This content delivers ~750,000 pageviews, ~150k ad views, and probably 100k unique visitors per month; against that stands the small risk that one day Google might penalize us for it. It's not pills, porn, gambling, or mortgages, and all the links are nofollowed. The people making this decision don't see a smoking gun.
-
I have two concerns....
Are you getting a lot of robot visitors instead of human visitors? If you are getting lots of robots then those visits will not be valuable to your advertisers and they will eventually stop paying to appear on your site. The best advertisers are really smart about this.
Are these sports teaser posts accompanied by links to other websites? If so, I would cut them off right away, because they are probably turning your site into a linkfarm for spammy websites.
-
The problem you face is that by allowing spam, your real users will be unhappy. Your main visitors may leave for another, spam-free site, and it is likely you have already permanently lost some traffic because of the spam.
At present you describe your site as 50% spam traffic, 50% real traffic. Two things will likely happen over time: Google will recognize that your site is spammy and penalize it in some form, and your users will grow unhappy, shifting your visitor mix even further toward spam. Once that happens, I anticipate a fast decline.
I suggest option (b) as being in the long-term best interest of your site.