Yet another Negative SEO attack question.
-
I need help reconciling two points of view on spammy links.
On one hand, Google seems to say, "Don't build spammy links to your website - it will hurt your ranking." Of course, we've seen the consequences of this in the Penguin update, where sites that built bad links got whacked.
After the Penguin update, there was a lot of speculation about negative SEO attacks. In response, Google has said, "We're smart enough to detect a negative SEO attack," e.g.: http://youtu.be/HWJUU-g5U_I
So, it seems like Google is saying, "Build spammy links to your own website in an attempt to game rank, and you'll be penalized; build spammy links to a competitor's website, and we'll detect it and not let it hurt them."
Well, to me, it doesn't seem like Google can have it both ways, can they? Really, I don't understand why Competitor A doesn't just go to Fiverr and buy a boatload of crappy exact-match anchor links to Competitor B in an attempt to hurt them. Sure, Competitor B can disavow those links, but that still takes time and effort. Furthermore, for an unsophisticated webmaster, the analysis required could be daunting.
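For context on what "disavow those links" actually involves: Google's disavow tool takes a plain-text file of domains and URLs you want ignored. A minimal sketch (the domains and URLs here are hypothetical, for illustration only):

```text
# Spammy exact-match anchor links, site owner unresponsive to removal requests
domain:spammy-links-example.com
domain:linkfarm-example.net

# A single bad URL rather than a whole domain
http://blog-network-example.org/cheap-widgets-post/
```

The file itself is trivial; the hard part is the analysis needed to decide which links belong in it.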
Your thoughts here? Can Google have their cake and eat it too?
-
If it can be proven that the intention was to cause harm to another company's profits, I would think you could be held liable. There is enough documentation on the web to show that Google penalizes for bad links and that negative SEO exists. If there is proof that you were doing what Google tells you not to do against your competition, and it results in the penalty that Google says will happen, it seems like bad intent could be established, and in that case you could be found liable in a court of law. I am not aware of any precedents, though.
-
Thanks, your reply helps keep this in perspective.
"If it is proven that you created these links, my guess would be that you could be held liable in court."
This would be another interesting tangent discussion. Of course, the defense would be the First Amendment right to freedom of publication. In my feeble knowledge, I'm not aware of a court case that has encountered this issue, but it's an interesting legal question: could you be held civilly liable for merely publishing links?
-
I completely agree with your comments, Steve. Especially when it comes to a niche that is seasonal and has only a couple of big companies: if you can knock out a competitor during their busiest month of the year, you've done major damage to them and benefited yourself greatly. It's a horrible, shady practice, and even though Google initiated the penalty, if it is proven that you created these links, my guess would be that you could be held liable in court.
-
"Why is Competitor A spending their time and money trying to harm Competitor B when they can simply protect themselves with the Disavow Tool? Why not spend that time and money on building quality links?"
Buying links on Fiverr = $5 and five minutes.
Disavowing links = a couple of hours of analysis or paying someone a bit of cash for the analysis.
So, it's easier to create the havoc than to clean it up. I'm sure we're all on the same page that such a technique isn't ethical, doesn't help you build up your business, is bad business karma, and so on. But is it feasible? Apparently so. Especially for commerce sites, where the stakes are high, it seems like this could become a tempting strategy for the less ethically inclined.
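To make the "couple of hours of analysis" concrete: the usual workflow is to export your backlinks (from Webmaster Tools, Open Site Explorer, Ahrefs, etc.), flag links whose anchor text exactly matches your money keywords (the classic footprint of bought links), and emit disavow-file lines. A minimal sketch, assuming a hypothetical CSV export with `url` and `anchor` columns and an illustrative keyword list:

```python
import csv
import io
from urllib.parse import urlparse

# Hypothetical money keywords; exact-match anchors on low-quality
# domains are the typical signature of cheap bought links.
SUSPECT_ANCHORS = {"cheap widgets", "buy widgets online"}

def build_disavow(backlink_csv: str) -> str:
    """Read a backlink export and return disavow-file lines,
    one `domain:` entry per flagged linking domain."""
    flagged = set()
    for row in csv.DictReader(io.StringIO(backlink_csv)):
        if row["anchor"].strip().lower() in SUSPECT_ANCHORS:
            flagged.add(urlparse(row["url"]).netloc)
    return "\n".join(f"domain:{d}" for d in sorted(flagged))

sample = """url,anchor
http://spamblog-example.com/post1,cheap widgets
http://goodsite-example.org/review,Acme Widgets review
http://linkfarm-example.net/page,buy widgets online
"""
print(build_disavow(sample))
# → domain:linkfarm-example.net
#   domain:spamblog-example.com
```

A real audit would also weigh domain quality, link velocity, and whether the links predate the suspected attack, which is exactly why it takes hours rather than minutes.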
-
There is no way that Google can know (unless you are intentionally transparent about it) whether those links were built by someone you paid or by someone a competitor paid. Negative SEO is very real, but it takes time and money to get a site penalized, and it's now easier than ever to disavow links and get a site back, which takes some of the punch out of the negative SEO business.
-
Hi Steve,
I think I see your point. However, if Competitor A buys low-quality links to Competitor B, then yes, Competitor B can use the Disavow Tool to neutralize those links, and it will still take them time and effort to do so. But what is the point? Why is Competitor A spending their time and money trying to harm Competitor B when they can simply protect themselves with the Disavow Tool? Why not spend that time and money on building quality links?
Competitor A is simply wasting time and money buying links, while Competitor B spends time and effort removing them. I don't see why anyone would do that.