Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Negative SEO - Case Studies Prove Results. De-rank your competitors
-
Reading these two articles made me feel sick. People are actually offering a service to de-rank a website.
I could have sworn I heard Matt Cutts say this was not possible; well, the results are in.
This really opens up a whole new can of worms for Google.
http://trafficplanet.com/topic/2369-case-study-negative-seo-results/
http://trafficplanet.com/topic/2372-successful-negative-seo-case-study/
This is only going to get worse as news like this spreads like wildfire. In one sense, it's good these people have done this to prove it to Google; it's just a pity they did it on real businesses that rely on traffic.
-
Yeah I know, I read the full forum thread... I'm not sure what you mean?
Edit: Do you mean your post was not about the actual site it affected but more about negative SEO in general? If so, we understand that; that's why we said it was scary. We were just pointing out the other issues in the forum thread.
-
It's nothing to do with the site in question. It's purely negative SEO. Read the case study about Dan Thies, who says he is the SEO to the gurus and knows Matt Cutts on a personal basis. They targeted his website and knocked it off the rankings in about a week. He was ranked on page 1 for "SEO" and "SEO services"; now he is nowhere to be found.
-
Oh absolutely, the backlinks generated from the story will make it worthwhile. Not sure if it was intended, but it will benefit the site in the long run, I'm sure. Also, they did state a few times that his website was non-profit, so it certainly won't hurt his lifestyle.
Yeah, I think that thread will be closely watched by big G... Probably not something to boast about on an online forum. However, he definitely knows his stuff, so fair play to him; he has thought about it in great detail.
-
On reflection, this is the kind of story that is going to gain some momentum in the business/marketing world and will probably generate a bucketload of nice authority links for the site... could this be a double bluff?
Either way, Mr Negative SEO might have undone all his hard work simply by boasting about his exploits! I hope so.
-
Definitely. I think that example was way OTT and not needed for most websites. It looks like it would be pretty easy to pull off against small businesses with smaller websites.
Unfortunately there seem to be plenty of to55ers around. It worries me that some people have no remorse... It could mean jobs lost and lives destroyed! Shame on Google!
-
Agreed Matt, that is scary. Though a year's worth of work to deindex one competitor seems like a lot of hard work for little gain.
I'd rather spend the year doing something positive and not being a complete git.
After looking over the site in question, it may not be down to the negative SEO; there's an awful lot that could be improved on that site.
-
Scary stuff indeed...
I don't care how effective it may or may not be; I could never do this to a competitor... My motto would always be to beat them on merit, not by cheating.
I have no doubt that this currently works, but I also have no doubt that Google will close the loophole and, instead of devaluing a website for "un-natural links", will simply ignore those links. It is only a matter of time.
Let's cross our fingers that we don't have immoral competitors until that time comes.
Related Questions
-
Excluding Googlebot From AB Test - Acceptable Sample Size To Negate Cloaking Risk?
My company uses a proprietary AB testing platform. We are testing out an entirely new experience on our product pages, but it is not optimized for SEO. The testing framework will not show the challenger recipe to search bots. With that being said, to avoid any risks of cloaking, what is an acceptable sample size (or percentage) of traffic to funnel into this test?
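Whatever sample size is chosen, the cloaking-sensitive part is usually the routing rule itself: bots must always land in the control bucket so that crawled and indexed content matches what the test serves as the default. A minimal sketch of such a rule, assuming a hypothetical testing framework (the bot list and the `isSearchBot`/`assignRecipe` names are illustrative, not part of any real platform):

```javascript
// Route known search bots to the control recipe and split a share of
// real users into the challenger. Bots never see the challenger, so the
// indexed page always matches the control experience.
const BOT_PATTERN = /googlebot|bingbot|duckduckbot|baiduspider|yandexbot/i;

function isSearchBot(userAgent) {
  return BOT_PATTERN.test(userAgent || "");
}

function assignRecipe(userAgent, randomValue, challengerShare) {
  // randomValue is a uniform draw in [0, 1); challengerShare is the
  // fraction of human traffic funneled into the test.
  if (isSearchBot(userAgent)) return "control";
  return randomValue < challengerShare ? "challenger" : "control";
}
```

User-agent matching is a heuristic; a stricter setup would also verify bot IPs via reverse DNS rather than trusting the header alone.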
White Hat / Black Hat SEO | edmundsseo
-
Can the disavow tool INCREASE rankings?
Hi Mozzers, I have a new client who has some bad links in their profile that are spammy and should be disavowed. They rank on the first page for some longer tail keywords. However, we're aiming at shorter, well-known keywords where they aren't ranking. Will the disavow tool, alone, have the ability to increase rankings (assuming on-site / off-site signals are better than competition)? Thanks, Cole
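For reference, the disavow file itself is just a plain-text list uploaded through Google Search Console: one URL per line, a `domain:` prefix to disavow an entire domain, and `#` for comments. The domains below are placeholder examples, not real spam sources:

```text
# Spammy links found in the new client's profile (example entries only)
# Lines starting with # are ignored by Google.
domain:spammy-directory-example.com
domain:link-farm-example.net
http://blog-example.org/low-quality-post-with-link.html
```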
White Hat / Black Hat SEO | ColeLusby
-
Why do expired domains still work for SEO?
Hi everyone, I've been doing an experiment for more than a year to see whether it's possible to use expired domains. I know it's considered black hat, but like I said, I wanted to experiment; that is what SEO is about.

What I did was buy domains that had just expired, immediately add content on a WP setup, fill it with content relevant to the expired domain, and then start building links from these domains to other relevant sites. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains ) This is nothing new, and SEOs have been doing it for a long time.

There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any signs that the expired domains or the sites I link to have been punished by Google. The sites I'm linking to rank great ONLY with those links 🙂

So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links from that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | Sir
-
Black Hat SEO Case Study - Private Link Network - How is this still working?
I have been studying my competitors' link building strategies, and one guy (an affiliate) in particular really caught my attention. He has been using a strategy that has been working really well for the past six months or so. How well? He owns about 80% of search results for highly competitive keywords, in multiple industries, that add up to about 200,000 searches per month in total.

As far as I can tell, it's a private link network. Using Ahrefs and Open Site Explorer, I found out that he owns thousands of bought domains, all linking to his sites. Recently, all he's been doing is essentially buying high-PR domains, redesigning the site, and adding new content to rank for his keywords. I reported his link-wheel scheme to Google and posted a message on the webmaster forum; no luck there.

So I'm wondering: how is he getting away with this? Isn't Google's algorithm sophisticated enough to catch something as obvious as this? Everyone preaches about white hat SEO, but how can honest marketers/SEOs compete with guys like him? Any thoughts would be very helpful. I can include some of the reports I've gathered if anyone is interested in studying this further. Thanks!
White Hat / Black Hat SEO | howardd
-
Asynchronous loading of product prices bad for SEO?
We are currently looking into improving our TTFB on our ecommerce site. A huge improvement would be to asynchronously load the product prices on the product list pages. The product detail page, on which the product is ordered, will be left untouched.

The idea is that all content like product data, images, and other static content is sent to the browser first (first byte). The product prices depend on a set of user variables like delivery location, VAT inclusive/exclusive, etc., so they would be requested via an ajax call to reduce the TTFB. My question is whether Google considers this black hat SEO or not.
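A minimal sketch of the split described above, assuming a hypothetical batched `/api/prices` endpoint (the endpoint, parameter names, and helper functions are illustrative, not a real API):

```javascript
// Build the batched price request for the product ids rendered on the
// static page. The user-specific variables (delivery country, VAT mode)
// travel as query parameters so the HTML itself stays cacheable.
function buildPriceUrl(productIds, options) {
  const params = new URLSearchParams({
    ids: productIds.join(","),
    country: options.country, // delivery location affects the price
    vat: options.vatInclusive ? "incl" : "excl",
  });
  return `/api/prices?${params}`;
}

// Merge the price response back onto the product records; price stays
// null for any product the ajax call did not return a value for.
function applyPrices(products, priceById) {
  return products.map((p) => ({
    ...p,
    price: priceById[p.id] ?? null,
  }));
}
```

In the browser, `buildPriceUrl` would feed a `fetch()` call fired after the initial HTML renders, and the response would be written into the page via `applyPrices` (or directly into the DOM placeholders).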
White Hat / Black Hat SEO | jef2220
-
A site is using their competitors' names in their Meta Keywords and Descriptions
I can't imagine this is a White Hat SEO technique, but they don't seem to be punished for it by Google - yet. How does Google treat the use of your competitors names in your meta keywords/descriptions? Is it a good idea?
White Hat / Black Hat SEO | PeterConnor
-
Recovering From Black Hat SEO Tactics
A client recently engaged my services to deliver foundational white hat SEO. Upon site audit, I discovered a tremendous amount of black hat SEO tactics employed by their former SEO company. I'm concerned that the efforts of the old company, including forum spamming, irrelevant backlink development, exploiting code vulnerabilities on BBs, and other messy practices, could negatively influence the target site's campaigns for years to come.

The site owner handed over hundreds of pages of paperwork from the old company detailing their black hat SEO efforts. The sheer amount of data is insurmountable. I took just one week of reports and tracked back the links to find that 10% of the accounts were banned, 20% were tagged as abusive, some of the sites were shut down completely, and there were WOT reports of abusive practices and mentions on BB control programs of blacklisting for the site.

My question is simple: how does one mitigate the negative effects of old black hat SEO efforts and move forward with white hat solutions when faced with hundreds of hours of black gunk to clean up? Is there a clean way to eliminate the old efforts without contacting every site administrator and requesting removal of content/profiles? This seems daunting, but my client is a wonderful person who got in over her head, paying for a service that she did not understand. I'd really like to help her succeed.

Craig Cook
White Hat / Black Hat SEO | SEOptPro
http://seoptimization.pro
info@seoptimization.pro
-
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz Community,

Our portfolio of around 15 internationalized web pages received a significant, seemingly IP-wide, Google penalty starting November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and are now hoping the SEOmoz community can give us some further tips on what else we can try.

As quick background information:

- The sites in question offer sports results data and are translated into several languages. Each market (i.e. language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.

As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had been ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.

Our questions are:

- Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
- What is the most likely cause of our penalty given the background information? Given that the drops started already in November 2010, we doubt the Panda updates had any correlation to this issue.
- What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
- Are there any other factors/metrics we should look at to help troubleshoot the penalties?
- After all this time without resolution, should we be moving on to two new domains and forwarding all content as 301s to the new pages? Are there things we need to try first?

Any help is greatly appreciated. SEOmoz rocks. /T
White Hat / Black Hat SEO | tomypro