Negative SEO attack working amazingly on Google.ca
-
We have a client, www.atvandtrailersales.com, who recently (in March) fell out of the rankings. We checked their backlink profile and found over 100 spam links pointing at their website with anchor text like "uggboots", "headwear", and so on.
I submitted a disavow link file, as this was obviously an attack on the website.
Since the recent Panda update, the client has dropped out of the rankings again for the majority of their keyword phrases. The disavow file submitted back in March already lists 90% of the links that are still spamming the website now.
I've sent a spam report to Google and nothing has happened. I could submit a new disavow link file, but I'm not sure if this is worth the time.
Thanks!
-
Thank you for that, I will have a look now. When we have looked at our links we have found a lot of people linking to our site, but deciding whether these are good links or bad links has been hard for us.
We did find a lot of directory sites and also game sites that were linking to us and we could not understand why they were doing this.
-
Zack,
They're actually linking to a dynamic inventory page which is updated daily with new products. Sometimes they link to a specific piece of inventory, which ultimately 404s when the product is sold and the page no longer exists.
Wish it was that easy, but the links keep coming
-
Hi Tim,
If you log into Webmaster Tools and look at Traffic > Links to your site, you can see who's linking. Make sure you click on "More >>". Alternatively, you can use Moz's Open Site Explorer (OSE) and run a backlink report on your site.
It was immediately clear that this was happening again as we had 26 links from videogamersoasis.com for a power sports website...
To submit a file:
Webmaster Tools > Traffic > Links to your site, click on "Download latest links" and then filter out any links you actually put there. Go to https://www.google.com/webmasters/tools/disavow-links-main?pli=1 and submit the file.
Only do this if you KNOW you've been attacked and you can't contact the website owners to take the links down.
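For anyone who wants to script that filtering step, here is a rough sketch of how you could turn the links export into a disavow file. To be clear, the file names, the one-URL-per-line export format, and the spam-domain list are all assumptions made up for the example, so adjust them to whatever your actual export looks like.

// build-disavow.ts - a rough sketch, not an official Google or Moz tool.
// Assumes links.txt holds one linking URL per line (edit to match the real
// "Download latest links" export) and spam-domains.txt holds one known-spam
// domain per line. Both file names are made up for this example.
import { readFileSync, writeFileSync } from "fs";

const spamDomains = readFileSync("spam-domains.txt", "utf8")
  .split(/\r?\n/)
  .map((line) => line.trim().toLowerCase())
  .filter((line) => line.length > 0);

const flagged = new Set<string>();

for (const line of readFileSync("links.txt", "utf8").split(/\r?\n/)) {
  const url = line.trim();
  if (!url) continue;
  let host: string;
  try {
    host = new URL(url).hostname.toLowerCase();
  } catch {
    continue; // skip anything that isn't a parseable URL
  }
  // Flag the link if its host is a known spam domain or a subdomain of one.
  if (spamDomains.some((d) => host === d || host.endsWith("." + d))) {
    flagged.add(host);
  }
}

const output = [
  "# Spam links pointed at the site during the negative SEO attack",
  ...[...flagged].sort().map((host) => "domain:" + host),
];
writeFileSync("disavow.txt", output.join("\n") + "\n");

The "domain:" prefix and the "#" comment line are the format the disavow tool accepts; you can also list individual URLs, one per line, if you only want to disavow specific links rather than whole domains.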
-
I think the same is happening to our site. Can you let me know how you found the spammy links and how to submit the file? Also, have you tried filling in the contact form for Google? I know they are not great at communicating, but it might be worth a shot.
-
That sucks. If I were you, I probably would not submit a new disavow link file, because the first one is probably still in their queue.
How many different URLs were the spammy links pointing to? If it's just a few, could you just noindex the tainted pages and create new URLs?
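If you do go the noindex route, the standard mechanics (generic HTML/HTTP, nothing specific to their platform) are a <meta name="robots" content="noindex"> tag in the head of each tainted page, or an X-Robots-Tag: noindex HTTP header on those URLs, so Google drops them from the index while the clean replacement URLs take over.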
Related Questions
-
Why has Google not disavowed some bad links?
I have submitted bad links that I want to disavow on Google, identified with a high spam score in Moz Pro. It's been almost 4 months, yet a bad link with a high spam score still exists. Any solution? https://fortniteskinsgenerator.net/
White Hat / Black Hat SEO | marktravis0
-
How to deal with spam-heavy industries that haven't gotten the hammer from Google?
One of our clients works in the video game category - specifically, helping people rank higher in games like League of Legends. In spite of our trying to do things the right way with white hat link building, we've suffered when trying to compete with others who are using comment and forum spam, private blog networks, and other black hat tactics. Our question is - what is the right approach here from a link building perspective? Is it an "if you can't beat them, join them" or do we wait it out and hope Google notices and punishes those who don't play nice? Some test terms to see what we're up against: "elo boost" and "lol coach." Would love to hear thoughts from anyone who's dealt with a similar situation.
White Hat / Black Hat SEO | kpaulin0
-
Ranking without SEO?
We have a client that we've been doing white-hat SEO for, for over 3 years, and they've always been number 1 in Google for all their targeted keywords. This year, their competition has been ranking above them and our client has been pushed towards the bottom of the first page. After thorough research, we discovered that this competitor is doing no SEO at all, just regular PR, which our client is also doing. Our client is even spending money on AdWords and their competition isn't. Can anyone explain how a website that does zero SEO can magically be ranked at the top now, above our client, who we're doing everything possible for?
White Hat / Black Hat SEO | SEOhughesm0
-
Black hat: raising CTR to rank better in Google
We all know that Google uses click-through rate (CTR) as one of its ranking factors. I came up with an idea, and I would like to see if someone has seen this idea before or tried it. If you search Google for the term "SEO", for example, you will see the moz.com website at rank 3. And if you check the source code, you will see that result 3 is linking to this URL: https://www.google.com.sa/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&cad=rja&uact=8&ved=0CDMQFjAC&url=https%3A%2F%2Fmoz.com%2Fbeginners-guide-to-seo&ei=F-pPVaDZBoSp7Abo_IDYAg&usg=AFQjCNEwiTCgNNNWInUJNibqiJCnlqcYtw That URL will redirect you to moz.com. OK, what if we used linkbucks.com or any other cheap targeted-traffic network and ran a campaign that sends traffic to that URL? Will that count as traffic from Google, so it will increase the CTR from Google?
White Hat / Black Hat SEO | Mohtaref11
-
Negative SEO campaign just started against my site. What do I do?
As the question says, I have just got alerts of new links, and it is clearly a negative SEO campaign against my site. We are talking lots of spammy, rude anchor-text keywords being used. Whilst I only have alerts for a small number (around 30), it has just happened, and I know from the type of spammy links they are that more will be coming. So, the question is, should I disavow? Do I keep submitting new disavows every few days as more are discovered? Any advice will be greatly appreciated.
White Hat / Black Hat SEO | jonathan790
-
Why do expired domains still work for SEO?
Hi everyone. I've been running an experiment for more than a year to see whether it's possible to buy expired domains. I know it's considered black hat, but like I said, I wanted to experiment; that is what SEO is about. What I did was buy domains that had just expired, immediately add content on a WordPress setup, fill it with content relevant to the expired domain, and then start building links from these domains to other relevant sites. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains ) This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any signs that the expired domains, or the sites I link to, have been punished by Google. The sites I'm linking to rank great ONLY with those links 🙂 So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | Sir0
-
Google profile
I have a Google profile, https://plus.google.com/u/0/106631271958142100588/, which is assigned to the URL www.propdental.es, but I also write a lot of content for other URLs. My question is whether I should create another profile for the other URLs, which are also mine but not associated with each other. Or can I use the same profile without the risk of losing ranking on the weakest URL, as they all compete for similar keywords? Thanks
White Hat / Black Hat SEO | maestrosonrisas0
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a JavaScript client-side templating solution that takes the presentation layer away from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is being served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe". Read about LinkedIn switching over to Dust.js: http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino." Basically, what would be happening on the backend of our site is that we would be detecting the user-agent of all traffic, and once we found a search bot, we would serve up our web pages server-side instead of client-side to the bots so they can index our site. Server-side and client-side will be identical content and there will be NO black hat cloaking going on. The content will be identical. But this technique is cloaking, right? From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable." Matt Cutts on Cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355 Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | Bodybuilding.com0
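For what it's worth, the server-side fallback described in that last question usually boils down to a user-agent check in front of the renderer. Below is a minimal sketch of the idea, assuming a plain Node HTTP server; renderPageServerSide and clientShellHtml are hypothetical helpers standing in for the real Dust.js server-side render and the client-side shell, and the bot pattern is only illustrative.

// bot-aware-rendering.ts - a minimal sketch of user-agent based rendering, not LinkedIn's actual setup.
import * as http from "http";

// Crude crawler check; production lists of bot user-agents are much longer.
const BOT_PATTERN = /googlebot|bingbot|baiduspider|yandex/i;

// Hypothetical helpers: in a real Dust.js setup these would render the same
// template server-side (via node.js) or return the JS shell that renders it client-side.
function renderPageServerSide(path: string): string {
  return `<html><body><h1>Server-rendered content for ${path}</h1></body></html>`;
}
function clientShellHtml(): string {
  return `<html><body><div id="app"></div><script src="/app.js"></script></body></html>`;
}

http.createServer((req, res) => {
  const userAgent = req.headers["user-agent"] ?? "";
  res.setHeader("Content-Type", "text/html");
  if (BOT_PATTERN.test(userAgent)) {
    // Crawler: send fully rendered HTML so the content is indexable.
    res.end(renderPageServerSide(req.url ?? "/"));
  } else {
    // Browser: send the shell and let the client-side templates take over.
    res.end(clientShellHtml());
  }
}).listen(3000);

The key point, and what the question leans on to argue this isn't cloaking, is that both branches are meant to come from the same templates and the same data, so the bot and the browser end up seeing identical content.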