Negative SEO attack, just keep disavowing?
-
Hello,
Around 2 months ago someone started a negative SEO campaign against us. Each week, around 50-60 domains (all .biz or .eu) appear in Majestic, all linking to our site in hidden code via our exact-match keyword. Luckily, nothing has happened to our rankings so far, as I have been disavowing those links as soon as they appear in Majestic. (Google only shows a few of them; the Google Webmaster forum told me that Google only shows a "sample of links" and that we should disavow as soon as we see them.)
So the only thing for me to do is monitor Majestic each week and keep disavowing. I think I have disavowed almost 250 domains to date. Or should I still only disavow the ones Google shows? (I think not, as those are just "sample links".)
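The weekly routine described above (export the new spam domains, add them to the disavow file, never list the same domain twice) can be sketched as a small script. This is a minimal sketch, not an official Google or Majestic tool; the filenames and the helper function are hypothetical, but the `domain:` entry format matches what Google's disavow tool expects.

```python
# Sketch: merge newly spotted spam domains into a Google disavow file.
# Assumes a weekly export of referring domains as a plain text file,
# one domain per line. Filenames below are hypothetical examples.

def merge_disavow(existing_lines, new_domains):
    """Return a deduplicated, sorted list of disavow entries.

    Keeps comment lines (starting with '#') and merges in new
    'domain:' entries so the same domain is never listed twice.
    """
    comments = [l for l in existing_lines if l.startswith("#")]
    domains = {
        l.split(":", 1)[1].strip().lower()
        for l in existing_lines
        if l.startswith("domain:")
    }
    domains.update(d.strip().lower() for d in new_domains if d.strip())
    return comments + [f"domain:{d}" for d in sorted(domains)]


if __name__ == "__main__":
    existing = ["# disavowed so far", "domain:spam-example.biz"]
    weekly = ["another-spam.eu", "spam-example.biz"]  # note the duplicate
    for line in merge_disavow(existing, weekly):
        print(line)
```

In practice you would read `existing` from your current disavow file and `weekly` from the tool export, then upload the merged file to the disavow tool again (it replaces the previous file rather than appending).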
-
Hi Richard,
Definitely, you shouldn't worry about spammy links. As you keep spotting them, throw that list into the disavow file.
Here are some comments from an authoritative person in the community, Marie Haynes:
Disavowing in 2019 and beyond – the latest info on link auditing
Hope it helps.
Best luck.
Gaston
-
I've got over 5,000 in my disavow file. When it all started I tried to keep up with it, auditing my GWT links and using Ahrefs. I finally stopped doing it. Some of the targeted pages are still doing well in the SERPs, but it's always in the back of my mind that those spammy links are holding the site back somewhat. While for several years there was steady growth in traffic, that growth has slowed significantly. I think it is partly due to more competition in my niche, and partly due to some dirty rotten scoundrel building those spammy links to my site.
-
In my opinion those are two different things: one is diminishing the links' link juice, and the other is showing those backlinks in the website's Search Console profile.
-
Yes, they say don't worry, but in our case many bad domains show up in our Google link profile, so it looks like they get through the algorithms and need to be disavowed manually.
-
We are in the same boat; one of our websites has been hit with negative SEO for at least the last 6 months. Every week we have been getting links from bad domains (we use Ahrefs to identify them), all kinds… some even disturbing. They link to the home page, to specific landing pages, or in some cases to images. We currently have almost 1,000 bad domains in the Google disavow tool.
-
Hi advertisingtech,
Yep, as long as you clearly identify those links as spammy and/or really harmful.
Google has said (through its people) several times that you should not worry THAT much about spammy links; today's algorithms are really good at detecting and not considering them.
The latest resource: Glenn Gabe tweet_1 and Tweet_2. It's also useful to watch that whole Webmasters Office-hours Hangout.
Also, just a reminder: do not rely on only one tool. Try to (if possible) complement it with other tools, such as Ahrefs, SEMrush, Moz Open Site Explorer, or others.
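One simple way to combine several tools is to only treat a domain as confirmed spam when more than one source flags it. A minimal sketch of that cross-check, assuming each tool can export a plain list of referring domains (the variable names are hypothetical, not any tool's real API):

```python
# Sketch: cross-check referring-domain exports from multiple backlink
# tools, keeping only domains that appear in at least `min_sources`
# of the exports. Input lists are assumed to be plain domain strings.
from collections import Counter


def confirmed_spam(*domain_lists, min_sources=2):
    """Return domains that appear in at least `min_sources` exports."""
    counts = Counter()
    for lst in domain_lists:
        # Deduplicate within each export so one tool counts only once.
        counts.update({d.strip().lower() for d in lst if d.strip()})
    return sorted(d for d, n in counts.items() if n >= min_sources)


if __name__ == "__main__":
    ahrefs_export = ["spam-a.biz", "spam-b.eu", "legit-site.com"]
    majestic_export = ["spam-a.biz", "spam-b.eu", "other.org"]
    print(confirmed_spam(ahrefs_export, majestic_export))
```

Domains that only a single tool reports can then be reviewed manually instead of being disavowed automatically.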
Hope it helps.
Best luck.
GR.