Negative SEO attack, just keep disavowing?
-
Hello,
Around 2 months ago someone started a negative SEO campaign against us. Each week, around 50-60 domains (all .biz or .eu) appear in Majestic, all linking to our site in hidden code via the exact-match keyword. Luckily nothing has happened to our rankings so far, as I have been disavowing all those links as soon as they appear in Majestic. (Google only shows a few of them, and the Google Webmaster forum told me that Google only shows a "sample of links" and that we should disavow as soon as we see them.)
So the only thing for me to do is monitor Majestic each week and keep disavowing. I think there are almost 250 domains I have disavowed to date. Or should I still only disavow those that Google shows? (I think not, as those are just "sample links".)
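A weekly routine like this is easy to script. The sketch below is a hypothetical example (file names and the export format are assumptions, not Majestic's actual output): it merges a fresh list of spammy domains into an existing disavow file, deduplicating so the file stays clean across weekly runs. It relies only on the documented disavow format: one directive per line, where `domain:example.biz` blocks a whole domain and lines starting with `#` are comments.

```python
def merge_disavow(existing_lines, new_domains):
    """Return disavow-file lines with each new domain added at most once.

    Google's disavow format: one directive per line.
    "domain:example.biz" blocks a whole domain; "#" starts a comment.
    """
    seen = set()
    out = []
    for line in existing_lines:
        stripped = line.strip()
        # Track existing directives so we never emit duplicates.
        if stripped and not stripped.startswith("#"):
            seen.add(stripped.lower())
        out.append(line.rstrip("\n"))
    for dom in new_domains:
        directive = f"domain:{dom.strip().lower()}"
        if directive not in seen:
            seen.add(directive)
            out.append(directive)
    return out

# Example run with a two-line existing file and a weekly export
# that contains one already-disavowed domain and one new one.
existing = ["# disavow file", "domain:spammy1.biz"]
weekly = ["spammy1.biz", "spammy2.eu"]
print(merge_disavow(existing, weekly))
# → ['# disavow file', 'domain:spammy1.biz', 'domain:spammy2.eu']
```

The merged output would then be re-uploaded through Google's disavow tool, which replaces the previous file wholesale, so keeping one canonical, deduplicated file is the safest workflow.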
-
Hi Richard,
Definitely, you shouldn't worry about spammy links. As you keep spotting them, throw that list into the disavow file.
There are some comments on this from an authoritative person in the community, Marie Haynes:
Disavowing in 2019 and beyond – the latest info on link auditing
Hope it helps.
Best luck.
Gaston -
I've got over 5,000 in my disavow file. When it all started I tried to keep up with it, auditing my GWT links and using Ahrefs. I finally stopped doing it. Some of the targeted pages are still doing well in the SERPs, but it's always in the back of my mind that those spammy links are holding the site back some. While for several years there was steady growth in traffic, that growth has slowed significantly. I think it is partly due to more competition in my niche, and partly due to some dirty rotten scoundrel building those spammy links to my site.
-
In my opinion those are two different things: one is diminishing their link juice, and another is showing those backlinks in the website's Search Console profile.
-
Yes, they say don't worry, but in our case many bad domains show up in our Google link profile, so it looks like they get through the algorithms and need to be disavowed manually.
-
We are in the same boat; one of our websites has been hit with negative SEO for at least the last 6 months. Every week we have been getting links from bad domains (we use Ahrefs to identify them), all kinds… some even disturbing. They link to the home page, a specific landing page, or in some cases to images. We currently have almost 1,000 bad domains in Google's disavow tool.
-
Hi advertisingtech,
Yep, as long as you clearly identify those links as spammy and/or really harmful.
Google has said (through its people) several times that you should not worry THAT much about spammy links; today's algorithms are really good at detecting and not counting them.
The latest resources: Glenn Gabe tweet_1 and tweet_2. It's also useful to watch that whole Webmasters Office-hours Hangout.
Also, just a reminder: do not rely on only one tool. Try to (if possible) complement it with other tools, such as Ahrefs, SEMrush, or Moz Open Site Explorer.
Hope it helps.
Best luck.
GR.