Massive spam attack against my domain - automate disavow of TLD?
-
We've been getting hundreds of new links from unique domains every day - all the domains follow a pattern like this:
www.someword-1f4163e1.space/wiki/Someterm
Hundreds... every day. What techniques exist to deal with a prolonged negative SEO attack of this type?
By the time we can detect and disavow, the damage is done.
-
Annoyingly, the disavow tool does not support complex matching: there is no wildcard or regex support for disavow uploads. It's a shame, because coordinated network link-bombardment really has no simple one-click solution for webmasters right now (which is pretty poor!).
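For illustration, here is what that missing wildcard/regex matching would look like if you scripted it yourself. A minimal sketch; the exact pattern (a lowercase word, a hyphen, an 8-character hex suffix, and the .space TLD) is an assumption based on the single example in the question:

```python
import re

# Assumed attack pattern, e.g. www.someword-1f4163e1.space
SPAM_DOMAIN = re.compile(r"^(www\.)?[a-z]+-[0-9a-f]{8}\.space$")

def is_spam_domain(domain: str) -> bool:
    """Return True if the domain matches the observed attack pattern."""
    return bool(SPAM_DOMAIN.match(domain.strip().lower()))

print(is_spam_domain("www.someword-1f4163e1.space"))  # True
print(is_spam_domain("example.com"))                  # False
```

Adjust the hex-suffix length and TLD to whatever the actual links show before trusting it with a real disavow file.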
You'd have to build something more complex that connects to the API of whichever tool detects all of these links. It would need its own database and be programmatically capable of updating that database. You'd need it to filter out all of the domains that don't match your pattern (and come up with regex / SQL queries that match that exact pattern in a robust, reliable manner). It would have to de-dupe existing and new entries and then generate a text file for you. It would also have to be capable of comparing the file it generates against your existing file, so it doesn't lose your manual disavows.
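The filter / de-dupe / merge step can be sketched fairly compactly. A rough illustration in Python; the regex and the `domain:` disavow syntax follow the example above, and the input lists stand in for whatever your backlink tool's API actually returns:

```python
import re

# Assumed attack pattern from the links observed: word-hexsuffix.space
SPAM_DOMAIN = re.compile(r"^(www\.)?[a-z]+-[0-9a-f]{8}\.space$")

def build_disavow(new_linking_domains, existing_lines):
    """Return updated disavow-file lines: existing entries (including manual
    disavows and comments) are preserved, matching spam domains are appended
    as 'domain:' rules, and duplicates are skipped."""
    out = list(existing_lines)                 # never lose manual entries
    seen = {line.strip().lower() for line in existing_lines}
    for raw in new_linking_domains:
        d = raw.strip().lower()
        if not SPAM_DOMAIN.match(d):
            continue                           # only touch the attack pattern
        if d.startswith("www."):
            d = d[4:]                          # disavow at the bare-domain level
        rule = f"domain:{d}"
        if rule not in seen:
            seen.add(rule)
            out.append(rule)
    return out

fresh = ["www.someword-1f4163e1.space", "partner-site.com",
         "otherword-ab12cd34.space"]
manual = ["# manual disavows", "domain:badlinks.example"]
# Keeps the two manual lines and appends the two matching .space domains.
print("\n".join(build_disavow(fresh, manual)))
```

The real work is the plumbing around this function: pulling fresh links from the API on a schedule and re-uploading the generated file, since the disavow tool itself only accepts static uploads.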
To me, it sounds like a lot of trouble to go to. I'd make a post about it on Google's webmaster forum here: https://productforums.google.com/forum/#!forum/webmasters - try to attract the attention of someone from Google and let them know that these kinds of attacks do happen, and that you want the Disavow Tool (as a Google product) to properly allow people to defend themselves.
Related Questions
-
To buy or not to buy? Domains with history...
I am involved in setting up a new business which has yet to decide on a brand name. As the availability of a domain name in the local TLD (and ideally .com) is of such importance, the brand naming process is inextricably linked to it. Therefore, upon finding that a suggested name was available to register (without a premium), there was a degree of satisfaction. However, on looking at archive.org it was discovered that the .co.uk had been a sex/marital-aid store back in 2004 and, more recently in 2014, a travel blog / affiliate site. The question is: is the past history of a site with possible black-hat links a reason to avoid registration? Or does time cure all? And is there a way in which domain health can be reliably confirmed? Thanks in advance for your input.
White Hat / Black Hat SEO | seanmccauley
Google Penguin penalty: is it automated or manual?
Hi, I have seen that some of our competitors are missing from the top of the SERPs and seem to be penalised, according to this penalty checker: http://pixelgroove.com/serp/sandbox_checker/. Is this the right tool to check for a penalty, or are there other good tools available? Are these penalties because of the recent Penguin update? If so, is this an automated or a manual penalty from Google? I don't think all of them used black-hat techniques and got penalised. The new Penguin update might have been triggered by their backlinks, causing this penalty. Even we have dropped for the last 2 weeks. What's the solution for this? How effective is a link audit? Thanks, Satish
White Hat / Black Hat SEO | vtmoz
Will implementing 301s on an existing domain impact massively on rankings?
Hi guys, I have a new SEO client who only has the non-www domain set up in GWT, and I am wondering if implementing a 301 to www will have a massive negative impact on rankings. I know a percentage of link juice and PageRank will be affected. So my question is: if I implement the 301, should I brace myself for a fall in rankings? Should I use a 301 to maintain link juice and PageRank? Is it good practice to forward to www? Or could I leave the non-www in place and have the www redirect to it to maintain the data? Dave
White Hat / Black Hat SEO | icanseeu
Changing domains from .net to .com after 7 months of traffic loss.
We have been in business since 2005 and have always used the .net version, as it was the only one available when we started. Around 2007 we bought the .com version from the person who owned it, but we kept using the .net as customers were already used to it. In January we started to see a loss of search-engine traffic, not to mention being outranked by several sites (95% of them spammers). We have no manual penalty, but it could be algorithmic; we are not sure whether we even have some sort of penalty or whether our niche is just too spammed. We are now considering moving the site to the .com version, as all our attempts at recovering and regaining our rankings were useless (backlink cleanup, disavow tool usage, excellent link building, excellent content creation and social interaction). Our DA and PA are both higher than any of the other pages ranking on top. We have about 3k pages indexed. What do you guys think? Should we move the site to the .com? (Note that the change is ranking-wise, not in terms of branding.) And if we do, should we 301 all pages, or use rel=canonical to avoid a possible "penalty flow" to the other domain? Note: for years, the .com version was/is 301'd to the .net one. Thank you all!
White Hat / Black Hat SEO | FedeEinhorn
Can links from automated translations damage the source?
I have a website, dataprix.net, composed of automated translations into different languages of original content from another website, dataprix.com. Is it good for dataprix.com to be linked by the content on dataprix.net as the source of the translated content, or could Google consider these a lot of low-quality links, resulting in penalties for dataprix.com?
White Hat / Black Hat SEO | xiruca
Old spam tactic still works and gets top 3 in the SERPs?
Hi Mozers, below you can see some examples of spam (hidden text and sneaky redirects) which have appeared in the SERPs for our branded keywords over the last 3 months. Some of them occupy very high positions (top 3 / top 5). https://www.google.com/search?num=100&newwindow=1&safe=off&biw=1883&bih=1028&q=%22your+mac+-%22%2B%22cleanmymac%22 I have sent spam reports and I'm going to continue doing so (~500 spam reports from my personal and work Google accounts). I have also contacted some of the hacked sites' webmasters directly and tried to help them fix the issue, but it takes a lot of my time. But 3 months!? Can you give me any advice on what to do next? Thank you!
White Hat / Black Hat SEO | MacPaw
Web virus attack every second
Hello, my WordPress site has been under constant attack every day: files were uploaded and redirections were made to other websites. I installed the Sucuri plugin, paying the annual fee, with no result; they kept accessing the site. I also restored from a security backup. Now I have installed the OSE WP Firewall and it seems they are having more difficulty accessing and uploading files, but I'm still seeing around 40 attacks every day. Is there any way to stop this? Here is some information on the blocked attacks:

LOGTIME: 2013-02-22 10:58:01
FROM IP: http://whois.domaintools.com/27.153.210.183
REFERRER: http://www.propdental.com/index.php?option=com_registration&task=register

LOGTIME: 2013-02-22 10:52:09
FROM IP: http://whois.domaintools.com/2a00:1d70:c01c::69:61
URI: http://www.propdental.com/video//wp-admin.php

FROM IP (40 attacks from this IP, one every two seconds): http://whois.domaintools.com/2a00:1d70:c01c::69:61
URI: http://www.propdental.com/video//wp-admin.php
ACTION: Blocked

LOGTIME: 2013-02-22 10:49:10
FROM IP: http://whois.domaintools.com/103.31.186.82
URI: http://www.propdental.com/
METHOD: GET

LOGTIME: 2013-02-22 10:37:10
FROM IP: http://whois.domaintools.com/120.43.11.251
URI: http://www.propdental.com/blog/tag/carillas-de-porcelana-cerinate
METHOD: GET
USERAGENT: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.95 Safari/537.11
REFERRER: http://www.propdental.com/blog/tag/carillas-de-porcelana-cerinate
ACTION: Blocked

LOGTIME: 2013-02-22 10:28:52
FROM IP: http://whois.domaintools.com/36.251.43.51
URI: http://www.propdental.com/
METHOD: GET
USERAGENT: Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.94 Safari/537.4
REFERRER: http://www.buyclassybags.com/
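For what it's worth, log output in that format is regular enough to tally the attacking IPs before blocking them at the server firewall. A hedged Python sketch, assuming "FROM IP:" lines wrapped in whois URLs like those shown:

```python
import re
from collections import Counter

# Extract the attacker IP from lines like:
#   FROM IP: http://whois.domaintools.com/27.153.210.183
IP_RE = re.compile(r"FROM IP:.*?http://whois\.domaintools\.com/(\S+)")

def top_attackers(log_text, n=5):
    """Count how often each source IP appears in the firewall log."""
    return Counter(IP_RE.findall(log_text)).most_common(n)

log = """
FROM IP: http://whois.domaintools.com/27.153.210.183
FROM IP: http://whois.domaintools.com/2a00:1d70:c01c::69:61
FROM IP: http://whois.domaintools.com/2a00:1d70:c01c::69:61
"""
print(top_attackers(log))
```

The repeat offenders can then be denied at the host level (e.g. via iptables or a .htaccess deny rule), which is cheaper than letting the WordPress plugin handle every request.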
White Hat / Black Hat SEO | maestrosonrisas
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site -- Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of: a single home page, ~50 "category" pages, and ~425 "book list" pages. (Both of those numbers started out much smaller and grew over time, but they have been around those levels for the last year or so as I've focused my time elsewhere.) On Friday, June 15 we made a pretty big change to the site -- we added a page for every author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ...but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116 Obviously we did this as an SEO play -- we figured that since our content was getting ~1,000 search entries a day for such a wide variety of queries, we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (OK, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off over the last week, dropping lower and lower every day, as if Google realized it was repurposed content from elsewhere on our site...) Here's the problem: for the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends.
This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday, and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem... So: 1. Do you think the drop is related to my upping my pagecount 8-fold overnight? 2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that only list one author (which would be the vast majority)? 3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books", it's better they land on a page of ours that pulls together the 4 books we have than on a page that happens to have just one of her books among 5 or 6 others by other authors. What else? Thanks so much, help is very appreciated. Peter
White Hat / Black Hat SEO | petestein1
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉