Massive spam attack against my domain - automate disavow of TLD?
-
We've been getting hundreds of new links from unique domains every day - all the domains follow a pattern like this:
www.someword-1f4163e1.space/wiki/Someterm
Hundreds... every day. What techniques exist to deal with a prolonged negative SEO attack of this type?
By the time we can detect and disavow, the damage is done.
-
Annoyingly, the disavow tool does not support complex matching. If you're after wildcard or regex matching for your disavow uploads, you're out of luck - it simply isn't available. That's a shame, because coordinated network link-bombardment really has no simple one-click solution for webmasters right now (which is pretty poor!)
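For reference, the file the tool accepts is plain text with one entry per line: a `domain:` prefix disavows a whole domain, a bare URL disavows a single page, and lines starting with `#` are comments. The entries below are illustrative, modelled on the pattern in the question:

```text
# spam network - generated automatically
domain:someword-1f4163e1.space
domain:otherword-2b5c7d9e.space
# a one-off manual entry
http://example.net/single-bad-page.html
```

Because each domain has to be listed explicitly, an attack that mints hundreds of fresh domains a day forces you to regenerate and re-upload this file constantly.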
You'd have to build something more complex that connects to the API of whichever tool detects all of these links. It would need its own database and be programmatically capable of updating it. You'd need it to filter out all of the domains which don't match your pattern (and come up with regex / SQL queries that match that exact pattern in a robust, reliable manner). It would have to de-dupe existing and new entries and then generate a text file for you. It would also have to compare the file it generates against your existing file, so it doesn't lose your manual disavows.
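As a rough sketch of the filtering and merging steps, here's a minimal Python version. The regex is an assumption based on the example in the question (a word, a hyphen, eight hex characters, on .space), and the function name is made up - a real version would feed in domains pulled from your link-detection tool's API:

```python
import re

# Assumed spam pattern from the question: www.someword-1f4163e1.space
# (a word, a hyphen, 8 hex characters, on the .space TLD).
SPAM_PATTERN = re.compile(r"^(www\.)?[a-z]+-[0-9a-f]{8}\.space$")

def build_disavow(new_domains, existing_lines):
    """Merge freshly detected spam domains into an existing disavow file.

    Keeps every existing entry as-is, adds only domains matching the spam
    pattern, and de-dupes against domains already disavowed.
    """
    manual = [line for line in existing_lines if line.strip()]
    already = {line.split(":", 1)[1] for line in manual if line.startswith("domain:")}
    spam = {d.removeprefix("www.") for d in new_domains if SPAM_PATTERN.match(d)}
    return manual + [f"domain:{d}" for d in sorted(spam - already)]
```

You'd schedule something like this daily against the latest link export and upload the result; note `str.removeprefix` needs Python 3.9+.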
To me, it sounds like a lot of trouble to go to. I'd make a post about it on Google's forum here: https://productforums.google.com/forum/#!forum/webmasters - try to attract the attention of someone from Google and let them know that these kinds of attacks do happen, and that you want the Disavow Tool (as a Google product) to properly allow people to defend themselves.
Related Questions
-
Dealing with links to your domain that the previous owner set up
Hey everyone, I rebranded my company at the end of last year from a name that was fairly unique but sounded like I cleaned headstones instead of building websites. I opted for a name that I liked and that reflected my heritage - however, it also seems to be quite common. Anyway, I registered the domain name as it was available, since the previous owner's company had been wound up. It's only been in the last week or two that I've managed to get a website on that domain, and I've been tracking its progress through Moz, Google & Bing Webmaster Tools. Both webmaster tools are reporting that my site triggers 404 errors for some specific links. However, I have never created or used those links. I think the previous owner might have created them before he went bust. My question is in two parts. First, how do I find out what websites are linking to me with these broken URLs? Second, will these 404'ing links affect my SEO? Thanks!
White Hat / Black Hat SEO | mickburkesnr0
-
Secondary Domain Outranking Master Website
IEEE is a large professional association dedicated to serving engineers. The IEEE Web Presence is made up of flagship sites like IEEE.org, IEEEXplore, and IEEE Spectrum, mid-tier sites like Computer.org, and smaller sites like those dedicated to specific conferences. It is unclear exactly when this started - but searches in Google for [ieee] currently return ieeeusa.org before ieee.org. This is troublesome, as users are typically looking for IEEE.org with such a general query. ieeeusa.org is a site with a much narrower focus - it is dedicated to public policy. IEEE.org is one of the strongest domains, so I am thinking this is a glitch of some sort. I am removing a stale sitemap that is referenced in robots.txt (though again, I'm not seeing any issues with other pages - it's just two queries that are trouble: [ieee] and [about ieee]). And it's noticeable in analytics: http://ieee.d.pr/hMg0/YhklCw7Z What do you think? 🙂
White Hat / Black Hat SEO | thegrif3290
-
Dump Penguin Hit Domain
Just wanting to get some feedback from others dealing with Penguin hits on clients' websites. We've got one particular client that has been hit badly because of a high proportion of link toxicity. After running the Cemper Detox Tool we found that only about 25 links are healthy. We're actually thinking of dumping the domain, moving the website to a new domain, and starting again with link building (manually reclaiming as many of the existing healthy links as possible along the way). Has anyone out there used this strategy? What do you think of the potential Sandbox effect on the new site vs. the Penguin hit on the old site? Do you think the 'drag' of Penguin on rankings is greater than the 'drag' of the Sandbox? Thanks guys, look forward to your insight!
White Hat / Black Hat SEO | mavster0
-
Massive site-wide internal footer links to doorway pages: how bad is this?
My company has stuffed several hundred links into the footer of every page. Well, technically not the footer, as they're right at the end of the body tag, but basically the same thing. They are formatted as follows:

<a href="http://example.com/springfield_pa_real_estate.htm" target="_blank">springfield, pa real estate</a>

These direct to individual pages that contain the same few images and variations of the following text that just replace the town and state:

_Springfield, PA Real Estate - Springfield County [images] This page features links to help you Find Listings and Homes for sale in the Springfield area MLS, Springfield Real Estate Agents, and Springfield home values. Our free real estate services feature all Springfield and Springfield suburban areas. We also have information on Springfield home selling, Springfield home buying, financing and mortgages, insurance and other realty services for anyone looking to sell a home or buy a home in Springfield. And if you are relocating to Springfield or want Springfield relocation information we can help with our Relocation Network._

The bolded text links to our internal site pages for buying, selling, relocation, etc. Like I said, this is repeated several hundred times, on every single page on our site. In our XML sitemap file, there are links to:
http://www.example.com/Real_Estate/City/Springfield/
http://www.example.com/Real_Estate/City/Springfield/Homes/
http://www.example.com/Real_Estate/City/Springfield/Townhomes/

These direct to separate pages with a Google map result for properties for sale in Springfield, accompanied by a boilerplate version of this:

_Find Springfield Pennsylvania Real Estate for sale on www.example.com - your complete source for all Springfield Pennsylvania real estate. Using www.example.com, you can search the entire local Multiple Listing Service (MLS) for up to date Springfield Pennsylvania real estate for sale that may not be available elsewhere. This includes every Springfield Pennsylvania property that's currently for sale and listed on our local MLS. Example Company is a fully licensed Springfield Pennsylvania real estate provider._

Google Webmaster Tools is reporting that some of these pages have over 30,000 internal links on our site. However, GWT isn't reporting any manual actions that need to be addressed. How blatantly abusive and spammy is this? At best, Google doesn't care a spit about it, but the worst case is that it is actively harming our SERP rankings. What's the best way to go about dealing with this? The site did have Analytics running, but the company lost the account information years ago; otherwise I'd check the numbers to see if we were ever hit by Panda/Penguin. I just got a new Analytics account implemented 2 weeks ago. Of course it's still using deprecated object values, so I don't even know how accurate it is. Thanks everyone!
White Hat / Black Hat SEO | BD69
-
Penalty for all new sites on a domain?
Hi @all, a friend has an interesting problem. He got a manual link penalty at the end of 2011. It is an old domain with domain pop >5000, but with a lot of bad links (widget and banner links and other SEO domains, though nothing like Scrapebox etc.). He lost most of his traffic a few days after the notification in WMT (unnatural links), and again after the first Penguin update in April '12. At the end of 2012, after deleting (or nofollowing) and disavowing a lot of links, Google lifted the manual penalty (WMT notification). But nothing happened after the lifting; the rankings didn't improve (after 4 months already!). Almost all money keywords aren't in the top 100, there are no traffic increases, and he has good content on this domain. We built a handful of new trusted links to test some pages, but nothing improved. In February we did a test and built a completely new page on this domain; it's in the menu and got some internal links from content. We did this because some pages which weren't optimized before the penalty (no external backlinks) are still ranking on the first Google page for small keywords. After a few days the new page started to rank for our keyword between positions 40-45. That was OK and as we expected. The page ranked there constantly for almost 6 weeks, and now it's been gone for ten days. We didn't change anything. It's the same phenomenon as with the old pages on this domain - the page doesn't even rank for its title! Could it still be a manual penalty on the whole domain, or what other explanations are possible? Looking forward to your ideas and hope you understand the problem! 😉 Thanks!!!
White Hat / Black Hat SEO | TheLastSeo0
-
competitor sites link to a considerable amount of irrelevant sites/nonsense sites that seem to score high with regard to domain authority
According to my recent SEOmoz link analysis, my competitor sites link to a considerable number of irrelevant/nonsense sites that seem to score high with regard to domain authority - e.g. a wedding site linking to a transportation attorney's website. Another competitor site has a total of 2 million links, most of which are seemingly questionable index sites or forums to which registration is unattainable. I recently created a 301 redirect, and my external links have yet to be updated to my new domain name in SEOmoz. Yet, comparing my previous domain authority rank with those of the said competitor sites, the "delta" is relatively marginal: my SEOmoz rank is 21, whereas the SEOmoz ranks of two competitor sites are 30 and 33 respectively. The problem, however, is securing a good SERP for the most relevant terms with Google. My Google PageRank was "3" prior to the 301 redirect. I worked quite intensively to earn that PageRank, only to discover that it had no effect at all on the SERP. Therefore, I took a calculated risk in changing to a domain name that transliterates from non-Latin characters; as the site's age is marginal, my educated guess is that the PR should rebound within 4 weeks. However, I would like to know whether there is a way to transfer the PageRank to the new domain. Does anyone have any insight as to how to handle this issue?
White Hat / Black Hat SEO | eranariel0
-
Is our sub-domain messing up our seo for our root?
We have a website (mysite.com) that we control and a subdomain (affiliate.mysite.com) that is 3rd-party content completely out of our control. I've found that nearly all of our crawl errors are coming from this subdomain. Same deal with 95% of our warnings: they all come from the subdomain. The two websites are very much interlinked, as the subdomain serves up the header and footer of the root domain through iFrames, with the 3rd-party content in the middle section. On the root domain there are countless links pointing at this 3rd-party subdomain. How do these errors affect the root domain, and how do you propose we address the issue?
White Hat / Black Hat SEO | opusbyseo0
-
Is pulling automated news feeds on my home page a bad thing?
I am in charge of a portal that relies on third-party content for its news feeds. The third party in this case is a renowned news agency in the United Kingdom. After the Panda and Penguin updates, will these feeds end up hurting my search engine rankings? FYI: these feeds occupy only 20 percent of the content on my domain. The rest of the content is original.
White Hat / Black Hat SEO | amit20760