Disavowing spam links and domains
-
Hi,
I have a big problem. For the past month, my company website has been scraped by hackers.
This is how they do it:
1. Hack unmonitored sites and/or sites still running old versions of WordPress or other out-of-the-box CMSes.
2. Create spam pages with links to my pages, plus plant trojans and scripts that automatically grab resources from my server. Some sites even had pages from my site uploaded to them directly.
3. Give the pages titles, keywords, and descriptions consisting of my company brand name.
4. Use the HTTP referrer to redirect visitors coming from Google search results to competitor sites.
What I have done currently:
1. Blocked the identified sites' IPs in my WAF (sketched below). This prevents those hacked sites from grabbing resources from my site via their scripts.
2. Reached out to webmasters and hosting companies to get the affected sites taken down. So far this hasn't been very effective: many of the sites have no webmaster, only a few hosting companies respond promptly, and some don't reply even after a week.
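Roughly, the blocking step works like the sketch below (simplified from what I actually run; hacked-domains.txt is a placeholder for my real list of hacked hosts). Each identified hostname is resolved to its current IP addresses, which then go into the WAF blocklist. One caveat: shared-hosting IPs can serve innocent sites too, so each address should be checked before blocking a whole range.

```python
#!/usr/bin/env python3
"""Simplified sketch: resolve a list of hacked referring hostnames to
IP addresses so they can be added to a WAF or firewall blocklist.
'hacked-domains.txt' is a placeholder file, one hostname per line."""
import socket


def resolve_ips(hostname):
    """Return the set of IPv4 addresses the hostname currently resolves to."""
    try:
        _, _, addresses = socket.gethostbyname_ex(hostname)
        return set(addresses)
    except socket.gaierror:
        return set()  # host no longer resolves; nothing to block


def main():
    blocklist = set()
    with open("hacked-domains.txt") as fh:
        for line in fh:
            host = line.strip()
            if host:
                blocklist |= resolve_ips(host)

    # One address per line, ready to paste into the WAF's block rules.
    for ip in sorted(blocklist):
        print(ip)


if __name__ == "__main__":
    main()
```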
Problem now is:
By the time I realized what was happening, hundreds if not thousands of sites had already been used by the hacker. Literally tens of thousands of hacked or scripted pages carrying my company brand in their titles, keywords, and descriptions have been crawled and indexed by Google.
Every day I routinely remove and disavow, but there are now simply too many of them in Google's index.
Question:
1. What is the best way forward to resolve this?
2. Disavow links and domains: does disavowing a domain mean that all links from that domain are disavowed?
3. Can anyone recommend an SEO company that has dealt with this kind of issue before and successfully rectified it?
Note: SEAGM is our company's branded keyword.
-
I'm afraid there's no easy answer. The security side is beyond the scope of Q&A (it's just too dependent on your platform/host/etc.), but locking that down is definitely the biggest and first step. Obviously, though, you can't stop third-party sites from getting hacked.
Disavow can be done at the domain level. There are some oddities, like Wordpress.com (where sub-domains act more like stand-alone domains), but for most sites, if most links are malicious, lock down the entire incoming domain.
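For reference, a domain-level entry in the disavow file just uses the domain: prefix, and comment lines start with #. The snippet below is purely illustrative, with placeholder domains, showing single-URL lines next to whole-domain lines:

```
# Illustrative disavow file - placeholder domains only
# Individual URLs (only these specific pages are disavowed)
http://spam-example-one.com/hacked-page.html
http://spam-example-two.net/fake-brand-page/

# Whole domains (every link from these hosts is disavowed)
domain:hacked-blog-example.org
domain:abandoned-cms-example.info
```

Upload the finished file through the disavow tool in Webmaster Tools for the affected property.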
Make sure your core links are clean. If you have a solid base of links and you're not dealing with a lot of quality issues, it's tough for these kinds of hacked links to cause as much harm. Google knows this happens. Unfortunately, if your core link profile is a mess or weak, then it's a lot easier to take damage. So, this is a battle on two fronts - stop the attack and, at the same time, clean up your core link profile and strengthen it as best you can.
There are a lot of link removal tools now, but honestly, they're a starting point. You need to dig in and evaluate what they give you, so that you're not taking out links that are potentially good. Right now, this is a labor-intensive process, I'm afraid.
-
Hi Andy,
I'm currently gathering data from Webmaster Tools.
No, I didn't get any manual action messages from Google.
I do have a list. I'm trying to use Kerboo (LinkRisk) to manage it, but I have little time to spend on this.
-
Hi,
2. Disavow links and domains: does disavowing a domain mean that all links from that domain are disavowed?
Yes, I would be disavowing at a domain level (not even subdomain) with a view to blocking everything you find.
How have you been gathering link data? Webmaster Tools? Ahrefs? Majestic? OSE?
Ideally you need to create one master list of everything you can find and start from there (one way to merge the exports is sketched below). It isn't going to be a quick fix, though, because if you have been caught by Penguin, you won't get out of the penalty until the algorithm is re-run. All you can do is prepare for when that run happens.
If you haven't yet been caught by Penguin, then you would be saving yourself a lot of worry by getting this resolved before the next refresh happens.
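To make the master-list step concrete, here is a rough sketch (file names are assumptions, and it assumes each export has already been reduced to one backlink URL per line). It merges the exports, de-duplicates by linking host, and writes domain: lines ready to review for a disavow file. Treat the output as candidates only - evaluate every domain before you actually disavow it.

```python
#!/usr/bin/env python3
"""Rough sketch: merge backlink exports (assumed file names, one URL per
line) into a de-duplicated list of linking hosts, written as domain:
lines for a Google disavow file. Review everything before uploading."""
from urllib.parse import urlparse

EXPORTS = ["wmt-links.txt", "ahrefs-links.txt", "majestic-links.txt"]  # assumed names
OUTPUT = "disavow-candidates.txt"


def host_of(url):
    """Return the bare hostname of a backlink URL (no port, no leading 'www.')."""
    host = urlparse(url.strip()).netloc.split(":")[0].lower()
    return host[4:] if host.startswith("www.") else host


def main():
    hosts = set()
    for path in EXPORTS:
        try:
            with open(path) as fh:
                for line in fh:
                    if line.strip():
                        hosts.add(host_of(line))
        except FileNotFoundError:
            print(f"Skipping missing export: {path}")

    hosts.discard("")  # lines that weren't full URLs
    with open(OUTPUT, "w") as out:
        out.write("# Candidate domains only - review before disavowing\n")
        for host in sorted(hosts):
            out.write(f"domain:{host}\n")
    print(f"Wrote {len(hosts)} candidate domains to {OUTPUT}")


if __name__ == "__main__":
    main()
```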
-Andy