Spammers created bad links to old hacked domain, now redirected to our new domain. Advice?
-
My client had an old site hacked (let's call it "myolddomain.com"), and the hackers created many links on other hacked sites, such as http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html
The old myolddomain.com site has since been redirected to a different, new site, but we still see over a thousand spam links showing up in the new site's Search Console 404 crawl errors report. Also, using the links: operator in Google search, we see many spam links in the results.
Should we be worried about these bad links pointing to our old site and redirecting to 404s on the new site? What is the best recommendation to clean them up? Ignore? 410s? Other? I'm seeing conflicting advice out there.
The old site is hosted by the client's previous web developer, who doesn't want to clean anything up on their end without an ongoing hosting contract, and the client doesn't want to pay for any additional hosting. So beyond turning redirects on or off, we don't have much control over anything related to "myolddomain.com".
Thanks in advance for any assistance!
-
Hey, this is Russ here at Moz.
Do the redirects point to the homepage or to the corresponding URL? For example, does http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html redirect to http://newsite.com or to http://newsite.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html?
If it does redirect to the same URL on newsite.com, I would try using wildcard robots.txt entries to simply block the offending content altogether. For example, if all of the spam points at the styless.asp page, you could simply block styless.asp?* in your robots.txt and prevent Google from ever crawling those spammy links.
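As a minimal sketch of that idea: Google treats a `Disallow` rule as a prefix match, so a plain `Disallow: /styless.asp` already covers every query-string variant of that page. You can sanity-check the rule locally with Python's standard-library robots.txt parser (note it does simple prefix matching and doesn't understand Google's `*` wildcard, which is why the rule below omits it):

```python
# Sketch: verify that a robots.txt rule blocks the spammy URLs.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /styless.asp
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

spam_url = "http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html"
print(rp.can_fetch("*", spam_url))                             # False: blocked
print(rp.can_fetch("*", "http://myolddomain.com/about.html"))  # True: normal pages still crawlable
```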
However, if you are redirecting everything to the homepage, I think you will need to go back to the old webmaster and figure something out. While Google is great at detecting spam, once you are under a penalty it can be difficult to recover. No one is perfect, including Google, and you don't want to be one of their "mistakes".
-
Hi usDragons,
Having too many crawl errors is not healthy. Normally only a few pages get deleted every now and then, but having hundreds or thousands of 404s means something is wrong with the website, and from your description it's obvious that something is wrong. In fact, redirecting unnatural/thin content pages to your website can harm it: in a way, these are links that send traffic (through 301 redirects) to your website, so you need to disavow them.
Because you have no control over the website, you should treat it as an external site that is spamming you. So don't think of it as a site that you own but have no access to.
The disavow tool requires you to create a .TXT file that has an explanation of why you disavow each group of domains/links. So you should explain that these are bad links that send you traffic, and that you tried to request removal of these links and got no help from whoever controls the old site, which I guess is true in your case.
Try to explain everything in your comments (in the .TXT file). (See attached)
Good luck, and I hope I could help in any way.
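A minimal sketch of what such a disavow file can look like (the domain names and dates here are placeholders; Google's documented format is one `domain:` entry or full URL per line, with `#` lines treated as comments):

```text
# Spam links pointing at our former domain myolddomain.com,
# which now 301-redirects to our current site.
# Removal requested from the webmaster on 2016-01-10 and 2016-01-24; no response.
domain:spammy-example-1.com
domain:spammy-example-2.net
http://spammy-example-3.org/hacked-page.html
```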
-
Thanks. We've been through this bad link cleanup process before, but not this kind of situation. Some advice I read said Google doesn't care about those 404s because it's obviously unrelated spam, but I would think having so many crawl errors can't be healthy for the site and I don't like the idea of redirecting them to the new site.
Now the trick is we don't have control of the old site, so we can't verify it in Google Search Console. The old site is just a redirect to the current site, so there is no website to work with. Looks like the disavow tool wants you to select a website property, but we can only use the new domain name. Will the disavow tool understand that these bad links to the old domain are redirected to the new domain name?
-
usDragons, the best way to deal with these links is to use Google's Disavow Links tool to disavow them.
First, you need to identify all of the links, and you can do that by downloading all your links from Open Site Explorer, Majestic.com, ahrefs.com, and Google Search Console. Combine the lists and remove the duplicates.
You'll want to manually review all of them, make a list of the ones you want Google to ignore, then upload a list of the domain names using Google's disavow links tool. Google has more info about their disavow tool here: https://support.google.com/webmasters/answer/2648487?hl=en
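The merge-and-dedupe step can be sketched like this (the tool exports and example URLs are hypothetical; each export is assumed to be a plain list of one URL per line):

```python
# Sketch: merge backlink exports from several tools and drop duplicates,
# normalizing the case of the scheme/host so the same link isn't counted twice.
from urllib.parse import urlsplit, urlunsplit

def normalize(url: str) -> str:
    """Lowercase the scheme and host; drop any fragment."""
    parts = urlsplit(url.strip())
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path, parts.query, ""))

def merge_link_lists(*link_lists):
    """Combine lists of backlink URLs, preserving first-seen order."""
    seen, merged = set(), []
    for links in link_lists:
        for url in links:
            norm = normalize(url)
            if norm and norm not in seen:
                seen.add(norm)
                merged.append(norm)
    return merged

ose = ["http://Spammy-Example-1.com/page.html"]
majestic = ["http://spammy-example-1.com/page.html",
            "http://spammy-example-2.net/other.html"]
print(merge_link_lists(ose, majestic))
```

The combined, deduplicated list is what you then review manually before building the disavow file.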
-
Hi there,
Seems to me that you should follow the standard process when you have unnatural links. You should:
- Compile a list of the links and domains.
- Contact the webmasters of these domains, requesting removal of the links (include the pages where the links appear in your email).
- Save all emails sent to and received from the webmasters.
- For those that don't reply, email them again a couple of weeks later.
- Create a disavow file for the domains you couldn't get links removed from, stating the reason and the dates of your emails.
- Submit the file through the disavow tool.
I know it's not straightforward or fast, but that's how you maintain the public link profile of any website since the Penguin updates started.
I hope it helps!
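The last two steps above can be sketched as a small script (the outreach log, its field names, and the dates are hypothetical placeholders for whatever tracking you keep):

```python
# Sketch: build a disavow file from an outreach log, disavowing only
# the domains where the removal request failed or went unanswered.
import io

outreach_log = [
    {"domain": "spammy-example-1.com", "removed": False,
     "emailed": ["2016-01-10", "2016-01-24"]},
    {"domain": "spammy-example-2.net", "removed": True,
     "emailed": ["2016-01-10"]},
]

def build_disavow(log):
    """Emit a domain: entry for each link we failed to get removed."""
    out = io.StringIO()
    for rec in log:
        if rec["removed"]:
            continue  # the link was taken down; no need to disavow
        out.write("# Removal requested on %s; no response.\n" % ", ".join(rec["emailed"]))
        out.write("domain:%s\n" % rec["domain"])
    return out.getvalue()

print(build_disavow(outreach_log))
```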