Spammers created bad links to old hacked domain, now redirected to our new domain. Advice?
-
My client had an old site hacked (let's call it "myolddomain.com"), and the hackers created many links on other hacked sites, such as http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html
The old myolddomain.com site has since been redirected to a different, new site, but we still see over a thousand spam links showing up in the new site's Search Console 404 crawl errors report. Also, using the link: operator in Google search, we see many results of spam links.
Should we be worried about these bad links pointing to our old site and redirecting to 404s on the new site? What is the best recommendation to clean them up? Ignore? 410s? Other? I'm seeing conflicting advice out there.
The old site is hosted by the client's previous web developer, who doesn't want to clean anything up on their end without an ongoing hosting contract, and the client doesn't want to pay for any additional hosting. So beyond turning the redirects on or off, we don't have much control over anything related to "myolddomain.com".

Thanks in advance for any assistance!
-
Hey, this is Russ here at Moz.
Do the redirects point to the homepage or to the corresponding URL on the new site? For example, does http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html redirect to http://newsite.com or to http://newsite.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html?
If it does redirect to the same URL on newsite.com, I would try using wildcard robots.txt entries to simply block the offending content altogether. For example, if all the spam hangs off the styless.asp page, you could block styless.asp?* in your robots.txt and prevent Google from ever crawling those spammy links.
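A minimal sketch of what that entry could look like in the new site's robots.txt, assuming all of the spam URLs hang off /styless.asp:

```
User-agent: *
# Rules are prefix matches, so the first line alone blocks
# /styless.asp with or without a query string; the wildcard
# form below is equivalent and just makes the intent explicit.
Disallow: /styless.asp
Disallow: /styless.asp?*
```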
However, if you are redirecting everything to the homepage, I think you will need to go back to the old webmaster and figure something out. While Google is great at detecting spam, once you are under a penalty it can be difficult to recover. No one is perfect, including Google, and you don't want to be one of their "mistakes".
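For reference, whether the old domain redirects path-to-path or sends everything to the homepage comes down to how its redirect rules are written. A hypothetical Apache .htaccess sketch for the old domain's host (domain names are placeholders):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?myolddomain\.com$ [NC]
# Path-preserving: /styless.asp?... on the old domain lands on the
# same path on the new domain (the query string carries over by default).
RewriteRule ^(.*)$ http://newsite.com/$1 [R=301,L]

# Homepage-only alternative: send every request to the new root instead.
# RewriteRule ^ http://newsite.com/ [R=301,L]
```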
-
Hi usDragons,
Having too many crawl errors is not healthy. Usually a few pages are deleted every now and then, but having hundreds or thousands of 404s means something is wrong with the website, and from your description it's obvious that something is wrong. In fact, redirecting unnatural/thin-content pages to your website can harm it, since these are, in effect, links passing signals (through the 301 redirects) to your website, so you need to disavow them.
Because you have no control over the website, you should treat it as an external site that is spamming you. So don't think of it as a site that you own but have no access to.
The disavow tool requires you to create a .txt file, in which you can comment on why you are disavowing each group of domains/links. So you should explain that these are bad links that send you traffic, that you tried to request removal of the links, and that you got no help from whoever controls the site, which I guess is true in your case.
Try to explain everything in your comments in the .txt file.
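A hypothetical sketch of what such a file could look like (the domains, URL, and details are placeholders):

```
# These links were built by spammers to our old hacked domain and now
# reach this site through its 301 redirect. We have no control over
# the old host. Removal was requested from the webmasters below;
# no response was received.
domain:hacked-example-1.com
domain:hacked-example-2.net
http://hacked-example-3.org/spam-page.html
```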
Good luck, and I hope I could help in some way.
-
Thanks. We've been through this bad-link cleanup process before, but not in this kind of situation. Some advice I've read says Google doesn't care about those 404s because they're obviously unrelated spam, but I would think having so many crawl errors can't be healthy for the site, and I don't like the idea of redirecting them to the new site.
Now the trick is that we don't have control of the old site, so we can't verify it in Google Search Console. The old site is just a redirect to the current site, so there is no website to work with. It looks like the disavow tool wants you to select a website property, but we can only use the new domain name. Will the disavow tool understand that these bad links to the old domain are redirected to the new domain name?
-
usDragons, the best way to deal with these links is to use Google's Disavow Links tool to disavow them.
First, you need to identify all of the links, and you can do that by downloading all your links from Open Site Explorer, Majestic.com, Ahrefs.com, and Google Search Console. Combine the lists and remove the duplicates.
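A minimal Python sketch of the combine-and-dedupe step, assuming each tool's backlinks were exported to a plain-text file with one URL per line (the filenames are placeholders, and real exports are usually CSVs, so adapt the parsing to your files):

```python
# Merge backlink exports from several tools and dedupe by domain.
from urllib.parse import urlparse

files = ["ose_links.txt", "majestic_links.txt",
         "ahrefs_links.txt", "gsc_links.txt"]

domains = set()
for path in files:
    with open(path, encoding="utf-8") as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            # urlparse needs a scheme to populate netloc.
            if "://" not in url:
                url = "http://" + url
            host = urlparse(url).netloc.lower()
            # Drop a leading "www." so duplicates collapse to one entry.
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)

# One domain per line, ready for manual review before disavowing.
with open("domains_to_review.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(domains)) + "\n")
```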
You'll want to manually review all of them, make a list of the ones you want Google to ignore, then upload a list of the domain names using Google's disavow links tool. Google has more info about their disavow tool here: https://support.google.com/webmasters/answer/2648487?hl=en
-
Hi there,
It seems to me that you should follow the standard process for dealing with unnatural links. You should:
- Compile a list of the linking domains and URLs.
- Contact the webmasters of those domains to request removal of the links (include the pages where the links appear in your email).
- Save all of the emails you send to and receive from the webmasters.
- For any that don't reply, email them one more time a couple of weeks later.
- Create a disavow file for the domains you couldn't get links removed from, stating the reason and the dates of your emails.
- Submit the disavow file through the disavow tool.
I know it's neither straightforward nor fast, but that's how you have to maintain the public link profile of any website since the Penguin updates started.
I hope it helps.