Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Spammers created bad links to old hacked domain, now redirected to our new domain. Advice?
-
My client had an old site hacked (let's call it "myolddomain.com"), and the hackers created many links on other hacked sites pointing to URLs such as http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html
The old myolddomain.com site has since been redirected to a different new site, but we still see over a thousand spam links showing up in the new site's Search Console 404 crawl errors report. Also, using the link: operator in Google search, we see many spam links in the results.
Should we be worried about these bad links pointing to our old site and redirecting to 404s on the new site? What is the best way to clean them up? Ignore them? 410s? Something else? I'm seeing conflicting advice out there.
The old site is hosted by the client's previous web developer, who doesn't want to clean anything up on their end without an ongoing hosting contract, and the client doesn't want to pay for additional hosting. So beyond turning the redirects on or off, we don't have much control over anything related to "myolddomain.com".

Thanks in advance for any assistance!
-
Hey, this is Russ here at Moz.
Do the redirects point to the homepage or to the corresponding URL on the new site? For example, does http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html redirect to http://newsite.com or to http://newsite.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html?
If it does redirect to the same URL on newsite.com, I would try using wildcard robots.txt entries to simply block the offending content altogether. For example, if all of the spam links target the styless.asp page, you could block styless.asp?* in your robots.txt and prevent Google from ever crawling those spammy links.
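On the new site, that would look something like this (a minimal sketch - the path comes from the example URL above, so adjust it to wherever the spammy URLs actually land):

    User-agent: *
    Disallow: /styless.asp?*

Since robots.txt rules are prefix matches, Disallow: /styless.asp on its own would already cover every query-string variation; the ?* wildcard (which Google supports) just makes the intent explicit.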
However, if you are redirecting everything to the homepage, I think you will need to go back to the old webmaster and figure something out. While Google is great at detecting spam, once you are under a penalty it can be difficult to recover. No one is perfect, including Google, and you don't want to be one of their "mistakes".
-
Hi usDragons,
Having too many crawl errors is not healthy. A few pages getting deleted every now and then is normal, but having hundreds or thousands of 404s means something is wrong with the website, and from your description it's obvious that something is wrong. In fact, redirecting unnatural/thin-content pages to your website can harm it - in effect these are links sending traffic (through the 301 redirects) to your site - so you need to disavow them.
Because you have no control over the website, you should treat it as an external site that is spamming you. So don't think of it as a site that you own but have no access to.
The disavow tool requires you to create a .TXT file with an explanation of why you are disavowing each group of domains/links. So you should explain that these are bad links that send you traffic, that you tried to request removal of these links, and that you got no help from whoever controls the old site - which I guess is true in your case.
Try to explain everything in your comments in the .TXT file.
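For reference, a commented disavow file looks something like this (the domains, URL, and dates of contact below are placeholders, not real data):

    # Spam links created by hackers on myolddomain.com, which now 301-redirects to our site
    # Removal requested from the previous host, no response received after follow-up
    domain:spammydomain-example1.com
    domain:spammydomain-example2.net
    http://spammydomain-example3.com/hacked-page.html

Each domain: line tells Google to ignore every link from that domain, while a bare URL disavows just that one page.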
Good luck, and I hope I could help in any way.
-
Thanks. We've been through this bad link cleanup process before, but not this kind of situation. Some advice I read said Google doesn't care about those 404s because it's obviously unrelated spam, but I would think having so many crawl errors can't be healthy for the site and I don't like the idea of redirecting them to the new site.
Now the trick is we don't have control of the old site, so we can't verify it in Google Search Console. The old site is just a redirect to the current site, so there is no website to work with. Looks like the disavow tool wants you to select a website property, but we can only use the new domain name. Will the disavow tool understand that these bad links to the old domain are redirected to the new domain name?
-
usDragons, the best way to deal with these links is to use Google's Disavow Links tool to disavow them.
First, you need to identify all of the links, and you can do that by downloading all your links from Open Site Explorer, Majestic.com, ahrefs.com, and Google Search Console. Combine the lists and remove the duplicates.
You'll want to manually review all of them, make a list of the ones you want Google to ignore, then upload a list of the domain names using Google's disavow links tool. Google has more info about their disavow tool here: https://support.google.com/webmasters/answer/2648487?hl=en
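If the combined exports get large, a short script can take care of the merge and de-duplication. A minimal sketch, assuming each export has been saved as a plain text file with one backlink URL per line (the file names are just examples):

    from urllib.parse import urlparse

    # Hypothetical export files, one backlink URL per line
    files = ["ose_links.txt", "majestic_links.txt", "ahrefs_links.txt", "gsc_links.txt"]

    domains = set()
    for path in files:
        with open(path) as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                # Reduce each backlink URL to its referring domain, dropping any "www." prefix
                host = urlparse(url).netloc.lower()
                if host:
                    domains.add(host[4:] if host.startswith("www.") else host)

    # One unique referring domain per line, ready for manual review
    with open("referring_domains.txt", "w") as out:
        out.write("\n".join(sorted(domains)) + "\n")

From there you can review the list and keep only the domains you actually want Google to ignore.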
-
Hi there,
Seems to me that you should follow the standard process when you have unnatural links. You should:
- Compile a list of links and domains.
- Contact the Webmasters of these domains, requesting removal of the links (include in your email the pages where these links appear)
- Save all your sent and received emails to/from Webmasters
- For the ones that don't reply, email them one more time a couple of weeks later
- Create a disavow file for the domains you couldn't get links removed from, stating the reason and the dates of your emails.
- Submit the disavow file through the disavow tool
I know it's not straightforward or fast, but that's how you maintain the public link profile of any website since the Penguin updates started.
I hope it helps