Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Hacked website - Dealing with 301 redirects and a large .htaccess file
-
One of my client's websites was recently hacked and I've been dealing with the aftereffects. The website is now clean of malware and I have already appealed to Google about the malware issue. The current problem is dealing with the 20,000+ crawl errors: garbage links that were created by the hack.
How does one go about creating the 301 redirects needed for all of these 404 crawl errors? I'm already noticing increased load times on the website due to a rather large .htaccess file holding a couple of thousand 301 redirects, and I fear my client's site performance, and its SEO performance, will take a hit as well.
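Most of the answers below recommend letting these URLs die rather than redirecting them, but either way, if the garbage URLs share a recognisable pattern, a single mod_rewrite rule can stand in for thousands of individual .htaccess lines. A minimal sketch, assuming mod_rewrite is enabled and a hypothetical /cheap-pills/ path prefix:

# Sketch only: one pattern rule returning 410 Gone for everything under
# the (hypothetical) spam directory, instead of one line per URL.
RewriteEngine On
RewriteRule ^cheap-pills/ - [G]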
-
This is the correct answer.
To expand on this slightly, just make sure none of the 404s are internal (i.e. there are no links on your own site pointing to one of these dodgy pages as a result of the hack) and you're all good.
Remove the entries from your .htaccess file, so they don't have to be parsed on every request, and let any external links to dodgy pages 404. This sort of circumstance is exactly what 404s were made for!
The only site at risk of a ranking drop from these 404s is the one pointing to those dodgy pages - who cares about your hackers' rankings?
-
So the robots.txt part could come at the end, but in my case it worked fine too.
-
Just a correction here. I agree with all the items above, with one very, very important change.
DO NOT set the cleaned-up URLs to disallow in your robots.txt.
If you do not allow Google to crawl the pages, Google will not see that the links were removed, that the pages now return a 4xx, and so on. If you were to disallow all those pages, all the clean-up work you have done would go unseen by Google and would be for naught.
If you later want to disallow those pages, that's fine, but you need to let Google see your clean-up work first.
-
Hi,
I just finished a similar job.
What you should do:
- collect all the bad "pages" and the links pointing to them
- find a pattern, such as a common directory
- return 410 (Gone) for them, ideally at the directory level, not 404
- disallow those directories in robots.txt (see the sketch after this list, and note the timing caveat in the reply above)
- push all the affected pages and links to be recrawled
- request removal from Google's index
- done (you'll need to wait some time)
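A minimal robots.txt sketch for the disallow step, reusing the hypothetical /cheap-pills/ directory from the sketch under the question; per the correction above, only add this once Google has recrawled the cleaned-up URLs and seen the 410s:

User-agent: *
Disallow: /cheap-pills/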
The important thing is to get rid of all the bad links pointing to those pages. If you do that, there'll be no issues. However, this could also be ongoing negative SEO; if you need help with that, PM me.
Krzysztof
-
If they are garbage links, why are you redirecting them? Let them 404. Not-found pages do not, in and of themselves, lead to penalties.
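If you want those 404s to at least be friendly for any humans who follow the dead links, a one-line .htaccess sketch (the /not-found.html page is hypothetical):

# Serve a custom error page for any request that ends in a 404:
ErrorDocument 404 /not-found.html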
Related Questions
-
301 Redirect in breadcrumb. How bad is it?
Hi all, How bad is it to have a link in the breadcrumb that 301 redirects? We had to create some hidden category pages in our ecommerce platform, BigCommerce, to get our category pages to display in a certain format. But although the category page was set to not visible in the BigCommerce admin, its URL still showed in the live site's breadcrumb. So we set a 301 redirect on it so that it didn't produce a 404. However, we have lost a lot of SEO ground over the past few months. Could this be why? Is it bad to have a 301 redirect in the breadcrumb?
Intermediate & Advanced SEO | | oceanstorm0 -
301 redirects Ruby on Rails
Can anyone point me to the best way to implement 301 redirects on a Ruby on Rails website?
Intermediate & Advanced SEO | | brianvest0 -
Several 301 Redirects to Same Page
Hi, I have 3 pages we won't use anymore on our website. Let's call them url A, url B and url C. To keep their SEO strength on our domain, I've thought about redirecting all of them to url D. From what I understand, a 301 redirect passes about 85-90% of the link SEO juice. So if I redirect 3 URLs to the same page, does url D receive (approximately) all of their link SEO juice added up?
e.g. future url D juice = 100% current url D juice + 85% url A juice + 85% url B juice + 85% url C juice
Is this the best practice, or is there a better way? Cheers,
Intermediate & Advanced SEO | | viatrading1 0 -
Remove URLs that 301 Redirect from Google's Index
I'm working with a client who has 301 redirected thousands of URLs from their primary subdomain to a new subdomain (these are unimportant pages with regards to link equity). These URLs are still appearing in Google's results under the primary domain, rather than the new subdomain. This is problematic because it's creating an artificial index bloat issue. These URLs make up over 90% of the URLs indexed. My experience has been that URLs that have been 301 redirected are removed from the index over time and replaced by the new destination URL. But it has been several months, close to a year even, and they're still in the index. Any recommendations on how to speed up the process of removing the 301 redirected URLs from Google's index? Will Google, or any search engine for that matter, process a noindex meta tag if the URL's been redirected?
Intermediate & Advanced SEO | | trung.ngo0 -
Too many 301 redirects?
Hey, My company currently has one chief website and about 500-600 other domains that all feature the same material as the chief website. These domains have been around for about 5 years and have actually picked up some link traffic. All of these identical web-pages use rel=canonical, but I was wondering whether I would be better served, for SEO purposes, by 301 redirecting all of these sites to their respective pages on our chief website. If I add 500 301 redirects, will the major search engines consider this to be black-hat link-building, even though the sites are related and technically already feature the same content? For example, the chief website is www.1099pro.com and I would 301 redirect the below sites to the chief site: 1099softwarepro.com 1099softwarepro.info 1099softwarepro.net 1099softwarepro.biz 1099softwareprofessionals.com 1099softwareprofessionals.info ...you get the point
Intermediate & Advanced SEO | | Stew2220 -
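A hedged .htaccess sketch of the host-level consolidation being described, assuming the extra domains all point at the same server as the chief site:

# Sketch: send any hostname other than the chief domain to the
# equivalent path on www.1099pro.com with a single 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.1099pro\.com$ [NC]
RewriteRule ^(.*)$ https://www.1099pro.com/$1 [R=301,L]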
Geoip redirection, 301 or 302?
Hello all, Let me first try to explain what our company does and what it is trying to achieve. Our company has an online store that sells products in 3 different countries, with two languages for each country. Currently we have one site, open to all countries; what we are trying to achieve is to make 3 different stores for these 3 countries, so we can have better control over the prices in each one. We are going to use GeoIP to redirect each user to the local store for their country. The suggested new structure is to add sub-folders as follows:
www.example.com/ca-en
www.example.com/ca-fr
www.example.com/us-en
...
If a visitor is located outside these 3 countries, she'll be redirected to the root directory www.example.com/en. We can't afford to expand our SEO team to optimize the new pages for each local market; that's not the priority for now, as the main objective is to be able to control prices per market. So, to eliminate the duplicate-content issue, we'll use canonical tags. Now, knowing our objective for the new URL structure, I have two questions:
1- Which redirect should we use, 301 or 302? If we choose 301, which version of the site will get the link juice (i.e. /ca-en or /us-en)? If we choose 302, will the link juice remain with the original links? And is it healthy to use 302 for long-term redirections?
2- Knowing that Google's bots come from US IPs, does that mean the other versions of the site won't be crawled (e.g. www.example.com/ca-fr)? This is especially important for us as we are using AdWords, and unindexed pages will affect our Quality Score badly.
I'd also like to know if you have a better structure in mind than the one proposed. Your help is highly appreciated. Thanks in advance.
Intermediate & Advanced SEO | | ajarad 0 -
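For what it's worth, a sketch of the country-based redirect itself, assuming the legacy mod_geoip Apache module is installed and enabled in the server config (it exposes the visitor's country in the GEOIP_COUNTRY_CODE variable; the /ca-en/ path follows the structure proposed above):

RewriteEngine On
# Send Canadian visitors landing on the root URL to the Canadian store.
# A 302 is shown because the destination varies per visitor rather than
# the URL itself having permanently moved.
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^CA$
RewriteRule ^$ /ca-en/ [R=302,L]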
How to set up 301 redirect for URL with question mark
I have encountered an issue with a 301 redirect in the .htaccess file. I need to redirect the following URL: http://www.domain.com/?specifications=colours/page/3 to: http://www.domain.com/colours The 301 redirect command I wrote in the .htaccess file is as follows: Redirect 301 /?specifications=colours/page/3 http://www.domain.com/colours It doesn't work at the moment. What is the correct way to set up the 301 redirect here? Your help will be sincerely appreciated!
Intermediate & Advanced SEO | | robotseo0 -
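For reference, mod_alias's Redirect directive never sees the query string, which is why the directive quoted above does nothing; a mod_rewrite sketch that matches it explicitly (domain.com as in the question):

RewriteEngine On
# Redirect 301 only matches the path; test the query string instead:
RewriteCond %{QUERY_STRING} ^specifications=colours/page/3$
# The trailing "?" on the target drops the old query string:
RewriteRule ^$ http://www.domain.com/colours? [R=301,L]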
Old Redirecting Website Still Showing In SERPs
I have a client, a plumber, who at one point bought another plumbing company (and that company's domain). This other company is very old and has a lot of name recognition, so they created a dedicated page for it within their main website and redirected the other company's old domain to that page. This has worked fine, in that this page on the main site is now #1 when you search for the old company's name. But for some reason the old domain comes up #2 (despite the fact that it's redirecting). Now, I could understand if the redirect had only been set up recently, but I'm reasonably sure this happened about a year ago. Could it be due to the fact that there are many sites out there still linking to that old domain? Thanks in advance!
Intermediate & Advanced SEO | | VTDesignWorks1