Hacked website - Dealing with 301 redirects and a large .htaccess file
-
One of my client's websites was recently hacked and I've been dealing with the aftereffects. The website is now clean of malware and I've already appealed to Google about the malware issue. The current problem is dealing with the 20,000+ crawl errors, which are garbage URLs created by the hack.
How does one go about dealing with all the 301 redirects I would need to create for all these 404 crawl errors? I'm already noticing an increased load time on the website due to a rather large .htaccess file with a couple of thousand 301 redirects already in place, and I fear my client's site performance and SEO will take a hit as a result.
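For illustration only, a minimal sketch of the two approaches, assuming the injected URLs share a hypothetical /cheap-pills/ directory (the paths below are made up, not taken from the affected site):
# A per-URL block: .htaccess is re-read and every directive is evaluated on each request,
# which is where the extra load time comes from.
Redirect 301 /cheap-pills/buy-now-123.html /
Redirect 301 /cheap-pills/buy-now-124.html /
# ...thousands more lines like these.
# If the garbage URLs share a pattern, a single rule can stand in for all of them:
RedirectMatch 301 ^/cheap-pills/ https://www.example.com/
That said, as the answers below note, redirecting garbage URLs may not be necessary at all.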
-
This is the correct answer.
To expand on this slightly, just make sure none of the 404s are internal (i.e., there are no links on your own site pointing to one of these dodgy pages as a result of the hack) and you're all good.
Remove the entries from your .htaccess file to avoid having them parsed on every request, and let any external links to dodgy pages 404. This sort of circumstance is exactly what 404s are made for!
The only site at risk of a ranking drop from these 404s is the one pointing to those dodgy pages - who cares about your hackers' rankings?

-
So the robots.txt step could come at the end, but in my case it worked fine this way too.
-
Just a correction here. I agree with all the items above, with one very important change.
DO NOT set the cleaned-up URLs to disallow in your robots.txt.
If you do not allow Google to crawl the pages, Google will not see that the links were removed, that the pages now return 4xx, etc. If you were to disallow all those pages, all the clean-up work you have done would never be seen by Google and would be for naught.
If you later want to disallow those pages, that would be fine, but you need to let Google see your clean-up work first.
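To make the ordering concrete, a disallow along these lines (reusing the hypothetical /cheap-pills/ directory from the sketch above) should only be added after Google has recrawled those URLs and seen the 4xx responses:
User-agent: *
Disallow: /cheap-pills/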
-
Hi
I just finished a similar job.
What you should do:
- collect all the bad "pages" and the links pointing to them
- find a pattern, like some kind of directory
- return 410 (Gone) for them (the directories, I believe), not 404 (see the sketch after this list)
- set robots.txt to disallow those directories
- push all affected pages and links to be recrawled
- request removal from the Google index
- done (you need to wait some time)
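A minimal .htaccess sketch of the 410 step, again assuming the injected pages live under a hypothetical /cheap-pills/ directory:
# Return 410 Gone for everything under the hacked directory:
RedirectMatch gone ^/cheap-pills/
(Per the correction above, hold off on the robots.txt disallow until Google has recrawled these URLs and seen the 410s.)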
The important thing is to get rid of all the bad links pointing to those pages. If you do that, there'll be no issues. However, this could be ongoing negative SEO. If you need help with that, PM me.
Krzysztof
-
If they are garbage links, why are you redirecting them? Let them 404. Having not-found pages does not lead to penalties in and of itself.
Related Questions
-
301 Redirects to relative URLs not absolute a problem?
Hi, we recently did a migration and a lot of content changed locations (see: https://d.pr/i/RvqI81). Basically, the 301 goes to the correct location, but it's a relative URL (as you can see from the screenshot) rather than an absolute URL. Do you think this is a high-priority issue from an SEO standpoint? Should we get the developer to change the redirects to absolute? Cheers.
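For reference, if the redirects live in .htaccess, the explicitly absolute form looks like this (domain and paths are hypothetical, since the question does not say how the redirects are implemented):
Redirect 301 /old-section/page/ https://www.example.com/new-section/page/
Writing the full URL as the target removes any ambiguity about what ends up in the Location header.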
Intermediate & Advanced SEO | cathywix
-
Htaccess - Redirecting TAG or Category pages
Hello fellow Mozzers, we have an issue redirecting some /TAG and /Category pages to inner pages. As an example we use: RedirectMatch 301 /category/Sample-Category(.*) https://OurDomain.com.au/New-Page//$1 and that works well. The issue is that we have other categories and tags named similarly to /Sample-Category. As an example, if we try to redirect /Sample-Category-1 to /New-Page-1, it will not work and redirects to /New-Page. I assume this is because /Sample-Category is already being redirected, so anything after /Sample-Category, like -1 or -2 or -3 etc., is not recognized. Anyone know of a workaround?
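One common workaround, sketched here with the hypothetical names from the question: list the more specific rule first and anchor both patterns so the shorter category name can no longer swallow the longer ones:
RedirectMatch 301 ^/category/Sample-Category-1(/.*)?$ https://OurDomain.com.au/New-Page-1$1
RedirectMatch 301 ^/category/Sample-Category(/.*)?$ https://OurDomain.com.au/New-Page$1
With the trailing anchor, /category/Sample-Category-1 no longer matches the /Sample-Category rule, because that pattern only allows a slash or the end of the URL after the category name.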
Intermediate & Advanced SEO | Jes-Extender-Australia
-
Moving html site to wordpress and 301 redirect from index.htm to index.php or just www.example.com
I found duplicate page content when using the Moz crawl tool; see below.
http://www.example.com
Page Authority 40
Linking Root Domains 31
External Link Count 138
Internal Link Count 18
Status Code 200
1 duplicate
http://www.example.com/index.htm
Page Authority 19
Linking Root Domains 1
External Link Count 0
Internal Link Count 15
Status Code 200
1 duplicate
I have recently transferred my old HTML site to WordPress. To keep the URLs the same I am using a plugin which appends .htm to the end of each page. My old site's home page was index.htm. I have created index.htm in WordPress as well, but now there is a conflict of duplicate content. I am using the latest posts as my home page, which is index.php.
Question 1.
Should I also use a 301 redirect in the .htaccess file to transfer index.htm's page authority (19) to www.example.com? If yes, do I use
Redirect 301 /index.htm http://www.example.com/index.php
or
Redirect 301 /index.htm http://www.example.com
Question 2.
Should I change my "Home" menu link to http://www.example.com instead of http://www.example.com/index.htm? That would fix the duplicate content, as index.htm does not exist anymore. Is there a better option? Thanks.
Intermediate & Advanced SEO | gozmoz
-
New Site (redesign) Launched Without 301 Redirects to New Pages - Too Late to Add Redirects?
We recently launched a redesign/redevelopment of a site but failed to put 301 redirects in place for the old URLs. It's been about 2 months. Is it too late to even bother worrying about it at this point? The site has seen a notable decrease in traffic/visits, perhaps due to this issue. I assume that once the search engines get an error on a URL, they will remove it from search results after a period of time. I'm just not sure if they will try to re-crawl those old URLs at some point; if so, it may be worth it to have those 301 redirects in place. Thank you.
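If the decision is to add them now, the late redirects are just ordinary one-per-URL (or pattern-based) rules mapping old paths to their new equivalents; a minimal .htaccess sketch with hypothetical paths:
Redirect 301 /old-services.html https://www.example.com/services/
Redirect 301 /about-us.htm https://www.example.com/about/
The old URLs themselves can usually be recovered from analytics, the old sitemap, or the crawl-error report rather than guessed.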
Intermediate & Advanced SEO | BrandBuilder
-
301 Redirect Showing Up as Thousands Of Backlinks?
Hi Everyone, I'm currently doing quite a large backlink audit on my company's website and there's one thing that's bugging me. Our website used to be split into two domains for separate areas of the business, but we have since merged them into one domain and 301 redirected the old domain to the main one. But now, both GWT and Majestic are telling me that I've got 12,000 backlinks from that domain? This domain didn't even have 12,000 pages when it was live, and I only did specific 301 redirects (i.e., for specific URLs, not an overall domain-level 301 redirect) for about 50 of the URLs, with all the rest being redirected to the homepage. Therefore I'm quite confused about why it's showing up as so many backlinks; old redirects I've done don't usually show as a backlink at all. UPDATE: I've got some more info on the specific backlinks. But now my question is: is having this many backlinks/redirects from a single domain going to be viewed negatively in Google's eyes? I'm currently doing a reconsideration request and would look to fix this issue if having so many backlinks from a single domain would be against Google's guidelines. Does anybody have any ideas? Probably something very obvious. Thanks! Sam
Intermediate & Advanced SEO | Sandicliffe
-
Can an incorrect 301 redirect or .htaccess code cause 500 errors?
Google Webmaster Tools is showing the following message: "Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request." Before I contact the person who manages the server and hosting (essentially asking if the error is on his end), is there a chance I could have created an issue with an incorrect 301 redirect or other code added to .htaccess incorrectly? Here is the 301 redirect code I am using in .htaccess:
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index.html|default.asp)\ HTTP/
RewriteRule ^(([^/.]+/)*)(index|default) http://www.example.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} !^(www.example.com)?$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
Could adding the following code after that in the .htaccess potentially cause any issues?
# BEGIN EXPIRES
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 10 days"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/plain "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 week"
ExpiresByType application/x-icon "access plus 1 year"
</IfModule>
# END EXPIRES
(Edit) I'd like to add that there is a WordPress blog on the site too, at www.example.com/blog, with the following code in its .htaccess:
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
# END WordPress
Thanks.
Intermediate & Advanced SEO | kimmiedawn
-
301 redirection pointing to noindexed pages
I have a rather unusual situation where a recently launched affiliate site does not have any unique content, as it's all syndicated. For that reason we are currently using the noindex,nofollow meta tags to keep the pages out of the search engines' index until we create unique content for them. The problem is that, due to a very tight timeframe with rebranding, we are looking at 301 redirecting (on a page-to-page basis) another high-authority legacy domain to this new site before we have had a chance to add unique content and remove the noindex,nofollow tags. I would assume that any link authority normally passed through the 301s would be lost in this scenario, but I'm uncertain what the broader impact might be. Has anyone dealt with a similar scenario? I know this scenario is not ideal, and I would rather wait until the unique content is up and the noindex tags are removed before launching the 301 redirects of the legacy domain, but there are a number of competing priorities at play outside of SEO.
Intermediate & Advanced SEO | LosNomads
-
301 vs 410 redirect: What to use when removing a URL from the website
We are in the process of determining how to handle URLs that are completely removed from our website. Think of these as listings that have an expiration date (i.e. http://www.noodle.org/test-prep/tphU3/sat-group-course). What is the best practice for removing these listings (assuming not many people are linking to them externally)? 1. 301 to a general page (i.e. http://www.noodle.org/search/test-prep), 2. do nothing and leave them up but remove them from the sitemap (as they are no longer useful from a user perspective), or 3. return a 404 or 410?
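If the 410 route were chosen, the per-URL .htaccess line is as simple as the following (using the example listing URL from the question; this only illustrates the mechanics, not which option is best):
Redirect gone /test-prep/tphU3/sat-group-course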
Intermediate & Advanced SEO | abargmann