Thousands of 301 redirections - .htaccess alternatives?
-
Hi guys,
I just want to ask whether there are other possible issues or problems (besides server load) once we implement 301 redirections for 10,000+ URLs using .htaccess. Are there any alternatives?
-
Thank you for your answer! I will share it with our IT team.
-
Why don't you just set up a VPS running NGINX as a stream handler/reverse proxy in front of your IIS web server?
- https://www.digitalocean.com/community/tutorials/how-to-set-up-nginx-load-balancing
- http://www.iborgelt.com/windows-home-server-behind-nginx-reverse-proxy/
You're just using the VPS as an interface to handle your redirects, and at $5 a month you can't beat it. I'm sure if your IT department googles "nginx reverse proxy iis" they will get the idea.
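To sketch the idea (all hostnames, addresses, and paths below are made-up placeholders, not from the original setup): the VPS answers the public traffic, serves the 301s from a lookup map, and proxies everything else through to IIS untouched.

```nginx
# /etc/nginx/conf.d/redirects.conf -- all names and addresses are hypothetical

# Look the request path up in a key/value map; empty string means "no redirect".
map $request_uri $redirect_target {
    default                 "";
    /old-page.html          /new-page/;
    /old-category/widget    /products/widget/;
    # ...for 10,000+ entries, generate these lines with a script
}

server {
    listen 80;
    server_name www.example.com;

    # Serve the 301 if the URI was found in the map...
    if ($redirect_target) {
        return 301 $redirect_target;
    }

    # ...otherwise hand the request off to the IIS box.
    location / {
        proxy_pass http://10.0.0.2;          # internal address of the IIS server
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```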
-
Hi guys, I have a similar problem, but on IIS7. Our IT department says our 301 redirections file is at its max size in the web.config. They could increase the limit, but say it will impact page load speed negatively. What's the impact on page speed of having 5,000 to 10,000 URLs in the rewrite map?
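For context, the kind of rewrite map I mean looks roughly like this in web.config (the URLs below are just placeholders, and I believe IIS can also load the map from a separate file to keep web.config itself small):

```xml
<!-- web.config sketch (IIS URL Rewrite module); example paths are placeholders -->
<system.webServer>
  <rewrite>
    <rewriteMaps>
      <rewriteMap name="Redirects">
        <add key="/old-page" value="/new-page" />
        <add key="/old-category/item" value="/new-category/item" />
        <!-- ...thousands more pairs -->
      </rewriteMap>
    </rewriteMaps>
    <rules>
      <rule name="Redirect from map" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <!-- fires only when the request URI exists as a key in the map -->
          <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)" />
        </conditions>
        <action type="Redirect" url="{C:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```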
They're also looking at a solution that consults the redirections only when the site returns a 404, so a request would hit a 404, then a 301, then a 200. I am a little scared of this SEO-wise. Would it be a problem?
Thanks!
-
Putting aside server load / config issues, and speaking from a purely SEO point of view:
No, you shouldn't have any major issues with that many 301s. However, you might find that, depending on the size of your site and the frequency of Googlebot's visits, some of these pages take a long time (months) to drop out of the index and be replaced by their newer alternatives. This normally isn't cause for alarm.
In some instances you might end up with pages that now have no links pointing to them (as their parent categories were all redirected too) and so seem to get stuck, never being recrawled by Google. In a couple of instances I have had success using XML sitemap files that include just these stuck pages (the old URLs still in the index) to prompt Google to recrawl them.
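As a rough illustration, such a recrawl sitemap is just a standard XML sitemap that lists only the old, stuck URLs (the addresses below are invented examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- only the stuck old URLs, to nudge Googlebot into recrawling them -->
  <url>
    <loc>http://www.example.com/old-category/</loc>
  </url>
  <url>
    <loc>http://www.example.com/old-category/old-page.html</loc>
  </url>
</urlset>
```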
There is also the Google Webmaster Tools "Fetch as Googlebot" feature, which then prompts you to "Submit to index"; you can use it to prompt recrawls on a per-page basis (but you have limited credits here, so save it for the more important pages).
Best of luck!
-
The main benefit of this would be in reducing server load / response time, and potentially in the maintainability of the server config.
The most important factor on this side of things is how many separate rules your .htaccess file contains for those 10,000 redirects.
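To illustrate (the paths below are invented): thousands of one-off rules are far more expensive than a handful of pattern-based ones, because Apache re-reads and evaluates .htaccess on every request.

```apache
# Worst case: 10,000 one-off lines, each checked on every request
Redirect 301 /old-page-1.html http://www.example.com/new-page-1/
Redirect 301 /old-page-2.html http://www.example.com/new-page-2/
# ...

# Much cheaper: one regex rule covering a whole renamed section
RedirectMatch 301 ^/old-category/(.*)$ http://www.example.com/new-category/$1
```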
-
Hi Kevin,
What's the difference between this method and the standard 301 redirection using .htaccess?
-
Do you guys have a step-by-step guide for implementing 301 redirection using this httpd main server config file?
-
Well, if you're on a VPS/dedicated machine, I would take a look at http://httpd.apache.org/docs/current/rewrite/rewritemap.html
RewriteMap has virtually zero effect on load time, unlike having the same rules in .htaccess, where those redirect rules would eat into performance on every request. Remember that 301s are cached by the browser, so while you're testing make them all 302s until you're happy, and then watch your rewrite log when you launch. If you need help, let us know.
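Here's a minimal sketch of the idea, assuming an Apache main server config and a plain-text map file (the file paths are assumptions):

```apache
# httpd.conf / vhost config -- RewriteMap is NOT allowed in .htaccess
RewriteEngine On

# redirects.txt holds one "old-path new-path" pair per line; for very
# large maps, convert it to a dbm hash with httxt2dbm for faster lookups.
RewriteMap redirects txt:/etc/apache2/redirects.txt

# Redirect only when the requested path exists in the map.
# Use R=302 while testing (301s get cached by browsers), then flip to R=301.
RewriteCond ${redirects:$1} !=""
RewriteRule ^(.*)$ ${redirects:$1} [R=302,L]
```

While testing, checking headers from the command line (e.g. curl -I http://www.example.com/old-page) sidesteps the browser's redirect cache entirely.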
This does take some know-how and learning, but you should be able to get it done in a few days (testing, reading documentation).
-
Do you have access to the httpd main server config file? If so, please read Apache HTTP Server Tutorial: .htaccess files.
".htaccess files should be used in a case where the content providers need to make configuration changes to the server on a per-directory basis, but do not have root access on the server system. In the event that the server administrator is not willing to make frequent configuration changes, it might be desirable to permit individual users to make these changes in .htaccess files for themselves. This is particularly true, for example, in cases where ISPs are hosting multiple user sites on a single machine, and want their users to be able to alter their configuration.
However, in general, use of .htaccess files should be avoided when possible. Any configuration that you would consider putting in a .htaccess file, can just as effectively be made in a <Directory> section in your main server configuration file."
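As a small illustration of that last point, the same redirects can live in the main config or a vhost block, loaded once at startup instead of re-read from disk per request (the paths below are invented):

```apache
# sites-available/example.conf -- parsed once when Apache starts
<VirtualHost *:80>
    ServerName www.example.com

    RedirectPermanent /old-page.html /new-page/
    RedirectMatch 301 ^/old-category/(.*)$ /new-category/$1
</VirtualHost>
```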