Long term plan for a large htaccess file with 301 redirects
-
We set up a fairly large .htaccess file in February for a site, containing over 2,000 lines of 301 redirects from old product URLs to new ones.
The old URLs still get a lot of traffic from product review sites and other reputable sites whose links we can't change.
We are now trying to reduce page load times, and we're ticking all of the boxes apart from the size of the .htaccess file, which seems to be causing a considerable hang on load times. The file is currently 410KB!
My question is: what should I do in terms of a long-term strategy, and has anyone come across a similar problem?
At the moment I am inclined to remove the 2,000 lines of individual redirects and put in a 'catch-all' whereby anything from the old site goes to the new site's homepage.
Example code:
RedirectMatch 301 /acatalog/Manbi_Womens_Ear_Muffs.html /manbi-ear-muffs.html
RedirectMatch 301 /acatalog/Manbi_Wrist_Guards.html /manbi-wrist-guards.html

There is no consistency between the old URLs and the new ones, apart from the fact that they all sit in the subfolder /acatalog/.
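A catch-all along those lines could be a single mod_alias rule (a sketch only; the domain is a placeholder, and note that mod_alias processes directives in order, so any individual redirects you keep must come before it):

RedirectMatch 301 ^/acatalog/ http://www.example.com/

Bear in mind that a blanket 301 to the homepage passes little page-specific relevance, and Google may treat it as a soft 404.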
-
When I faced a similar situation with several hundred pages, I decided to only list the most important ones. I determined the important ones by their presence in Google and the importance of the page content.
I first Googled "site:www.example.com" to get a good idea of what was indexed.
I used Analytics to see if any pages were entry pages. If a page gets no hits as an entry page, the 301 redirect is never needed.
I made a list of about 100 redirects, then made the 404 error page a slight variation of my homepage.
Going forward, if any pages have inbound links pointing at them, you will need to maintain those redirects.
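Sketched in .htaccess terms, that approach amounts to a short, curated redirect list plus a friendly 404 fallback (all paths here are hypothetical examples, not your actual URLs):

# Keep 301s only for pages that still earn entry traffic or have inbound links
Redirect 301 /acatalog/Manbi_Womens_Ear_Muffs.html /manbi-ear-muffs.html
Redirect 301 /acatalog/Manbi_Wrist_Guards.html /manbi-wrist-guards.html

# Everything else falls through to a custom 404 page styled like the homepage
ErrorDocument 404 /404.html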