Long-term plan for a large .htaccess file with 301 redirects
-
We set up a pretty large .htaccess file in February for a site, with over 2,000 lines of 301 redirects from old product URLs to new ones.
The old URLs still get a lot of traffic from product review sites and other good-quality sites whose links we can't change.
We are now trying to reduce page load times, and we're ticking all of the boxes apart from the size of the .htaccess file, which seems to be causing a considerable hang. The file is currently 410 KB!
My question is: what should I do as a long-term strategy, and has anyone come across a similar problem?
At the moment I am inclined to remove the 2,000 individual redirects and put in a catch-all, whereby anything from the old site goes to the new site's homepage.
Example code:
RedirectMatch 301 /acatalog/Manbi_Womens_Ear_Muffs.html /manbi-ear-muffs.html
RedirectMatch 301 /acatalog/Manbi_Wrist_Guards.html /manbi-wrist-guards.html
There is no consistency between the old URLs and the new ones, apart from the fact that they all sit in the subfolder /acatalog/.
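A minimal sketch of that catch-all, assuming the new homepage is at `/`:

```apache
# Hypothetical catch-all: anything under /acatalog/ that is not matched
# by an earlier, more specific rule falls through to the homepage.
RedirectMatch 301 ^/acatalog/ /
```

Note that mod_alias applies Redirect/RedirectMatch directives in order of appearance, so if any specific product redirects are kept, they must sit above the catch-all or the blanket rule will fire first.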
-
When I faced a similar situation with several hundred pages, I decided to list only the most important ones. I determined which were important by their presence in Google and the importance of the page content.
I first Googled "site:www.example.com" to get a good idea of what was indexed.
I used Analytics to see if any pages were entry pages. If a page never gets hits as an entry page, its 301 redirect is never needed.
I made a list of about 100 redirects, then made the 404 error page a slight variation of my homepage.
If any pages still have inbound links, you will need to keep maintaining those redirects.
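If you have access to the main server config (not just .htaccess), a RewriteMap is another long-term option: it keeps every per-product redirect without the 410 KB parse cost, because Apache loads the map instead of re-reading thousands of directives on every request. A sketch, assuming a hypothetical map file path; note that RewriteMap is only valid in server or virtual-host context, never in .htaccess:

```apache
# In the server or virtual-host config (RewriteMap is not allowed in .htaccess):
RewriteEngine On

# Map of old product filenames to new paths, one "key value" pair per line, e.g.:
#   Manbi_Womens_Ear_Muffs.html /manbi-ear-muffs.html
# For large maps, convert to a hashed map with httxt2dbm for faster lookups.
RewriteMap products "txt:/etc/apache2/acatalog-redirects.map"

# Look up the old filename; fall back to the homepage if it is not in the map.
RewriteRule ^/acatalog/(.+)$ ${products:$1|/} [R=301,L]
```

This also gives you the catch-all behaviour for free via the `|/` default value, while still redirecting known products to their exact new URLs.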