Can .htaccess file size affect page load times?
-
We have a large, old site. As we've transitioned from one CMS to another, we've needed to create 301 redirects using our .htaccess file.
I'm not a technical SEO person, but I'm concerned that the size of our .htaccess file might be contributing to long page download times.
Can large .htaccess files cause slow page load times? Or is the way the 301 redirects are coded the cause of slow page downloads?
Thanks
-
It definitely can. I once saw a site with hundreds of redirects, and it was definitely slowing things down: the server had to run through all the redirect rules on every request to work out where the user should be forwarded.
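To picture it: a migration-era .htaccess often ends up as hundreds of one-to-one lines like the hypothetical ones below, and Apache walks them top to bottom on every single request.

Redirect 301 /old-page-0001.html /new/page-0001/
Redirect 301 /old-page-0002.html /new/page-0002/
# ...hundreds more lines, each checked in order until one matches...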
-
I also struggle with all the hats you need to wear in this digital business.
-
Yes. My brain has been in local SEO mode this week, and my first question should have been to ask about site size. If the site is huge, the file will be huge too, and huge and fast seldom go hand in hand.
-
Hi Austin,
The server needs to process the .htaccess file to resolve those redirects, so if there are a lot of them it takes more time and latency increases. Don't you agree?
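For what it's worth, if you have access to the main server config, here is a minimal sketch of one fix (assuming Apache, with hypothetical paths): move the rules into the virtual host and switch off .htaccess lookups, since directives there are read once at startup rather than on every request.

<Directory "/var/www/example">
    # With AllowOverride None, Apache stops looking for .htaccess here
    AllowOverride None
</Directory>

RewriteEngine On
# In server/vhost context the matched URL path keeps its leading slash
RewriteRule ^/old-section/(.*)$ /new-section/$1 [R=301,L]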
-
How many pages are you redirecting?
-
Yes, huge .htaccess files will delay your page load.
A couple of tips:
- File size: keep the number of rules to a minimum by using regular expressions.
- Execution: where applicable, make .htaccess smarter by using the [L] flag to stop processing once a rule is matched (see the sketch below).
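A minimal sketch of both ideas together (hypothetical paths, assuming Apache's mod_rewrite is enabled):

RewriteEngine On

# One regex rule stands in for hundreds of one-to-one redirects:
# anything under /old-blog/ is 301'd to the same slug under /blog/
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]

# [L] stops mod_rewrite evaluating the remaining rules once this
# one matches, so requests exit the file as early as possible
RewriteRule ^old-about\.html$ /about/ [R=301,L]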
Hope it helps!
-
The ultimate source for testing what affects page load in terms of assets is here: http://tools.pingdom.com/fpt/
The size of the .htaccess file shouldn't really affect a page's load speed. With a 301 redirect, if someone goes to the new page, I don't see why it would load more slowly. If you go to the old page, you're adding the time of the redirect on top, but since the old page shouldn't be rendering anything, that hop should be near instant.