Reduce TTFB?
-
I'm seeing nearly 1 second of time to first byte (TTFB) on a Joomla 2.5 site.
It's on shared hosting, so I don't think there's much room for improvement on the hardware or core-configuration side.
There isn't much in the way of plug-ins and the server uses SSD storage, so the problem seems to be with the CMS itself.
Any ideas how I could reduce the TTFB?
Thanks
-
When researching TTFB slowness for WordPress I always start here:
https://varvy.com/pagespeed/ttfb.html
It's a great illustration of the things you can control (beyond the server), and I use it as a checklist ... I always seem to forget something simple, like an edit in .htaccess.
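For what it's worth, on an Apache host the usual .htaccess edits are enabling compression and browser caching. A sketch (module availability varies by host, which is why each block is wrapped in IfModule; the expiry times are just examples):

```apache
# Compress text-based responses before sending them to the browser.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets so repeat views skip the server entirely.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Note that compression mostly helps overall load time, not TTFB itself, but it usually belongs on the same checklist.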
-
Hello, there are a few options you can consider. First, you could move to a dedicated or semi-dedicated server so that you are not sharing resources with dozens of other websites. Second, as already mentioned, a CDN such as Cloudflare is highly recommended.
Also, Cloudflare has published some insight arguing that TTFB is not an especially important metric on its own. For example, enabling gzip compression will make your site load faster overall, but it may actually increase your TTFB.
Depending on your Joomla template, there may be caching solutions available that can greatly improve load times. You appear to be using a RocketTheme template, so check out the RokBooster extension.
Your site loads 11 external JavaScript files. Look into minifying and combining them into one file if possible.
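If you don't have build tooling available on shared hosting, even a simple concatenation script cuts those 11 requests down to one. A minimal sketch in Python (the file names are hypothetical; run it locally and upload the bundle):

```python
# Combine several JavaScript files into one bundle so the browser
# makes a single request instead of many.
from pathlib import Path

def combine_scripts(paths, out_path):
    """Concatenate JS files in order. A semicolon is placed between
    files to guard against sources that omit a trailing semicolon."""
    combined = "\n;\n".join(Path(p).read_text() for p in paths)
    Path(out_path).write_text(combined)
    return out_path

# Hypothetical usage:
# combine_scripts(["jquery.plugin.js", "menu.js", "slider.js"], "bundle.js")
```

You would then reference only bundle.js in your template, and ideally minify it afterwards.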
I hope these recommendations help. Good luck speeding up your site.
-
Why would a content delivery network be a last resort? Cloudflare is something you can set up in under half an hour.
I would also look at your host: a shared server isn't exactly top-tier hosting, and Joomla is quite a heavy application.
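Whatever you change, it's worth measuring TTFB yourself before and after so you can tell what actually helped. A rough standard-library-only sketch (the host name below is a placeholder):

```python
# Roughly measure TTFB: time from opening the connection to
# receiving the first byte of the response body. Includes DNS,
# TCP connect, and server processing time.
import time
import http.client

def measure_ttfb(host, path="/", port=80):
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.monotonic()
    conn.request("GET", path)
    resp = conn.getresponse()
    resp.read(1)  # wait for the first byte of the body
    ttfb = time.monotonic() - start
    conn.close()
    return ttfb

# Hypothetical usage:
# print(measure_ttfb("www.example.com"))
```

Tools like webpagetest.org give more detail (and test from multiple locations), but this is enough to compare before/after on your own machine.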
-
As a last resort, yes. But I would sooner fix the problem from the ground up.
Thanks
-
Have you thought about using a CDN such as Cloudflare?