Reduce TTFB?
-
I'm clocking nearly 1 second of time to first byte on a Joomla 2.5 site.
It is on shared hosting, so I don't think there is much room for improvement from the hardware or core configuration angle.
There isn't much in the way of plug-ins and it is on SSD storage; the problem seems to be with the CMS itself.
Any ideas how I could reduce the TTFB?
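For reference, the number can be reproduced outside the browser dev tools with a rough sketch like this (Python 3, standard library only; the URL below is just a placeholder for the real site):

```python
import time
from http.client import HTTPConnection, HTTPSConnection
from urllib.parse import urlparse

def measure_ttfb(url):
    """Time from issuing the request (including connect) until the first response bytes arrive."""
    parsed = urlparse(url)
    conn_cls = HTTPSConnection if parsed.scheme == "https" else HTTPConnection
    conn = conn_cls(parsed.netloc, timeout=30)
    start = time.perf_counter()
    conn.request("GET", parsed.path or "/")  # the connection is opened lazily here
    conn.getresponse()                       # returns once the status line and headers arrive
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

# Placeholder URL; run it a few times and average the results.
print(f"TTFB: {measure_ttfb('https://www.example.com/'):.3f} s")
```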
Thanks
-
When researching TTFB slowness for WordPress, I always start here:
https://varvy.com/pagespeed/ttfb.html
It's a great illustration of the things you can control (beyond the server), and I use it as a checklist; I always seem to forget something simple, like an edit in .htaccess.
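As a quick way to confirm those .htaccess edits actually took effect, something along these lines works as a sanity check (a rough sketch, Python 3 standard library only, placeholder URL):

```python
from urllib.request import Request, urlopen

def check_headers(url):
    """Print the response headers that typically reflect .htaccess tuning."""
    req = Request(url, headers={"Accept-Encoding": "gzip, deflate"})
    with urlopen(req, timeout=30) as resp:
        for name in ("Content-Encoding", "Cache-Control", "Expires", "Vary"):
            print(f"{name}: {resp.headers.get(name, '(not set)')}")

# Placeholder URL; point it at your own pages and static assets.
check_headers("https://www.example.com/")
```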
-
Hello, there are a few options you can consider. First, you could move to a dedicated or semi-dedicated server so that you are not sharing resources with dozens of other websites. Second, as already mentioned, a CDN such as Cloudflare is highly recommended.
Also, you can read Cloudflare's own take on TTFB, which argues that it is not an especially important metric on its own. Enabling gzip compression will make your site load faster overall, but it may actually increase your TTFB.
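To see that trade-off on your own pages, a rough sketch like this compares time-to-first-response and full download time with and without compression (Python 3 standard library only; the URL is a placeholder):

```python
import time
from urllib.request import Request, urlopen

def timings(url, accept_encoding):
    """Return (seconds to first response, seconds to full body) for one GET request."""
    req = Request(url, headers={"Accept-Encoding": accept_encoding})
    start = time.perf_counter()
    resp = urlopen(req, timeout=30)   # returns once the status line and headers arrive
    first = time.perf_counter() - start
    resp.read()                       # download the (possibly compressed) body
    total = time.perf_counter() - start
    resp.close()
    return first, total

url = "https://www.example.com/"      # placeholder for your own site
for enc in ("identity", "gzip"):
    first, total = timings(url, enc)
    print(f"{enc:9s} first response: {first:.3f} s   full body: {total:.3f} s")
```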
Depending on your Joomla theme, there may be caching solutions available that can greatly improve load times. I think you might be using RocketTheme, so check out the RokBooster extension.
Your site has 11 external JavaScript files. You should look into minifying them and combining them into one file if possible.
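If you want a quick inventory of those script files before combining them, a rough sketch like this lists them (Python 3 standard library only; placeholder URL; it only looks at script tags with a src attribute):

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class ScriptFinder(HTMLParser):
    """Collect the src attribute of every external script tag."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)

url = "https://www.example.com/"  # placeholder for your own site
page = urlopen(url, timeout=30).read().decode("utf-8", errors="replace")
finder = ScriptFinder()
finder.feed(page)
print(f"{len(finder.sources)} external scripts found:")
for src in finder.sources:
    print(" ", src)
```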
I hope these recommendations help. Good luck speeding up your site.
-
Why would a content delivery network be a last resort? Cloudflare is something you can set up in under half an hour.
I would also look at your host, as a shared server isn't exactly top-tier hosting and Joomla is quite a heavy application.
-
As a last resort, yes. But I would sooner fix the problem from the ground up.
Thanks
-
Have you thought about using a CDN such as Cloudflare?