Reduce TTFB?
-
I'm clocking up nearly 1 second on time to first byte on a Joomla 2.5 site.
It is on shared hosting, so I don't think there's much room for improvement from the hardware or server-configuration angle.
There isn't much in the way of plug-ins and the host uses SSDs, so the problem seems to be with the CMS itself.
Any ideas how I could reduce the TTFB?
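For reference, curl's timing variables give a quick breakdown of where that ~1 second goes; a minimal sketch (the URL is a placeholder for the real site):

```bash
# time_starttransfer is the TTFB figure curl reports
curl -o /dev/null -s -w "DNS lookup:  %{time_namelookup}s\nTCP connect: %{time_connect}s\nTTFB:        %{time_starttransfer}s\nTotal:       %{time_total}s\n" \
  https://www.example.com/
```

If TTFB dominates the total while DNS and connect times are small, that points at server-side processing rather than the network.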
Thanks
-
When researching TTFB slowness for WordPress, I always start here:
https://varvy.com/pagespeed/ttfb.html
It's a great illustration of the things you can control (beyond the server), and I use it as a checklist. I always seem to forget something simple, like an edit in .htaccess.
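As an example of that kind of .htaccess edit, here's a minimal sketch of browser-caching headers via mod_expires (assuming an Apache host that allows overrides; the types and lifetimes are illustrative):

```apache
# Minimal browser-caching sketch; requires mod_expires to be enabled
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```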
-
Hello, there are a few options to consider. First, you could move to a dedicated or semi-dedicated server so that you are not sharing resources with dozens of other websites. Second, as already mentioned, a CDN such as CloudFlare is highly recommended.
CloudFlare has also published some insight on TTFB, arguing that it is not an especially important metric on its own. You can enable gzip compression, which will make your site load faster overall, but be aware that it may actually increase your TTFB.
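If the host runs Apache and allows overrides, gzip can usually be switched on from .htaccess; a minimal sketch, assuming mod_deflate is available:

```apache
# Compress text responses on the fly; requires mod_deflate
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript application/json
</IfModule>
```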
Depending on your Joomla theme, there may be caching solutions available which can greatly improve load times. I think you might be using RocketTheme, so check out the RokBooster extension.
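Independent of the theme, Joomla 2.5 also has built-in page caching in the Global Configuration, which ends up in configuration.php; a sketch of the relevant settings (values are illustrative):

```php
<?php
// Excerpt from a Joomla 2.5 configuration.php (values illustrative)
class JConfig {
    public $caching = '2';          // 0 = off, 1 = conservative, 2 = progressive caching
    public $cache_handler = 'file'; // file-based cache storage
    public $cachetime = '15';       // cache lifetime in minutes
    public $gzip = '1';             // Joomla's own gzip page compression
    // ...rest of the site configuration...
}
```

Normally you would change these through the admin interface rather than editing the file by hand.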
Your site has 11 external JavaScript files. You should look into minifying and combining them into one file if possible.
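As a sketch of the combine-and-minify step done by hand (the file names are hypothetical, and terser is just one example of a minifier, installed via npm):

```bash
# Concatenate the separate scripts, then minify the result
cat jquery.plugins.js slider.js site.js > combined.js
terser combined.js -o combined.min.js --compress --mangle
```

You would then reference combined.min.js from the template in place of the individual files. An extension like RokBooster can do the same thing automatically.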
I hope these recommendations help. Good luck speeding up your site.
-
Why would a content delivery network be a last resort? Cloudflare is something you can set up in under half an hour.
I would also look at your host, as a "shared server" isn't exactly top-tier hosting and Joomla is quite a heavy application.
-
As a last resort, yes. But I would sooner fix the problem from the ground up.
Thanks
-
Have you thought about using a CDN such as Cloudflare?