Good technical parameters, worse load time.
-
I recently created a page and added expires headers, turned off ETags, and enabled gzip in my .htaccess, and just after that, according to Pingdom Tools, my page load time doubled, although my YSlow score went from 78 to 92. I always get a little bit lost with these technical issues. I mean, obviously a site should not produce worse results after adding these parameters, and these changes should, if anything, reduce bandwidth usage rather than increase load time. I suppose I should leave this stuff in the .htaccess. Then what is an accurate way to know whether you have made a real improvement to your site or your load time has genuinely gone up?
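For reference, this is roughly the kind of .htaccess I mean - a minimal sketch only, assuming mod_deflate, mod_expires, and mod_headers are available; the exact MIME types and cache lifetimes here are placeholders:

<IfModule mod_deflate.c>
  # Gzip-compress common text-based responses
  AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript
</IfModule>
<IfModule mod_expires.c>
  # Far-future expires headers for static assets
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
# Turn off ETags, since the Expires/Cache-Control headers handle caching
FileETag None
<IfModule mod_headers.c>
  Header unset ETag
</IfModule>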
This question is even more relevant with CSS sprites, as I always read that spriting every image is sometimes a waste of resources. How can you decide when to stop?
-
My page is a basic HTML page. I have already rewritten the code; there are only a handful of DOM elements, the CSS images are sprited, etc. Page load time went from 230 milliseconds to 500 when I implemented the new features.
-
I think this must be the case; my page is a basic HTML page at around 200 KB. Thanks for the answer.
-
It takes time to compress and decompress a page; for a lightweight page, compression can actually take longer than just serving the page as-is.
If you have a heavy page then compression can be a good thing, but if your page is light then it can work against you.
On a Windows server you can tell IIS how big a file has to be before it is compressed; the default is 256 bytes. That should tell you something.
You can also cache the compressed versions of your static files so that compression is not needed the next time.
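For that last point, here is a rough Apache sketch of serving pre-compressed copies of static files - assuming you generate the .gz files yourself at deploy time and that mod_rewrite and mod_headers are enabled:

RewriteEngine On
# If the client accepts gzip and a pre-compressed copy exists, serve it instead
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.+\.(?:css|js))$ $1.gz [QSA,L]
# Restore the right Content-Type and flag the response as gzipped
<FilesMatch "\.css\.gz$">
  ForceType text/css
  Header set Content-Encoding gzip
</FilesMatch>
<FilesMatch "\.js\.gz$">
  ForceType application/javascript
  Header set Content-Encoding gzip
</FilesMatch>

That way the server never has to compress the same file twice, so you keep the bandwidth savings without paying the CPU cost on every request.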
-
Can you share the load times you got before and after working on those technical parameters? On our website, we usually use the webmaster tools and at the same time compare ourselves to our competitors. For example, we are in the hotel business, so we try to compare our site's performance to the biggest hotel chains'.
But then, in my opinion, once you have worked on those technical parameters, there are still other aspects of your site that you need to check to improve load performance.
1. Check the size of your page. Initially our site loaded in around 10 seconds, and this was because of our layout and the large number of images we use. The first step was to compress all our JPGs without degrading their quality. The second was to restructure the layout and our scripts to reduce the DOM elements on our site. I noticed a difference in load time once we had fewer than 1,000 DOM elements.
2. For sprite images: it is better to create the sprite image during development instead of spriting images on the fly, which I think is why they say it can be a waste of resources. I use this site for spriting our images: http://spritegen.website-performance.org/
3. You need to minify and combine all your CSS and JavaScript.
I also follow all the rules in YSlow and PageSpeed, and I can see a significant improvement in our page load time. Without a CDN our site now loads in around 4-5 seconds, and with a CDN I think around 3-4 seconds.
Related Questions
-
Capturing Source Dynamically for UTM Parameters
Does anyone have a tutorial on how to dynamically capture the referring source to be populated in UTM parameters for Google Analytics? We want to syndicate content and be able to see all of the websites that provided referral traffic for this specific objective. We want to set a specific utm_medium and utm_campaign but have the utm_source be dynamic and capture the referring website. If we set a permanent utm_source, it would appear the same for all incoming traffic. Thanks in advance!
Technical SEO | peteboyd
-
Google Sites website https://www.opcfitness.com/ title NOT GOOD FOR SEO
We set up a website, https://www.opcfitness.com/home, on Google Sites, but the Google Sites page title is not good for SEO. How do we fix it?
Technical SEO | ahislop574
-
SEO URLs: 1. URLs in my language (Greek, Greeklish, or English)? 2. Is it good to put .html at the end? What is the best way to get great rankings?
Hello all, should I put URLs in my language (Greek or Greeklish) or in English? And is it good to put .html at the end of the URL? For example, www.test.com/test/test-test.html? What is the best way to get great rankings? I am a new digital marketing manager, and it is my first time working with a programmer who doesn't know either. I need to know as soon as possible, because they want to be "on air" tomorrow! Thank you very much for your help! Regards, Marios
Technical SEO | marioskal
-
Google Search Console - URL Parameters Tab ISSUE
Hi, I recently removed some disallowed parameters from my robots.txt and added the "No URLs" setting in my Search Console URL Parameters tab (as can be seen in the image: http://prntscr.com/e997o5). Today I saw the orderby parameter indexed even though the setting is not to crawl those URLs. Does anyone have any idea why this is happening? Thank goodness all those URLs with parameters are canonicalised to their original URLs.
Technical SEO | dos0659
-
Is correcting missing meta description tags a good use of time?
My modest website (shew-design.com) has turned up nearly sixty crawl errors, almost all of them missing meta description tags. One friend who knows SEO better than I do says that adding meta tags to EVERY page is not a good use of time. I'm just getting serious about applying SEO to our site, and I want to make sure I'm making the best use of my time. The other errors I'm getting are duplicate page names within different directories (e.g. "getting started" for branding and "getting started" for web). Is this a huge priority? I would welcome your feedback.
Technical SEO | Eric_Shew
-
How does Progressive Loading, aka what Facebook does, impact proper search indexation?
My client is planning on integrating progressive loading into their main product-level pages (those pages most important to conversions and revenue). I am not skilled in "progressive loading" but was told this is what Facebook does. Currently, the site's pages are tabbed and use Ajax. Is there any negative impact from changing this up by introducing progressive loading? If anyone can help me understand what this is and how it might impact a site from an SEO perspective, please let me know. Thanks a ton!! Janet
Technical SEO | ACNINTERACTIVE
-
Auto-loading content via AJAX - best practices
We have an ecommerce website and I'm looking at replacing the pagination on our category pages with functionality that auto-loads the products as the user scrolls. There are a number of big websites that do this - MyFonts and Kickstarter are two that spring to mind. Obviously, if we are loading the content in via AJAX, then search engine spiders aren't going to be able to crawl our categories in the same way they can now. I'm wondering what the best way to get around this is. Some ideas that spring to mind are: detect the user agent and, if the visitor is a spider, show them the old-style pagination instead of the AJAX version; or make sure we submit an updated Google sitemap every day (I'm not sure if this is a reasonable substitute for Google being able to properly crawl our site). Are there any best practices surrounding this approach to pagination? Surely the bigger sites that do this must have had to deal with these issues? Any advice would be much appreciated!
Technical SEO | paul.younghusband
-
Which is The Best Way to Handle Query Parameters?
Hi mozzers, I would like to know the best way to handle query parameters. Say my site is example.com. Here are two scenarios.
Scenario #1: Duplicate content
example.com/category?page=1
example.com/category?order=updated_at+DESC
example.com/category
example.com/category?page=1&sr=blog-header
All have the same content.
Scenario #2: Pagination
example.com/category?page=1
example.com/category?page=2
and so on. What is the best way to solve both? Do I need to use rel=next and rel=prev, or is it better to use Google Webmaster Tools parameter handling? Right now I am concerned about Google traffic only. For solving the duplicate content issue, do we need to use canonical tags on each such URL? I am not using WordPress. My site is built on the Ruby on Rails platform. Thanks!
Technical SEO | jombay