Good technical parameters, worse load time.
-
I recently created a page and added expires headers, unconfigured ETags, and gzip compression to my .htaccess, and right after that, according to Pingdom Tools, my page load time doubled, even though my YSlow score went from 78 to 92. I always get a little lost with these technical issues. Obviously a site should not produce worse results after adding these parameters, so the increase in page load time is more likely due to bandwidth usage at the time of testing. I suppose I should leave this stuff in the .htaccess. So what is an accurate way to tell whether you have made a real improvement to your site, or whether your load time has genuinely gone up?
The same question comes up with CSS sprites, as I always read that spriting every image can be a waste of resources. How do you decide when to stop?
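For reference, the kind of .htaccess rules described above look roughly like this. This is only a minimal sketch assuming Apache with mod_expires, mod_headers and mod_deflate enabled; the file types and cache lifetimes are placeholders, not values from the original post.

# Far-future expires headers for static assets (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>

# Remove ETags (YSlow's "configure ETags" rule)
<IfModule mod_headers.c>
  Header unset ETag
</IfModule>
FileETag None

# Compress text-based responses (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>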
-
My page is a basic HTML page. I have already rewritten the code, there are only a handful of DOM elements, the CSS files are sprited, etc. Page load time went from 230 milliseconds to 500 when I implemented the new features.
-
I think this is probably what is happening; my page is basic HTML at around 200 KB. Thanks for the answer.
-
It takes time to compress and decompress a page, and for a lightweight page compression can actually take longer than it saves.
If you have a heavy page then compression can be a good thing, but if your page is light it can work against you.
On a Windows server you can tell it how big a file has to be before it gets compressed; the default is 256 bytes. That should tell you something.
You can also cache the compressed versions of your static files so that compression is not needed the next time they are served.
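One way to do that on Apache is to pre-compress the static files once at deploy time and serve the .gz versions directly, so nothing is compressed per request. A rough sketch, assuming mod_rewrite and mod_headers are available and that a .css.gz / .js.gz copy has been generated next to each original file (the extensions here are just examples):

<IfModule mod_rewrite.c>
  RewriteEngine On
  # If the client accepts gzip and a pre-compressed copy exists, serve that instead
  RewriteCond %{HTTP:Accept-Encoding} gzip
  RewriteCond %{REQUEST_FILENAME}\.gz -f
  RewriteRule ^(.+)\.(css|js)$ $1.$2.gz [QSA,L]
</IfModule>

# Make sure the pre-compressed responses carry the right type and encoding
<FilesMatch "\.css\.gz$">
  ForceType text/css
  Header set Content-Encoding gzip
</FilesMatch>
<FilesMatch "\.js\.gz$">
  ForceType application/javascript
  Header set Content-Encoding gzip
</FilesMatch>

The compression itself then happens once, when the .gz files are generated, rather than on every request.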
-
Can you share the load times you got before and after working on those technical parameters? On our website we usually use the webmaster tools and at the same time compare our performance to our competitors. For example, we are in the hotel business, so we compare our site's performance to the biggest hotel chains.
But in my opinion, once you have worked on those technical parameters, there are still other aspects of your site you need to check to improve load performance.
1. Check the size of your page. Initially our site loaded in around 10 seconds because of our layout and the large number of images we use. The first step was to compress all our JPGs without degrading their quality. The second was to restructure the layout and our scripts to reduce the number of DOM elements on the page. I noticed a difference in load time once our DOM element count dropped below 1,000.
2. For sprite images, it is better to create the sprite image during development instead of spriting images on the fly, which is why I think people say it can be a waste of resources. I use this site for spriting our images: http://spritegen.website-performance.org/
3. Minify and combine all your CSS and JavaScript.
I also follow all the rules in YSlow and Page Speed, and I can see a significant improvement in our page load time. Without a CDN our site now loads in around 4-5 seconds, and with a CDN around 3-4 seconds.