Why would an image that's much smaller than our Website header logo be taking so long to load?
-
When I check http://www.ccisolutions.com at Pingdom, we have a tiny graphic that is taking much longer to load than other graphics that are much bigger. Can anyone shed some light on why this might be happening and what can be done to fix it?
Thanks in advance!
Dana
-
Thanks so much Alan for this great response. While I am not as technically savvy as you and Jason, I knew that I shouldn't 100% rely on Pingdom either, so I am very familiar with the other tools you mentioned and use them routinely.
Since my hands are tied (I have no access to either the server or the source code), as I mentioned to Jason, I will be taking these suggestions to our IT Director to see how far I can get in addressing these issues.
I am on the PageSpeed warpath, and really appreciate your generous response.
I'll let you know what happens!
Dana
-
Thanks so much Jason,
This is great information. As I do not have access to the server or source code, I am going to take your response, in addition to Alan's, to our IT Director and see what kind of actions we can take.
It's a bit of a relief to know that the images aren't our biggest problem.
Your comment about 304's is very timely because last week I was scouring through server log files and noticed quite a few 304's. You've pretty much answered my question on why I found so many of those.
These are all the pains of self-hosting with insufficient staff and know-how to set things up properly. Hopefully, we can get by with a little help from our friends.
Thanks so much!
Dana
-
All great info so far. Let me add some considerations.
CSS images: 16, total file size: 455,806 bytes
Quite often a site references images in CSS files that aren't even displayed on some, most or nearly all pages. They're baked into the CSS style sheet used across part or all of the site.
When this happens, Google crawls all of those images regardless of whether they're displayed. They do so because it's one of their goals to "discover all the content you have". Because of that, their crawler has no choice but to make extra calls to the server for every image referenced.
Every one of those extra calls adds to the page load time that matters most to Google rankings. So if a review of those images shows they are not needed on key pages of the site, consider having a separate style sheet created for those pages that doesn't reference them.
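If you want a quick way to audit this, here's a rough Python sketch that pulls every image URL out of a stylesheet so you can check which ones your key pages actually display. The stylesheet URL below is just a placeholder, and the regex is a simple approximation of CSS url() syntax:

```python
# Rough sketch: list every image a stylesheet references, so you can
# check which of them actually appear on your key pages.
import re
from urllib.request import urlopen
from urllib.parse import urljoin

css_url = "http://www.example.com/styles/main.css"  # placeholder URL
css = urlopen(css_url).read().decode("utf-8", errors="replace")

# Match url(...) values in the CSS, stripping optional quotes.
for ref in re.findall(r"""url\(\s*["']?([^"')]+)["']?\s*\)""", css):
    if ref.lower().endswith((".png", ".gif", ".jpg", ".jpeg")):
        print(urljoin(css_url, ref))  # resolve relative paths
```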
Also, while Pingdom helps to detect possible bottlenecks (I use it solely for that reason), it is NOT a valid representation of potential page speed problems as far as Google's system is concerned. The reason is that Pingdom does not process a page's content the way Google's system does. So even if Google Analytics reports a page load time of 15 seconds, Pingdom will routinely report a time that's a tiny fraction of that.
While not ideal, I always rely on URIValet.com and WebPageTest.org (the "1st run" test, not the "2nd run", because the second run works from cached processing) to do my evaluation comparisons.
Where I DO use Pingdom is this: I enter a URL (be sure to set the test server to a U.S. server, not their European one), and once the test has run, I click over to the "Page Analysis" tab. That breaks down possible bottleneck points by file type, process type, and even domain (if you have third-party service widgets or code, that's sometimes a big issue, and this will show the possible problem sources).
For example, for your home page, that report shows 73% of even that system's own time was spent processing images. It also shows six domain sources, with 94.49% of the process time coming from your own domain.
Note an interesting thing, though: that report also shows 63% of the time was "connect" time, meaning more than half of even Pingdom's process was spent just connecting. That helps reaffirm the notion that if Google has to make many requests of your server, each request has to connect, and that adds to overall speed.
-
Hey Dana,
Smooshing images is always a best practice, but in your case, I took a peek at your homepage and your images aren't that poorly optimized. Image optimization is going to save you 30K of the 176K in images on your homepage. (I still wouldn't discourage you from setting up automated image optimization such as smooshing.)
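For reference, automated smooshing can be as simple as a small script. Here's a minimal Python sketch using the Pillow library; the folder names and quality setting are assumptions, not anything pulled from your site:

```python
from pathlib import Path
from PIL import Image  # pip install Pillow

src = Path("images")     # assumed input folder
dst = Path("optimized")  # assumed output folder
dst.mkdir(exist_ok=True)

for path in src.glob("*.jpg"):
    img = Image.open(path)
    out = dst / path.name
    # optimize=True makes an extra encoding pass to shrink the file;
    # quality=85 is a common "visually lossless" JPEG setting.
    img.save(out, "JPEG", optimize=True, quality=85)
    print(f"{path.name}: {path.stat().st_size} -> {out.stat().st_size} bytes")
```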
Your bigger performance problem is that you aren't using gzip on your CSS or JS files. Turning on gzip for your .css and .js files would save you 110K out of the 236K in text files.
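You can verify whether gzip is on for any given file with a few lines of Python; the URL below is a placeholder, so swap in one of your own .css or .js files:

```python
# Send Accept-Encoding the way a browser would, then check whether the
# server actually answers with a gzipped body.
from urllib.request import Request, urlopen

url = "http://www.example.com/css/styles.css"  # placeholder URL
req = Request(url, headers={"Accept-Encoding": "gzip"})
resp = urlopen(req)
encoding = resp.headers.get("Content-Encoding", "none")
print(f"{url}: Content-Encoding = {encoding}")  # "gzip" means compression is on
```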
By far the biggest thing you could do to speed up your user experience would be to set a reasonable browser cache time for all your static assets. Your website has many assets that are used on every page the visitor sees (like all the stuff in your header, footer, and nav). The browser should download those files the first time the visitor hits any page, and then on every other page the browser should know it's OK to use the local copy rather than going back to the server to see if there is a newer version. But because there is no browser cache time set, the browser is obligated to check with the server every time. In most cases the browser will get a 304 response when it asks for the same file again (a 304 means the asset hasn't changed since the last time you asked), so the browser uses the local copy anyway, but all that hand-shaking takes time that you could save by setting browser cache times for all your assets.
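To see that hand-shaking for yourself, here's a rough Python sketch that downloads an asset once, then re-requests it the way a browser would, with conditional headers, and reports the 304. The URL is a placeholder:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

url = "http://www.example.com/images/logo.png"  # placeholder asset URL

first = urlopen(url)  # initial download, like the visitor's first page view
conditional = {}
if first.headers.get("Last-Modified"):
    conditional["If-Modified-Since"] = first.headers["Last-Modified"]
if first.headers.get("ETag"):
    conditional["If-None-Match"] = first.headers["ETag"]

try:
    second = urlopen(Request(url, headers=conditional))
    print("Server re-sent the full asset:", second.status)
except HTTPError as err:
    # urllib surfaces 304 Not Modified as an "error"; this round trip is
    # the hand-shake a Cache-Control / Expires header would have avoided.
    print("Server answered:", err.code)
```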
GZip is #3 in the SEO tips article you found, browser caching is #1, and those are the two things costing your particular homepage the most in page performance.
-Jason
-
Thanks Charles,
Your comments made me curious for more information because I am sooooo not a graphics person. You sent me in the right direction and I appreciate that. I also found this post here at SeoMoz: http://www.seomoz.org/blog/15-tips-to-speed-up-your-website
Looks like we have some smooshing to do!
Dana