Why would an image that's much smaller than our website header logo be taking so long to load?
-
When I check http://www.ccisolutions.com at Pingdom, we have a tiny graphic that is taking much longer to load than other graphics that are much bigger. Can anyone shed some light on why this might be happening and what can be done to fix it?
Thanks in advance!
Dana
-
Thanks so much Alan for this great response. While I am not as technically savvy as you and Jason, I knew I shouldn't rely 100% on Pingdom either, and I am very familiar with the other tools you mentioned and use them routinely.
My hands are tied since I have no access to either the server or the source code, so, as I mentioned to Jason, I will be taking these suggestions to our IT Director to see how far I can get in addressing these issues.
I am on the PageSpeed warpath, and really appreciate your generous response.
I'll let you know what happens!
Dana
-
Thanks so much Jason,
This is great information. As I do not have access to the server or source code, I am going to take your response, in addition to Alan's, to our IT Director and see what kind of actions we can take.
It's a bit of a relief to know that the images aren't our biggest problem.
Your comment about 304s is very timely because last week I was scouring through server log files and noticed quite a few of them. You've pretty much answered my question about why I found so many.
These are all the pains of self-hosting with insufficient staff and know-how to set things up properly. Hopefully, we can get by with a little help from our friends.
Thanks so much!
Dana
-
All great info so far. Let me add some considerations.
CSS images: 16, total file size: 455,806 bytes
Quite often a site references images in CSS files that aren't even displayed on some, most, or nearly all pages. They're baked into the style sheet used across part or all of the site.
When this happens, Google crawls all of those images regardless of whether they're displayed. They do so because it's one of their goals to "discover all the content you have". Because of that, their crawler has no choice but to make extra calls to the server for every image referenced.
So every one of those calls adds to the load time that matters most to Google rankings. As a result, if a review of those images shows they are not needed on key pages of the site, consider having a separate style sheet created for those pages that leaves those references out.
Also, while Pingdom helps to detect possible bottlenecks (I use it solely for this reason), it is NOT a valid representation of potential page speed problems as far as Google's system is concerned. The reason is that Pingdom does not process a page's content the way Google's system does. So even if Google Analytics reports a page load time of 15 seconds, Pingdom will routinely report a time that's a tiny fraction of that.
While not ideal, I always rely on URIValet.com and WebPageTest.org (the first-run test, not the repeat view, since that one benefits from cached assets) to do my evaluation comparisons.
Where I DO use Pingdom is this: I enter a URL (be sure to set the test server to a U.S. server, not their European one), and once the test has run, I click over to the "Page Analysis" tab. That breaks down possible bottleneck points by file type, process type, and even domain (third-party service widgets or code can be a big issue sometimes, and this will show the possible problem sources).
For example, for your home page, that report shows 73% of even that system's own time was spent processing images. It also shows six domain sources, with 94.49% of the process time coming from your own domain.
Note an interesting thing, though: that report also shows 63% of the time was "connect" time, meaning more than half of even Pingdom's process was spent just connecting. That helps reaffirm the notion that if Google has to make many requests of your server, each request has to connect, and that can add to overall speed.
-
Hey Dana,
Smooshing images is always a best practice, but I took a peek at your homepage and your images aren't that poorly optimized. In your case, image optimization is only going to save you 30K of the 176K in images on your homepage. (I still wouldn't discourage you from setting up automated image optimization such as smoosh.)
Your bigger performance problem is that you aren't using gzip on your CSS or JS files. Turning on gzip for your .css and .js files would save you 110K out of 236K in text files.
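For what it's worth, here's a minimal sketch of what turning that on can look like, assuming the site runs on Apache with mod_deflate available (I can't see your server, so treat this as hypothetical; on IIS or Nginx the directives differ, but the idea is the same):

    # .htaccess or server config: compress text assets before sending
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css
        AddOutputFilterByType DEFLATE application/javascript text/javascript
    </IfModule>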
By far the biggest thing you could do to speed up your user experience would be to set a reasonable browser cache time for all your static assets. Your website has many assets that are used on every page the visitor sees (like all the stuff in your header, footer, and nav). The browser should download those files the first time the visitor hits any page, and then on every other page it should know it's OK to use the local copy rather than going back to the server to check whether there is a newer version.
But because no browser cache time is set, the browser is obligated to check with the server every time. In most cases the browser will get a 304 response when it asks for the same file again (a 304 means the asset hasn't changed since the last time it asked), so the browser uses the local copy anyway, but all that handshaking takes time you could save by setting browser cache times for all your assets.
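Again assuming Apache (same caveat as above: this is a sketch, not a prescription for your exact setup), mod_expires is the usual way to set those cache lifetimes:

    # .htaccess or server config: let browsers reuse static assets
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png  "access plus 1 month"
        ExpiresByType image/jpeg "access plus 1 month"
        ExpiresByType image/gif  "access plus 1 month"
        ExpiresByType text/css   "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>

The exact lifetimes are a judgment call; the point is that while they're in effect, the browser skips the round trip (and the 304 handshake) entirely.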
Gzip is #3 on the SEO Tips article you found, browser caching is #1, and those are the two things costing your particular homepage the most in page performance.
-Jason
-
Thanks Charles,
Your comments made me curious for more information because I am sooooo not a graphics person. You sent me in the right direction and I appreciate that. I also found this post here at SeoMoz: http://www.seomoz.org/blog/15-tips-to-speed-up-your-website
Looks like we have some smooshing to do!
Dana