Why would an image that's much smaller than our Website header logo be taking so long to load?
-
When I check http://www.ccisolutions.com at Pingdom, we have a tiny graphic that is taking much longer to load than other graphics that are much bigger. Can anyone shed some light on why this might be happening and what can be done to fix it?
Thanks in advance!
Dana
-
Thanks so much, Alan, for this great response. While I am not as technically savvy as you and Jason, I knew that I shouldn't rely 100% on Pingdom either, and I am very familiar with the other tools you mentioned and use them routinely.
My hands are tied since I have no access to either the server or the source code, so, as I mentioned to Jason, I will be taking these suggestions to our IT Director to see how far I can get in addressing these issues.
I am on the PageSpeed warpath, and really appreciate your generous response.
I'll let you know what happens!
Dana
-
Thanks so much Jason,
This is great information. As I do not have access to the server or source code, I am going to take your response, in addition to Alan's to our IT Director and see what kind of actions we can take.
It's a bit of a relief to know that the images aren't our biggest problem.
Your comment about 304's is very timely because last week I was scouring through server log files and noticed quite a few 304's. You've pretty much answered my question on why I found so many of those.
These are all the pains of self-hosting with insufficient staff and know-how to set things up properly. Hopefully, we can get by with a little help from our friends.
Thanks so much!
Dana
-
All great info so far. Let me add some considerations.
CSS images: 16, total file size: 455,806 bytes
Quite often a site's CSS references images that aren't even displayed on some, most, or nearly all pages. They're baked into the style sheet used across part or all of the site.
When this happens, Google crawls all of those images regardless of whether they're displayed. They do so because one of their goals is to "discover all the content you have." Because of that, their crawler has no choice but to make an extra call to the server for every image referenced.
So every one of those calls to the server adds to the page load time that matters most to Google rankings. As a result, if a review of those images shows they are not needed on key pages of the site, consider having a different style sheet created for those pages that leaves them out of the CSS.
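To make the point above concrete, here's a hypothetical snippet (the file names and class names are made up, not taken from your site) showing how a site-wide stylesheet can reference images a given page never renders:

```css
/* global.css -- loaded on every page of the site */
.promo-banner {
  /* this image may be fetched/crawled even on pages
     that never render an element with this class */
  background-image: url("/images/promo-banner.png");
}
.secure-checkout-badge {
  background-image: url("/images/secure-checkout-badge.png");
}
```

A page-specific stylesheet (e.g. a slimmed-down home.css) that omits the unused rules avoids those extra image requests on your key pages.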
Also, while Pingdom helps to detect possible bottlenecks (I use it solely for this reason), it is NOT a valid representation of potential page speed problems as far as Google's system is concerned. The reason is that the Pingdom system does not process a page's content the way the Google system does. So even if Google Analytics reports a page speed of 15 seconds, Pingdom will routinely report a speed that's a tiny fraction of that.
While not ideal, I always rely on URIValet.com and WebPageTest.org (the 1st-run test, not the 2nd run, because the repeat run benefits from cached processing) to do my evaluation comparisons.
Where I DO use Pingdom is this: I enter a URL (be sure to set the test server to a U.S. server, not their European one), and once the test has run, I click over to the "Page Analysis" tab. That breaks down possible bottleneck points by file type, process type, and even domain (if you have 3rd-party service widgets or code, that's sometimes a big issue, and this will show the possible problem sources).
For example, for your home page, that report shows 73% of even that system's own time was processing images. And it also shows six domain sources, with 94.49% of the process time coming from your own domain.
Note an interesting thing though: that report also shows 63% of the time was "connect" time, meaning nearly two-thirds of even Pingdom's process was spent just connecting, which helps reaffirm the notion that if Google has to make many requests of your server, each request has to connect, and that adds to overall speed.
-
Hey Dana,
Smooshing images is always a best practice, but in your case, I took a peek at your homepage and your images aren't that poorly optimized. In your case image optimization is going to save you 30K of the 176K in images on your homepage. (I still wouldn't discourage you from setting up automated image optimization such as smoosh.)
Your bigger performance problem is that you aren't serving your CSS or JS files with gzip. Turning on gzip for your .css and .js files would save you 110K out of 236K in text files.
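To get a feel for why gzip makes such a difference on text assets, here's a quick standalone illustration (the sample CSS string is made up to mimic the repetitiveness of real stylesheets, not taken from your actual files):

```python
import gzip

# CSS and JS are highly repetitive text, which is exactly what gzip excels at.
# This sample string is hypothetical -- it just mimics real stylesheet patterns.
css = ("body { margin: 0; padding: 0; font-family: Arial, sans-serif; } "
       ".header { background: #fff; } ") * 200

raw = css.encode("utf-8")
compressed = gzip.compress(raw)

print(f"raw: {len(raw):,} bytes; gzipped: {len(compressed):,} bytes "
      f"({100 * len(compressed) // len(raw)}% of original)")
```

Real-world savings on CSS/JS are typically in the 60-80% range, which lines up with the 110K-out-of-236K figure above.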
By far the biggest thing you could do to speed up your user experience would be to set a reasonable browser cache lifetime for all your static assets. Your website has many assets that are used on every page the visitor sees (like all the stuff in your header, footer, and nav). The browser should download those files the first time the visitor hits any page, and then on every subsequent page it should know it's OK to use the local copy rather than going back to the server to see if there is a newer version. But because no browser cache time is set, the browser is obligated to check with the server every time. In most cases the browser will get a 304 response when it asks for the same file again (304 means the asset hasn't changed since the last time it asked), so the browser uses the local copy anyway, but all that handshaking takes time that you could save by setting browser cache times for all your assets.
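For your IT Director's benefit, here's a rough sketch of what turning these two things on could look like. This assumes an Apache server with mod_deflate and mod_expires available; the actual directives (and whether they go in httpd.conf or .htaccess) depend on your real server setup, so treat this as a starting point, not a drop-in fix:

```apache
# Compress text assets on the way out (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Tell browsers to keep static assets locally for a while (mod_expires)
ExpiresActive On
ExpiresByType image/png              "access plus 1 month"
ExpiresByType image/jpeg             "access plus 1 month"
ExpiresByType image/gif              "access plus 1 month"
ExpiresByType text/css               "access plus 1 week"
ExpiresByType application/javascript "access plus 1 week"
```

With expiry headers like these in place, the browser skips the conditional request (and the 304 round trip) entirely until the cache lifetime passes.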
GZip is #3 on the SEO tips article you found, browser caching is #1, and those are the two things costing your particular homepage the most in page performance.
-Jason
-
Thanks Charles,
Your comments made me curious for more information because I am sooooo not a graphics person. You sent me in the right direction and I appreciate that. I also found this post here at SeoMoz: http://www.seomoz.org/blog/15-tips-to-speed-up-your-website
Looks like we have some smooshing to do!
Dana