I need help with my web page load time; it's very bad!
-
Note: This is KILLING my customer experience.
Here is my webpage: http://www.stbands.com
Here is a speed test that may help you (look at the poor ratings in the upper corner)
http://www.webpagetest.org/result/110628_MW_Y8CQ/1/details/
I have an F on "Cache Static Content" - anyone know how I can fix this?
Also, it is an e-commerce website hosted through Core Commerce. I have some access to the code but not all of it; some of it is dynamic. However, if you tell me specific things, I can forward them to their very awesome tech department. They are very willing to work with me and are now considering implementing a CDN after I schooled them.
Any help is greatly appreciated. Don't be afraid to get very technical - I may not understand it, but the engineers there will.
-
John - Thanks, I'll start here. I'm not sure why they are set up like this (facepalm).
-
You might also want to try Google's Page Speed plugin, http://code.google.com/speed/page-speed/; it identifies issues and gives fix suggestions that you can pass to your tech guys/gals.
-
My load time went down by 50%, though mine isn't an eCommerce site. I would say yes, but I would definitely look into all the variables with eCommerce first. It should be great, though.
-
My site uses SSL, which means I have to pay $20/month. I need to make sure it is going to be worth it before I do that. Is it?
-
Sure, there are two areas to address in your server config. One is the Cache-Control header returned with each response. I set a longer cache period for all images and scripts to save users downloading new copies time and again, e.g. Cache-Control: max-age=3600, must-revalidate - see more here: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html
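A quick way to sanity-check that directive: this minimal Python sketch (my own illustration, not anything from Core Commerce) pulls the max-age value out of a Cache-Control string, so you can verify what your server is actually sending:

```python
def max_age_seconds(cache_control: str) -> int:
    """Return the max-age value (in seconds) from a Cache-Control
    header string, or 0 if no max-age directive is present."""
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            try:
                return int(directive.split("=", 1)[1])
            except ValueError:
                return 0
    return 0

# The header suggested above caches assets for one hour.
print(max_age_seconds("max-age=3600, must-revalidate"))  # 3600

# A missing max-age means the browser must re-fetch every time,
# which is what an F on "Cache Static Content" usually reflects.
print(max_age_seconds("no-store"))  # 0
```

You could pair this with a loop over the asset URLs from the WebPageTest report, fetching each response's headers and flagging anything with a lifetime of zero.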
Then I would also set some rules around returning a 304 (Not Modified) status for page furniture and other assets which do not change frequently.
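To make that 304 rule concrete, here is a hedged sketch of the server-side decision (my own simplification, not Core Commerce's actual code): the server compares the validators the browser sends back against the asset's current ETag and last-modified date, and answers 304 Not Modified when they still match:

```python
def should_send_304(request_headers: dict, etag: str, last_modified: str) -> bool:
    """Return True if the server can answer 304 Not Modified instead of
    resending the asset. If-None-Match (ETag) takes precedence over
    If-Modified-Since. Date comparison here is plain string equality,
    a simplification; real servers parse the HTTP date."""
    if_none_match = request_headers.get("If-None-Match")
    if if_none_match is not None:
        return if_none_match == etag
    if_modified_since = request_headers.get("If-Modified-Since")
    if if_modified_since is not None:
        return if_modified_since == last_modified
    return False  # no validators sent: a full 200 response is required
```

The payoff is that repeat visitors pay only a cheap header round trip for unchanged logos, scripts, and stylesheets instead of re-downloading them.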
Aside from this, as far as I am aware, you should ensure your stack is optimised. The recent Search Insight session from Google was interesting; in that presentation they talked a lot about the average load and latency times they see, which are useful as a benchmark when tuning your own speed.
Cheers,
Damien
-
A lot of items on your home page are being served securely, which isn't necessary and will prevent the browser from caching them properly. For example:
- https://www.stbands.com/javascript/jquery/jquery.min.js
- https://www.stbands.com/css/dynamic-css.php?currentlyActivePageId=1
- https://www.stbands.com/uploads/image/Custom Wristbands(1).jpg
- https://www.stbands.com/images/categories/783.jpg
- https://www.stbands.com/images/categories/785.jpg
- https://www.stbands.com/images/categories/786.jpg
- https://www.stbands.com/images/categories/787.jpg
- https://www.stbands.com/images/categories/788.jpg
Since it's not a secure page, I wouldn't serve all of these securely; I'd use http:// instead of https://.
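If it helps your tech department, here is a rough Python sketch (my own illustration, standard library only) that scans a page's HTML for img/script/link assets referenced over https://, so they can be flagged and switched to http://:

```python
from html.parser import HTMLParser

class HttpsAssetFinder(HTMLParser):
    """Collect asset URLs (img src, script src, link href) that use
    https:// so they can be switched to http:// on non-secure pages."""

    def __init__(self):
        super().__init__()
        self.https_assets = []

    def handle_starttag(self, tag, attrs):
        if tag not in ("img", "script", "link"):
            return
        attrs = dict(attrs)
        url = attrs.get("src") or attrs.get("href")
        if url and url.startswith("https://"):
            self.https_assets.append(url)

# Example using two of the asset URLs from the list above.
finder = HttpsAssetFinder()
finder.feed('<img src="https://www.stbands.com/images/categories/783.jpg">'
            '<script src="http://www.stbands.com/javascript/jquery/jquery.min.js"></script>')
print(finder.https_assets)  # only the https:// image is flagged
```

Running this over the home-page HTML would give the tech department a concrete checklist of references to change.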
-
I am a huge fan of Cloudflare!
Related Questions
-
Can you use multiple videos without sacrificing load times?
We're using a lot of videos on our new website (www.4com.co.uk), but our immediate discovery has been that this has a negative impact on load times. We use a third party (Vidyard) to host our videos but we also tried YouTube and didn't see any difference. I was wondering if there's a way of using multiple videos without seeing this load speed issue or whether we just need to go with a different approach. Thanks all, appreciate any guidance! Matt
Technical SEO | MattWatts
-
Why are crawlers not picking up these pages?
Hi there, I've been asked to audit a new subdomain for a travel company. It's all a bit messy, so it's going to take some time to remedy. However, one thing I couldn't understand was the low number of pages appearing in certain crawlers. The subdomain has many pages: a homepage, category pages, then product pages. Unfortunately, tools like Screaming Frog and xml-sitemaps.com are only picking up 19 pages and I can't figure out why. Google has so far indexed around 90 pages - this is by no means all of them, but that's probably because of the new domain and lack of sitemap etc. After looking at the crawl results, only the homepage and category (continent) pages are showing, so all the product pages are not. For example, tours.statravel.co.uk/trip/Amsterdam_Kings_Day_(Start_London_end_London)-COCCKDM11 is not appearing in the crawl results. After reviewing the source code, I can't see anything that would prevent this page being crawled. Am I missing something? At the moment, the crawl should be picking up around 400+ product pages, but it's not picking up any. Thanks
Technical SEO | PeaSoupDigital
-
Check my website loading time
Kindly check my website loading time for the home page and deep pages. Do I need to make it faster, or is it okay? Website: brandstenmedia.com.au
Technical SEO | Green.landon
-
Page not cached
Hi there, we uploaded a page but unfortunately didn't realise it had noindex,nofollow in the meta tags. Google had cached it and then de-cached it (I guess that's possible), it seems? Now it will not cache, even though the correct meta tags have been put in and we have sent links to it internally and externally. Anyone know why this page isn't being cached? The internal link to it is on the homepage and that gets cached almost every day. I even submitted it to Webmaster Tools to index.
Technical SEO | pauledwards
-
How Does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one thing in specific: Google continuously crawls websites and stores each page it finds (let's call it "page directory"). Google's "page directory" is a cache, so it isn't the "live" version of the page. Google has separate storage called "the index" which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords. When someone searches a keyword, that keyword is accessed in the "index" and returns all relevant pages in the "page directory". These returned pages are given ranks based on the algorithm. The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a URL in the "page directory", and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as the live website (and would the keywords in the "index" point to these URLs)? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that URL in Google's cache? The reason I want to discuss this is to understand the effects of changing a page's URL by understanding how the search process works better.
Technical SEO | reidsteven75
-
Website ranking went from page one to not in top 50 overnight. Help/suggestions?
One of our customer's websites initially ranked very well. For approximately 3 months it sat atop of Google for their optimized keywords. Suddenly, on November 17th, the ranking dropped and they were no longer in the top 50 for any keywords. We went through Google Webmaster tools and found no violations, so we emailed Google to see if we violated something and if they would reconsider. They responded "We reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google." This is a site built on WordPress, so we turned off a majority of plugins in case one was somehow affecting the site. They have an incredible amount of business partners that link their website from their partner's website menus, so they have about 15,000 links all with anchor text "insurance." (every page on partner site is seen as a different link). Think this is affecting it? Maybe Google sees it as artificial? (P.S. This has been set up this way for a while before they came on with us). The site ranks on page one of Bing and Yahoo, but nowhere in top 50 for Google. Any suggestions? Appreciate the help!
Technical SEO | Tosten
-
Do we need to manually submit a sitemap every time, or can we host it on our site as /sitemap and Google will see & crawl it?
I realized we don't have a sitemap in place, so we're going to get one built. Once we do, I'll submit it manually to Google via Webmaster tools. However, we have a very dynamic site with content constantly being added. Will I need to keep manually re-submitting the sitemap to Google? Or could we have the continually updating sitemap live on our site at /sitemap and the crawlers will just pick it up from there? I noticed this is what SEOmoz does at http://www.seomoz.org/sitemap.
Technical SEO | askotzko
-
Page MozRank and MozTrust 0 for Home Page, Makes No Sense?
Hey Mozzers! I'm a bit confused by a site that is showing a 0 for home page MozRank and MozTrust, while its subdomain and root domain metrics look decent (relatively). I am posting images of the page metrics and subdomain metrics to show the disparity: http://i.imgur.com/3i0jq.png http://i.imgur.com/ydfme.png Is it normal to see this type of disparity? The home page has very little inbound links, but the big goose egg has me wondering if there is something else going on. Has anyone else experienced this? Or, does anyone have speculation as to why a home page would have a 0 MozRank while the subdomain metrics look much better? Thanks!
Technical SEO | ClarityVentures