I need help with my web page load time; it's very bad!
-
Note: This is KILLING my customer experience.
Here is my webpage: http://www.stbands.com
Here is a speed test that may help you (look at the poor ratings in the upper corner)
http://www.webpagetest.org/result/110628_MW_Y8CQ/1/details/
I have an F on "Cache Static Content" - does anyone know how I can fix this?
Also, it is an e-commerce website hosted through CoreCommerce. I have some access to the code but not all of it; some of it is dynamic. However, if you tell me specific things, I can forward them to their very awesome tech department. They are very willing to work with me and are now considering implementing a CDN after I schooled them.
Any help is greatly appreciated. Don't be afraid to get very technical - I may not understand it, but the engineers there will.
-
John - Thanks, I'll start here. I'm not sure why they are set up like this (facepalm).
-
You might also want to try the Google Page Speed plugin, http://code.google.com/speed/page-speed/. It identifies issues and gives fix suggestions that you can pass along to your tech guys/gals.
-
My load time went down by 50%, though my site is not e-commerce. I would say yes, but I would definitely look into all the variables with e-commerce first. Should be great though.
-
My site uses SSL, which means I have to pay $20 / month. I need to make sure it is going to be worth it before I do that. Is it?
-
Sure, there are two areas to address in your server config. One is the Cache-Control header returned with each response. I set a longer cache period for all images and scripts to save users downloading new copies time and again, e.g. max-age=3600, must-revalidate; see more here: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html
Then I would also set some rules around returning a 304 (Not Modified) status for page furniture and other assets that do not change frequently.
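As a rough sketch of what that Cache-Control advice could look like, assuming the site runs on Apache with mod_headers enabled (I don't know CoreCommerce's actual stack, so their tech team would need to adapt this):

```apacheconf
# Minimal sketch: let browsers cache static assets for an hour,
# then revalidate (matching the max-age=3600, must-revalidate
# example above). Assumes Apache with mod_headers enabled.
<IfModule mod_headers.c>
  <FilesMatch "\.(jpe?g|png|gif|css|js)$">
    Header set Cache-Control "max-age=3600, must-revalidate"
  </FilesMatch>
</IfModule>
```

For truly static assets you could go much longer than an hour, but a short max-age with revalidation is a safe starting point on an e-commerce site.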
Aside from this, as far as I am aware, you should ensure your stack is optimised. The recent Search Insight session from Google was interesting: they talked a lot about the average load and latency times they see, which is useful as a benchmark when tuning your own speed.
Cheers,
Damien
-
A lot of items on your home page are being served securely, which isn't necessary and will prevent the browser from caching them properly. For example:
- https://www.stbands.com/javascript/jquery/jquery.min.js
- https://www.stbands.com/css/dynamic-css.php?currentlyActivePageId=1
- https://www.stbands.com/uploads/image/Custom Wristbands(1).jpg
- https://www.stbands.com/images/categories/783.jpg
- https://www.stbands.com/images/categories/785.jpg
- https://www.stbands.com/images/categories/786.jpg
- https://www.stbands.com/images/categories/787.jpg
- https://www.stbands.com/images/categories/788.jpg
Since it's not a secure page, I wouldn't serve all of these securely; I'd use http:// instead of https://.
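One quick way to fix those references in bulk is to rewrite hard-coded https:// asset URLs to protocol-relative // URLs, so assets inherit whatever scheme the page itself loaded with. This is just an illustrative sketch (the function name and regex are mine, not CoreCommerce's code; a real fix would happen in their templates):

```python
import re

def make_assets_protocol_relative(html: str) -> str:
    """Rewrite hard-coded http(s):// URLs in src/href attributes to
    protocol-relative // URLs, so assets match the page's own scheme.

    Hypothetical helper for illustration only; the real fix belongs
    in the server-side templates.
    """
    return re.sub(r'\b(src|href)="https?://', r'\1="//', html)

tag = '<img src="https://www.stbands.com/images/categories/783.jpg">'
print(make_assets_protocol_relative(tag))
# -> <img src="//www.stbands.com/images/categories/783.jpg">
```

Note that protocol-relative URLs also avoid mixed-content warnings if a page is later served over HTTPS, which a plain http:// rewrite would not.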
-
I am a huge fan of Cloudflare!