I need help with my web page load time; it's very bad!
-
Note: This is KILLING my customer experience.
Here is my webpage: http://www.stbands.com
Here is a speed test that may help you (look at the poor ratings in the upper corner):
http://www.webpagetest.org/result/110628_MW_Y8CQ/1/details/
I have an F on "Cache Static Content" - anyone know how I can fix this?
Also, it is an e-commerce website hosted through CoreCommerce. I have some access to the code but not all of it; some of it is dynamic. However, if you tell me specific things, I can forward them to their very awesome tech department. They are very willing to work with me and are now considering implementing a CDN after I schooled them.
Any help is greatly appreciated. Don't be afraid to get very technical - I may not understand it, but the engineers there will.
-
John - Thanks, I'll start here. I'm not sure why they are set up like this (facepalm).
-
You might also want to try the Google Page Speed plugin, http://code.google.com/speed/page-speed/; it identifies issues and gives fix suggestions that you can pass on to your tech guys/gals.
-
My load time went down by 50%, though my site isn't e-commerce. I would say yes; however, I would definitely look into all the variables with e-commerce. It should be great, though.
-
My site uses SSL, which means I would have to pay $20/month. I need to make sure it is going to be worth it before I do that. Is it?
-
Sure, there are two areas to address in your server config. One is the Cache-Control header returned with each response. I set a longer cache period for all images and scripts to save users from downloading new copies time and again, e.g. Cache-Control: max-age=3600, must-revalidate - see more here: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html
Then I would also set some rules around returning a 304 (Not Modified) status for page furniture and other assets which do not change frequently.
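Since at least part of the platform appears to be PHP (there's a dynamic-css.php stylesheet on the home page), here is a rough, hypothetical sketch of how a dynamically generated asset could send those headers and answer conditional requests with a 304. The build_css() function and the parameter handling are placeholders, not CoreCommerce's actual code, so treat it as something to hand to their engineers rather than a drop-in fix:

```php
<?php
// Hypothetical stand-in for whatever actually builds the stylesheet.
function build_css($pageId)
{
    return "/* styles for page {$pageId} */\nbody { margin: 0; }";
}

$pageId = isset($_GET['currentlyActivePageId']) ? (int) $_GET['currentlyActivePageId'] : 1;
$css    = build_css($pageId);

// Cache for a day, but let browsers revalidate cheaply after that.
$etag = '"' . md5($css) . '"';
header('Cache-Control: public, max-age=86400, must-revalidate');
header('ETag: ' . $etag);

// If the browser already has this exact version, answer 304 Not Modified
// instead of resending the whole body.
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
    trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    header($_SERVER['SERVER_PROTOCOL'] . ' 304 Not Modified');
    exit;
}

header('Content-Type: text/css');
echo $css;
```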
Aside from this, as far as I am aware, you should ensure your stack is optimised. The recent Search Insight session from Google was interesting; in that presentation they talked a lot about the average load and latency times they see, which is useful as a benchmark when tuning your own speed.
Cheers,
Damien
-
A lot of items on your home page are being served securely (over HTTPS), which isn't necessary and will prevent the browser from caching them properly. For example:
- https://www.stbands.com/javascript/jquery/jquery.min.js
- https://www.stbands.com/css/dynamic-css.php?currentlyActivePageId=1
- https://www.stbands.com/uploads/image/Custom Wristbands(1).jpg
- https://www.stbands.com/images/categories/783.jpg
- https://www.stbands.com/images/categories/785.jpg
- https://www.stbands.com/images/categories/786.jpg
- https://www.stbands.com/images/categories/787.jpg
- https://www.stbands.com/images/categories/788.jpg
Since it's not a secure page, I wouldn't be serving all of these securely. I'd use http:// instead of https://.
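The real fix is simply editing the templates so non-secure pages reference http:// (or protocol-relative //) URLs. As a purely illustrative sketch, assuming a PHP templating layer, a small helper could pick the scheme to match the current page; the function name and usage below are made up, not part of CoreCommerce:

```php
<?php
// Hypothetical helper: build asset URLs that match the scheme of the page
// being served, so ordinary pages reference cacheable http:// assets while
// checkout/secure pages keep https://.
function asset_url($path)
{
    $isSecure = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';
    $scheme   = $isSecure ? 'https' : 'http';
    return $scheme . '://www.stbands.com/' . ltrim($path, '/');
}

// Example: emit a script tag and an image tag using scheme-matched URLs.
echo '<script src="' . asset_url('javascript/jquery/jquery.min.js') . '"></script>' . "\n";
echo '<img src="' . asset_url('images/categories/783.jpg') . '" alt="Category">' . "\n";
```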
-
I am a huge fan of Cloudflare!