Site Speed
-
I was wondering what benefits there are to investing the time and money into speeding up an eCommerce site. We currently average 3.4 seconds of load time per page, and I know from Webmaster Tools that they hold the benchmark to be closer to 1.5 seconds. Is it worth the effort to get to 1.5 seconds? Any tips for doing this?
Thanks
-
@JustDucky We recently migrated to a new data center and the average loading time dropped from ~4 seconds to ~0.9. I too noticed only a 1-2% drop in bounce rate; it seems only that many people were turned off by the loading times. Then again, 1-2% could be attributable to anything.
@John O'Haver I would invest the time, simply because ~3.4 seconds is the average value, which means it sometimes climbs to 10 seconds or more. Take a look at your analytics account and check performance per country. Also, I've been benchmarking analytics against remote monitoring solutions and I find a discrepancy of about 30% (probably due to the limited sample data from analytics). I don't want to advertise any particular solution, but trying one won't hurt. You may find your times are better than you think (I hope).
-
Cypra correctly points out that faster sites make for a better user experience, and Alan pointed out how inexpensive a CDN can be. I installed a CDN on a site that already uses W3 Total Cache. Page load times were cut in half, but the bounce rate (which is very high) dropped by only 1 or 2%.
Has anyone with multiple sites sampled their bounce rates before and after installing a CDN?
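One way to judge whether a 1-2% bounce-rate drop is a real effect or just noise is a two-proportion z-test on session counts before and after the change. This is an illustrative sketch with made-up numbers, not data from any site in this thread:

```python
import math

def bounce_rate_z_test(bounces_a, sessions_a, bounces_b, sessions_b):
    """Two-proportion z-test: is the change in bounce rate
    between period A and period B statistically significant?"""
    p_a = bounces_a / sessions_a
    p_b = bounces_b / sessions_b
    # Pooled proportion under the null hypothesis (no real change)
    p = (bounces_a + bounces_b) / (sessions_a + sessions_b)
    se = math.sqrt(p * (1 - p) * (1 / sessions_a + 1 / sessions_b))
    z = (p_a - p_b) / se
    return p_a, p_b, z

# Hypothetical numbers: 10,000 sessions before and after installing a CDN
before, after, z = bounce_rate_z_test(5200, 10000, 5050, 10000)
print(f"before={before:.1%}  after={after:.1%}  z={z:.2f}")
```

With these made-up figures, |z| comes out above 1.96, so a 1.5-point drop over 10,000 sessions each side would be significant at the 95% level; the same percentage drop over a few hundred sessions would not be.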
-
As Doug just said, there is a strong correlation between page speed and user experience: when a user has to wait for a page or an element to load before getting the information, the bounce rate is higher. Since bounce rate is a strong indicator of user satisfaction that will sooner or later be factored into algorithmic ranking, it's good to address it right from the conception phase.
-
It's not just the search engines you need to consider. Is the speed of your site affecting user experience? Are people giving up because it's just too slow? How many abandoned sessions are you getting? Do you have any opportunity to get feedback from your users?
-
Matt Cutts has said that you need to be pretty slow to incur a penalty; less than 1% of sites fall into this category.
It all depends on what is taking so long. Is it the download size, slow code, or the server?
If download size is the problem, I would look into using a content delivery network (CDN), in short, hosting your images and static files in the cloud. I use Microsoft Azure cloud services. This will cost you very little; it could be as little as $1 a month.
You can also use this tool from Google to get suggestions, but using a CDN would be the best gain.
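To tell server slowness (slow code or backend) apart from download weight, it helps to compare time-to-first-byte with total download time. A minimal sketch using Python's standard library; the URL in the comment is a placeholder, not a real endpoint:

```python
import time
import urllib.request

def timing_breakdown(url):
    """Measure time-to-first-byte (TTFB) vs. total download time.
    A high TTFB points at the server or code; a large gap between
    TTFB and total time points at payload size (where a CDN helps)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)                      # block until the first byte arrives
        ttfb = time.perf_counter() - start
        body = resp.read()                # download the rest of the payload
        total = time.perf_counter() - start
    return ttfb, total, len(body) + 1

# Example usage (placeholder URL):
# ttfb, total, size = timing_breakdown("https://www.example.com/")
# print(f"TTFB: {ttfb:.2f}s  total: {total:.2f}s  size: {size} bytes")
```

Running this a few times against your own pages from different locations gives a rough answer to "is it the server or the download?" before you spend money on either.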