Site Speed
-
I was wondering what benefits there are to investing the time and money into speeding up an eCommerce site. We are currently averaging 3.4 seconds of load time per page, and I know from webmaster tools that the benchmark is closer to 1.5 seconds. Is it worth it to get to 1.5 seconds? Any tips for doing this?
Thanks
-
@JustDucky We recently migrated to a new data center and the average loading time dropped from ~4 seconds to ~0.9 seconds. I too noticed only a 1-2% drop in bounce rate, so it seems only that small a share of visitors were put off by the loading times. Then again, a 1-2% shift can be attributable to anything.
@John O'Haver I would invest the time, simply because ~3.4 seconds is the average value, which means it sometimes climbs to 10 seconds or more. Take a look at your analytics account and check the performance per country. Also, I've been benchmarking analytics against remote monitoring solutions and I find a discrepancy of about 30% (probably due to the limited sample data from analytics). I don't want to advertise any particular solution, but trying one won't hurt. You may find your times to be better than you think (I hope).
-
Cypra correctly points out that faster sites make for a better user experience, and Alan pointed out how inexpensive a CDN can be. I installed a CDN on a site that already uses WP3TC. Page load times were cut in half, but the bounce rate (which is very high) dropped by only 1 or 2%.
Has anyone with multiple sites sampled their bounce rates before and after installing a CDN?
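For anyone comparing bounce rates before and after a CDN install: whether a 1-2% drop is real or just noise depends on the sample size, and a two-proportion z-test gives a rough answer. A minimal sketch, with hypothetical visit counts (not data from any site in this thread):

```python
# Rough two-proportion z-test: is a bounce-rate drop statistically meaningful?
# The before/after counts below are hypothetical, for illustration only.
from math import sqrt, erf

def bounce_drop_z(bounces_a, visits_a, bounces_b, visits_b):
    """Return (z, two-sided p-value) for the difference in two bounce rates."""
    p_a = bounces_a / visits_a
    p_b = bounces_b / visits_b
    pooled = (bounces_a + bounces_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. 80% bounce on 5,000 visits before, 78% on 5,000 visits after
z, p = bounce_drop_z(4000, 5000, 3900, 5000)
```

With 5,000 visits on each side, a two-point drop comes out significant; with only a few hundred visits per side, the same drop would be indistinguishable from noise.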
-
As Doug just said, there is a strong correlation between page speed and user experience: when a user has to wait for a page or an element to load before getting the information, the bounce rate is higher. Since bounce rate is a strong indicator of user satisfaction that will sooner or later be incorporated into algorithmic ranking factors, it's good to address it right from the conception phase.
-
It's not just the search engines you need to consider. Is the speed of your site affecting user experience? Are people giving up because it's just too slow? How many abandoned sessions are you getting? Do you have any opportunity to get feedback from your users?
-
Matt Cutts has said that you need to be pretty slow to incur a penalty; less than 1% of sites fall into this category.
It all depends on what is taking so long. Is it the downloads, slow code, or the server?
If downloads are the problem, I would look into using a content delivery network (CDN), which in short means hosting your images and static files in the cloud. I use Microsoft Azure cloud services. This will cost you very little; it could be as little as $1 a month.
You can also use Google's page speed tool to get suggestions, but using a CDN would be the biggest gain.
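As an illustration of what "hosting your images and static files in the cloud" amounts to at the HTML level, here is a minimal sketch that rewrites root-relative static asset paths to a CDN hostname. The `cdn.example.com` host and the extension list are placeholders, not any specific provider's setup:

```python
# Sketch: point static asset URLs at a CDN host so browsers fetch them from
# edge servers instead of the origin. The CDN hostname is a placeholder.
import re

CDN_HOST = "https://cdn.example.com"
STATIC_EXTS = ("css", "js", "png", "jpg", "gif", "svg", "woff2")

def cdnify(html: str) -> str:
    """Rewrite root-relative static asset references to the CDN host."""
    pattern = re.compile(
        r'(src|href)="(/[^"]+\.(?:%s))"' % "|".join(STATIC_EXTS)
    )
    return pattern.sub(lambda m: f'{m.group(1)}="{CDN_HOST}{m.group(2)}"', html)

print(cdnify('<img src="/images/logo.png"> <a href="/about">About</a>'))
```

In practice most CMS and caching plugins (WP3TC among them) do this rewriting for you; the sketch just shows the mechanism. Page links like `/about` are left alone, since only static files benefit from edge caching.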