Site Speed
-
I was wondering what benefits there are to investing the time and money into speeding up an eCommerce site. We are currently averaging 3.4 seconds of load time per page, and I know from Webmaster Tools that their benchmark is closer to 1.5 seconds. Is it worth it to get to 1.5 seconds? Any tips for doing this?
Thanks
-
@JustDucky We recently migrated to a new data center and the average loading time dropped from ~4 seconds to ~0.9. I too noticed only a 1-2% drop in bounce rate. It seems only that many people were put off by the loading times. Then again, 1-2% could be anything.
@John O'Haver I would invest the time simply because ~3.4 is the average value. This means that sometimes it goes up to 10 seconds or even more. Take a look at your analytics account and see the performance per country. Also, I've been benchmarking analytics against remote monitoring solutions and I find a discrepancy of about 30% (probably due to the limited sample data from analytics). I don't want to advertise any particular solution, but trying one won't hurt. You may find your times are better than you think (I hope).
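To illustrate why the average can mislead, here is a minimal sketch using Python's standard `statistics` module. The sample load times are hypothetical (not real analytics data), chosen so the mean comes out at 3.4 seconds while the slowest visits are far worse:

```python
from statistics import mean, quantiles

# Hypothetical per-pageview load times in seconds (illustrative only).
samples = [1.1, 1.4, 1.8, 2.1, 2.4, 2.8, 3.2, 3.9, 4.7, 5.5, 8.5]

avg = mean(samples)                 # the figure analytics reports
deciles = quantiles(samples, n=10)  # 9 cut points: 10th..90th percentile
p90 = deciles[8]                    # 90th percentile load time

print(f"average: {avg:.1f}s  90th percentile: {p90:.1f}s")
```

A 3.4-second average here hides a 90th percentile near 8 seconds, which is why looking at the distribution (and per-country breakdowns) matters more than the single average.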
-
Cypra correctly points out that faster sites make for a better user experience, and Alan pointed out how inexpensive a CDN can be. I installed a CDN on a site that already uses W3 Total Cache (W3TC). Page load speeds were cut in half, but the bounce rate (which is very high) dropped by only 1 or 2%.
Has anyone with multiple sites sampled their bounce rates before and after installing a CDN?
-
As Doug just said, there is a strong correlation between page speed and user experience. When a user has to wait for a page or an element to load before getting to the information, there is a higher bounce rate. Since bounce rate is a strong indicator of user satisfaction that will sooner or later be folded into algorithmic ranking factors, it's good to address it right from the conception phase.
-
It's not just the search engines you need to consider. Is the speed of your site affecting user experience? Are people giving up because it's just too slow? How many abandoned sessions are you getting? Do you have any opportunity to get feedback from your users?
-
Matt Cutts has said that you need to be pretty slow to incur a penalty; less than 1% of sites fall into that category.
It all depends on what is taking so long. Is it the download, slow code, or the server?
If downloads are the problem, I would look into using a content delivery network (CDN), which in short means hosting your images and static files in the cloud. I use Microsoft Azure cloud services. This will cost you very little in money, possibly as little as $1 a month.
You can also use this tool from Google to get suggestions, but using a CDN would be the biggest gain.
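One rough way to answer the "is it the download, the code, or the server?" question is to time the first byte separately from the full download. This is a minimal client-side sketch using only Python's standard library; the URL in the usage comment is a placeholder, not a real endpoint:

```python
import time
from urllib.request import urlopen

def measure(url, timeout=10):
    """Return (time_to_first_byte, total_time, body_size) for one GET."""
    start = time.perf_counter()
    resp = urlopen(url, timeout=timeout)
    first = resp.read(1)                 # block until the first byte arrives
    ttfb = time.perf_counter() - start
    body = first + resp.read()           # then pull down the rest
    total = time.perf_counter() - start
    return ttfb, total, len(body)

# Example (hypothetical URL):
# ttfb, total, size = measure("https://www.example.com/")
```

A high time-to-first-byte points at slow server-side code or hosting; a large gap between TTFB and total time points at payload weight, which is exactly what a CDN helps with.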