PageSpeed vs. Page Size
-
Hi,
We all know that Google doesn't like slow-loading pages, fair enough! However, for one of my websites, user interactivity is key to its success. Each of my pages is fairly large (in the range of 1.8 to 2.5 MB) because it has a lot of pictures, CSS, and at times some JavaScript elements.
That said, I have tried to ensure that the code is optimized - for example, HTML minified and compressed, caching enabled, images optimized and served through a CDN, etc. In spite of the high page size, my GTmetrix PageSpeed score is 93+ for most pages.
Even so, the number of requests served is 100+ and the page load time is 4.5s+ according to GTmetrix and Pingdom.
My question is: should this matter from an SEO perspective? Is Google likely to penalize me for a high loading time even though I am serving highly optimized pages? I really don't want to cut down on the interactivity of my website unless I have to for SEO reasons.
Please suggest. Here is my homepage, just to give you an idea of what I am talking about:
-
Thanks Cyrus and Max,
Very good answers, and I am going to work through your suggestions.
-
As Max said, from a ranking perspective, Time to First Byte seems to be the most important factor. The author of that post also offered some tips for improving Time to First Byte: http://moz.com/blog/improving-search-rank-by-optimizing-your-time-to-first-byte
Oftentimes, you simply have a lot of assets to load and it's difficult to cut anything back. In those cases, the order in which things load becomes increasingly important for user experience (asynchronous JavaScript loading, for example).
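To make that concrete, here's a minimal sketch of the markup difference (the script names are made up for illustration):

```html
<!-- Default: render-blocking. The HTML parser halts until this script downloads and executes. -->
<script src="/js/app.js"></script>

<!-- async: downloads in parallel and runs as soon as it arrives (execution order not guaranteed). Good for independent scripts like analytics. -->
<script async src="/js/analytics.js"></script>

<!-- defer: downloads in parallel but runs only after the HTML is parsed, in document order. Good for scripts that manipulate the DOM. -->
<script defer src="/js/ui-widgets.js"></script>
```

Moving non-critical scripts to async or defer doesn't reduce total page weight, but it gets content on screen sooner, which is what users actually perceive.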
Regardless, doing everything you can to improve speed and checking with Google PageSpeed Insights is usually the best advice. I've never, ever seen a website where improving speed performance didn't help with traffic metrics (whether rankings or engagement), so I believe it's an investment worth making.
-
What Google really cares about is TTFB (Time To First Byte). To check it, head to Google Webmaster Tools, under Crawl Stats.
To date, the general consensus is that above 1s is bad and Google could penalize you, while below 0.5s is good and Google could improve your ranking a little bit.
Google suggests using WebPageTest to check a website's performance. If you run the test for your website you will see the TTFB is not that bad: http://www.webpagetest.org/result/141124_MF_14DY/
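If you want a quick sanity check from the browser itself, the Navigation Timing API exposes the same measurement - a minimal sketch you could drop into a page:

```html
<script>
  // Rough in-browser TTFB check via the Navigation Timing API.
  // responseStart - requestStart = time from sending the request
  // to receiving the first byte of the response (server "think time").
  window.addEventListener("load", function () {
    var t = performance.timing;
    console.log("TTFB: " + (t.responseStart - t.requestStart) + " ms");
  });
</script>
```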
Your overall load time is 10s, though, and I agree that's too much: it's likely worsening your user experience, increasing your bounce rate, and alienating some of your visitors. You should work to improve it. WebPageTest suggests compressing images and leveraging browser caching, which are good suggestions.
Analyze the waterfall closely to investigate further and identify other areas for intervention.
-
Hi there,
I think it would improve page load if the YouTube video were the last thing to load.
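One way to do that is a click-to-load placeholder: show only the video thumbnail and inject the heavy YouTube iframe on demand. A rough sketch, where VIDEO_ID is a placeholder and the dimensions are arbitrary:

```html
<!-- Lightweight thumbnail instead of the full YouTube embed -->
<div id="video-slot" style="cursor: pointer;">
  <img src="https://img.youtube.com/vi/VIDEO_ID/hqdefault.jpg" alt="Play video">
</div>
<script>
  // Swap the placeholder for the real iframe only when clicked,
  // so the embed's scripts and assets never block the initial page load.
  document.getElementById("video-slot").addEventListener("click", function () {
    var iframe = document.createElement("iframe");
    iframe.src = "https://www.youtube.com/embed/VIDEO_ID?autoplay=1";
    iframe.width = "560";
    iframe.height = "315";
    iframe.setAttribute("allowfullscreen", "");
    this.innerHTML = "";
    this.appendChild(iframe);
  });
</script>
```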
Hope it helps you.
-
You are right! Which is why I don't want to compromise on usability. Thanks for your response.
-
Give it some time! It should be OK. The main question with speed should be whether your users are fine with it. Think of people before SEO and you'll be fine!
-
Thanks for your response, but the images are probably as optimized as they can be. I use ImageOptim for Mac to optimize them; they are all JPEGs (stripped of all metadata) and served as (mildly) lossy WebP on supported browsers.
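In case it's useful context, browser-dependent WebP serving can be handled either server-side (content negotiation on the Accept header) or in markup with the picture element - a simplified sketch with made-up filenames:

```html
<picture>
  <!-- Browsers that understand WebP pick this source... -->
  <source srcset="hero.webp" type="image/webp">
  <!-- ...everything else falls back to the optimized JPEG -->
  <img src="hero.jpg" alt="Homepage hero">
</picture>
```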
Do you feel there might be anything else that I could do?
-
I'm sure you could work on the optimization a bit more, especially of the images.
Nonetheless, if you require the same structure and you are unable to change the size, then I would not worry so much about it. Having a fast website is only one of the hundreds of different factors that affect SEO. Just work on the other factors and it will be fine!