PageSpeed vs. Page Size
-
Hi,
We all know that Google doesn't like slow-loading pages, fair enough! However, for one of my websites, user interactivity is key to its success. Now, each of my pages is fairly large (on the order of 1.8 to 2.5 MB) because it contains a lot of pictures, CSS, and at times some JavaScript elements.
That said, I have tried to ensure that the code is optimized: the HTML is minified and compressed, caching is enabled, and images are optimized and served through a CDN. In spite of the large page size, my GTmetrix PageSpeed score is 93+ for most pages.
On the other hand, the number of requests served is 100+ and the page load time is 4.5s+ according to GTmetrix and Pingdom.
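If it helps anyone reproduce this, compression and caching can be spot-checked with a short script. Here is a minimal sketch using only Python's standard library (the `audit_response` helper is just illustrative, not a GTmetrix or Moz tool):

```python
import urllib.request

def audit_response(url: str) -> dict:
    # Fetch the URL while advertising gzip support, and report the
    # response headers that matter for a quick page-speed audit.
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return {
            "status": resp.status,
            "content_encoding": resp.headers.get("Content-Encoding"),
            "cache_control": resp.headers.get("Cache-Control"),
        }
```

Running it against a page shows whether `Content-Encoding: gzip` and a `Cache-Control` header actually come back from the server.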
My question is: should this matter from an SEO perspective? Is Google likely to penalize me for a high loading time even though I am serving highly optimized pages? I really don't want to cut down on the interactivity of my website unless I have to for SEO reasons.
Please suggest. Here is my homepage, just to give you an idea of what I am talking about:
-
Thanks Cyrus and Max,
Very good answers, and I am going to work on your suggestions.
-
As Max said, from a ranking perspective, Time to First Byte (TTFB) seems to be the most important factor. The author of that post also offered some tips for improving time to first byte: http://moz.com/blog/improving-search-rank-by-optimizing-your-time-to-first-byte
Oftentimes, you simply have a lot of assets to load and it's difficult to cut anything back. In these cases, the order in which things load becomes increasingly important for user experience (asynchronous JavaScript, for example).
Regardless, doing everything you can to improve speed and checking with Google PageSpeed Insights is usually the best advice. I've never, ever seen a website where improving speed performance didn't help with traffic metrics (whether rankings or engagement), so I believe it's an investment worth making.
-
What Google really cares about is TTFB (Time to First Byte); to check it, just head to Google Webmaster Tools (GWT), under Crawl Stats.
To date, the general consensus is that above 1s is bad and Google could penalize you, while below 0.5s is good and Google could improve your ranking a little bit.
Google suggests using WebPageTest to check a website's performance; if you run the test for your website, you will see the TTFB is not that bad: http://www.webpagetest.org/result/141124_MF_14DY/
Your overall load time is 10s, and I agree that is too much: it presumably worsens your user experience, increasing your bounce rate and alienating some of your visitors. You should work to improve it. WebPageTest suggests compressing images and leveraging browser caching, which are good suggestions.
Analyze the waterfall closely to investigate further and identify other areas for intervention.
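If you want to measure TTFB yourself rather than rely on GWT's average, here is a rough sketch in Python (standard library only; this times a single request, which is noisier than Google's crawl-stats average):

```python
import http.client
import time

def time_to_first_byte(host: str, path: str = "/", port: int = 80) -> float:
    # Time from sending the request until the first response byte arrives.
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.monotonic()
    conn.request("GET", path)
    resp = conn.getresponse()   # blocks until the status line is received
    resp.read(1)                # force the first body byte off the wire
    elapsed = time.monotonic() - start
    conn.close()
    return elapsed
```

Note that the number includes TCP connection setup, so average several runs to smooth out network jitter.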
-
Hi there,
I think it would improve page load time if the YouTube video were the last thing to load.
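One common way to do that is a "facade" embed: show only the video thumbnail and inject the real player iframe on click. Here is a rough sketch of generating that markup in Python (the helper and inline styles are illustrative; the `i.ytimg.com` URL is YouTube's standard thumbnail pattern):

```python
def lazy_youtube_embed(video_id: str, width: int = 560, height: int = 315) -> str:
    # Emit a lightweight placeholder: just the thumbnail, plus a click
    # handler that injects the real player iframe only on demand.
    thumb = f"https://i.ytimg.com/vi/{video_id}/hqdefault.jpg"
    embed = f"https://www.youtube.com/embed/{video_id}?autoplay=1"
    return (
        f'<div style="width:{width}px;height:{height}px;cursor:pointer;'
        f'background:url({thumb}) center/cover" '
        f"onclick=\"this.innerHTML='<iframe width={width} height={height} "
        f"src={embed} frameborder=0 allowfullscreen></iframe>'\"></div>"
    )
```

This way the heavy player script never loads unless a visitor actually wants the video.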
Hope it helps you.
-
You are right! That is why I don't want to compromise on usability. Thanks for your response.
-
Give it some time! It should be OK. The main question with speed is whether your users are fine with it. Think of people before SEO and you'll be fine!
-
Thanks for your response, but the images are about as optimized as they can be. I use ImageOptim for Mac to optimize them; they are all JPEGs (stripped of all metadata) and enabled for (mildly) lossy conversion to WebP on supported browsers.
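For anyone curious, that per-browser WebP switching typically keys off the request's `Accept` header. A minimal sketch of the negotiation (the helper is hypothetical — in practice this is usually done in the CDN or web-server config):

```python
def pick_image_variant(accept_header: str, base_path: str) -> str:
    # Browsers that support WebP advertise it in the Accept header;
    # everyone else gets the JPEG fallback.
    if "image/webp" in accept_header:
        return base_path + ".webp"
    return base_path + ".jpg"
```
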
Do you feel there might be anything else that I could do?
-
I'm sure you could work on the optimization a bit more, especially the images.
Nonetheless, if you require the same structure and are unable to reduce the size, then I would not worry so much about it. Having a fast website is only one of the hundreds of different factors that affect SEO. Just work on the other factors and it will be fine!