Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
High resolution (retina) images vs load time
-
I have an ecommerce website and have a product slider with 3 images.
Currently, I serve them at the native size when viewed on a desktop browser (374x374).
I would like to serve them at retina quality instead (748x748).
However, how will this affect my ranking, given the extra load time?
Does Google take image load times into account even though the images load asynchronously? Also, as it's a slider, only the first image needs to load straight away. Do the other images contribute at all to the page load time?
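For context, this is roughly what I mean - a hypothetical sketch (the .slider selector and filenames are made up; in practice the srcset attribute would just go straight into the HTML) of serving the 748px file only to high-density screens:

```typescript
// Hypothetical example: offer the browser both the 374px and 748px files
// and let it pick based on screen density, so non-retina visitors never
// download the larger image. Selector and paths are illustrative only.
document.querySelectorAll<HTMLImageElement>(".slider img").forEach((img) => {
  img.srcset = "/images/product-374.jpg 1x, /images/product-748.jpg 2x";
  img.src = "/images/product-374.jpg"; // fallback for browsers without srcset support
  img.width = 374;                     // rendered size stays 374x374
  img.height = 374;
});
```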
-
"Large pictures tend to be bad for user experience."
I disagree. I think what you mean is that slower loading is bad for the user experience. Higher quality pictures are better for the user experience.
I've been looking into deferring the loading of the additional slider images. That should definitely improve load time, as all the bandwidth can be used to download the first slider image.
Also, if you use a progressive format for the first slider image, it should show something quickly and then sharpen as the rest of the data arrives.
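Roughly what I have in mind for the deferred images - a sketch that assumes (purely for illustration) that the non-first slider images carry a data-src attribute instead of src, and only get their real file once the slider is close to the viewport:

```typescript
// Sketch: lazy-load every slider image that has a data-src attribute.
// The first image keeps a normal src, so it loads immediately and gets
// all of the initial bandwidth.
const deferred = document.querySelectorAll<HTMLImageElement>(".slider img[data-src]");

const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src!;      // swap in the real (retina) file
    img.removeAttribute("data-src");
    obs.unobserve(img);              // each image only needs to load once
  }
}, { rootMargin: "200px" });         // begin loading slightly before it scrolls into view

deferred.forEach((img) => observer.observe(img));
```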
-
You also have to keep in mind that users will access your site from mobile devices, and the larger the page, the longer it takes to load fully. You may lose some people during the time it takes to load the page. My website used to have a slider with three images. I removed the slider and replaced it with one static image. Large pictures tend to be bad for user experience.
-
Hey Dwayne
They are big images, but from experience I have never seen a meaningful impact from this kind of change (in around 15 years). Maybe work on optimising the images themselves to bring the overall size down as much as possible. Sure, if your site is a slow-loading nightmare and this is just the final straw, then it may be an issue, but by the sounds of it you are already taking that into consideration and your site is well hosted and performs better than most of what else is out there.
But, as ever in this game, my advice would be to be aware of the possible implications, weigh up the pros and cons, and then test extensively. If you see an impact on your loading time and search results (and more importantly on user interaction, bounce rate, etc.) after changing this one factor, then you know you can roll it back.
Hope that helps
Marcus
-
Hi,
It's not that small a change... the size of each image will quadruple from around 10KB to 40KB. As there are three images, that's 90KB more data, which is around 20% of the total page size.
That's interesting, what you mention about time to first byte. I would have thought that was overly simplistic, and would have assumed Google is more concerned with how long the page takes "to load" (e.g. using their PageSpeed metrics).
I've optimized my site extensively, have a PageSpeed score of 95, and host on Amazon AWS.
I agree with your point about doing what's right for my users. But if Google includes image load time, then my site will rank poorly and I won't have any users!
In summary, I think this question really comes down to: how does Google calculate page load time, does that include image load time, and does it include all images (even ones which aren't currently rendered in the slider)?
Thanks,
Dwayne
-
Hey
I think this is such a small issue overall that you should not worry about a slight increase in image sizes damaging your SEO (assuming everything else is in place).
I would ask myself these questions:
- Is this better for my site users?
- Does this seriously impact load times (and therefore usability / user experience)?
If you believe it creates a better experience and does not impact loading times in a meaningful way, then go for it and don't worry - any effect on loading times is likely negligible.
A few things I would do:
- Test average loading times with a tool like Pingdom: http://tools.pingdom.com/fpt/
- Replace your images and test again
- Look at other areas where you can speed up loading times
- Make sure your hosting does not suck
For reference, there was a post here a while back on the whole loading times / SEO angle which determined that it was time to first byte (server response time), rather than total loading time, that had the impact - this would make total loading time academic from a pure SEO perspective, but... it's really not about SEO. It's about your site users and whether this change makes things better (improved images) or worse (slower loading) for them.
Seriously - don't worry too much about this small change from an SEO perspective. Use it as an excuse to improve loading time, as that is a good exercise for lots of reasons, but go with what is right for your users.
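If you want to sanity-check the time to first byte vs. total load time difference on your own pages, here is a rough sketch (assuming a modern browser; run it from the dev-tools console or drop it in a script tag) using the Navigation Timing API:

```typescript
// Rough sketch: compare time to first byte (server response) with the
// total page load time (which includes the slider images).
window.addEventListener("load", () => {
  // Defer one tick so loadEventEnd has been populated.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType(
      "navigation"
    ) as PerformanceNavigationTiming[];
    if (!nav) return;

    const ttfb = nav.responseStart - nav.requestStart; // time to first byte
    const total = nav.loadEventEnd - nav.startTime;    // full load, images included
    console.log(`TTFB: ${ttfb.toFixed(0)} ms, total load: ${total.toFixed(0)} ms`);
  }, 0);
});
```

Run it before and after swapping in the larger images and you can see exactly how much (or how little) the change actually costs.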
Hope that helps
Marcus
Refs:
http://moz.com/blog/how-website-speed-actually-impacts-search-ranking
http://moz.com/blog/improving-search-rank-by-optimizing-your-time-to-first-byte