Need help with image resizing (re: slow site)
-
I'm trying to figure out why my site is having speed issues, and I'm using Google's PageSpeed test to help me knock some of them out.
One of the issues deals with image resizing. I have a responsive design, so even though the blog area on the home page is normally 580px wide, a full post can go up to 1170px. So I size all of my images to 1170 wide and let CSS resize them down depending on the size of the browser. (The images on the most recent post are a little bigger than this because I was testing something.)
I was wondering what the best practice was in regard to what I'm trying to do.
Also, feel free to check out my site and let me know of any other feedback / advice you have. Thanks! :)
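A common approach to the multi-width problem described above is to export each image at more than one width and serve the smaller file when the layout only needs the 580px column. Here is a rough sketch of batch-generating those variants with Python's Pillow library; the widths simply mirror the 580/1170 layout mentioned above, and the blank canvas stands in for a real photo:

```python
from PIL import Image

# Example widths mirroring the layout above: 580px for the blog
# column on the home page, 1170px for full-width posts.
VARIANT_WIDTHS = (580, 1170)

def make_variants(image, widths=VARIANT_WIDTHS):
    """Return {width: resized copy}, preserving aspect ratio.

    Widths at or above the original are skipped, so images are
    never upscaled (upscaling adds bytes, not detail).
    """
    variants = {}
    for w in widths:
        if w >= image.width:
            continue
        h = round(image.height * w / image.width)
        variants[w] = image.resize((w, h), Image.LANCZOS)
    return variants

if __name__ == "__main__":
    # Stand-in for a real photo, like the 1900px-wide original
    # mentioned later in the thread.
    original = Image.new("RGB", (1900, 1200), "white")
    for width, img in sorted(make_variants(original).items()):
        print(width, img.size)
```

The resulting files can then be listed in an `img srcset` attribute so each browser downloads only the size it actually needs.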
-
Thanks a lot, Keri.
These days, with online competition being so strong, we should pay more attention to website architecture, usability, visual impact, speed, and technical problems. SEO is so complex that you'll find yourself overwhelmed by the number of critical issues that need to be addressed and fixed. Don't focus just on the content; try to enhance every aspect of your page. Optimizing pictures takes only a few moments, and you can use automated functions in Photoshop.
-
Another way to help with the images is to host them on a content delivery network (CDN).
Amazon is not the cheapest, but it's the easiest for low volume.
A few stats:
I host about 4000 images on Amazon S3.
My bill is about 4 bucks a month.
You can put your images in a few regions (west coast vs. east coast, etc.)....
This will help get your images closer to your audience, but it will not help you with the "last mile."
I had a customer in Wisconsin uploading 7 MB images over dial-up....
Can't help them...
I'm also moving to CloudFront, Amazon's content delivery network...
Also, you can use Chrome's developer tools to determine what's causing the delay.. many times, images are just part of a larger problem...
-
Hi Rick,
To the best of my knowledge, Smush.it compresses what it can while keeping the quality exactly the same. Saving for the web lowers the quality from "good enough to print and hang on your wall" to "looking good on screen." I also looked at the most recent post about Noah standing, and saw that the original was 1900 pixels wide -- you certainly want to resize that to 1170 wide before uploading it.
Being a photographer with a portfolio, Coltaire can give you a lot more details than I can, and help guide you with settings to use in Photoshop to get pictures that still look great on the web but aren't bigger than they need to be.
-
Thanks for the kind words. As I mentioned, sometimes I like to do full-width posts, which are 1170 wide, so if I use 800x600 the images won't show up correctly at full width.
-
Rick, you have a wonderful son, and the story on your website left me without words; I don't know if I can give you a good response at this moment... Try resizing them to 800x600; that size accommodates a lot of user screens and mobile traffic.
Have a wonderful day
-
Yes, I use caching. But like I said, saving at 640 wide wouldn't work for me, since I want the image to show up bigger than that if the screen is 1170. I'm assuming the images couldn't be resized any bigger than 640 without looking stretched.
-
I've never used that tool. I think it's OK in some situations, but you have a lot more control saving a file for web in Photoshop: a lot more options, and the quality loss is insignificant.
Take a look at my Portfolio page. All of my files are 640x480 px / 72 dpi / 50-60 quality JPEGs.
Also, are you using any caching / minifying plugins?
-
I'm using Smush.it to make the file sizes smaller, but I need at least 1170 for full-width posts (like this one). I don't think I need to use Save for Web if I'm using Smush.it, do I?
Does having CSS resize the images cause a site to slow down a lot?
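One way to see why CSS resizing doesn't help the download itself: the browser fetches every byte of the file regardless of the size it is displayed at, so only resizing before upload reduces the transfer. A small illustration with Python's Pillow library (the 1900/1170 dimensions are just the examples from this thread, and a solid-color canvas stands in for a real photo):

```python
from io import BytesIO
from PIL import Image

def jpeg_size(image, quality=60):
    """Encode the image as JPEG in memory and return the byte count."""
    buf = BytesIO()
    image.save(buf, format="JPEG", quality=quality)
    return len(buf.getvalue())

# A solid-color canvas stands in for a 1900px-wide photo.
full = Image.new("RGB", (1900, 1200), "teal")
resized = full.resize((1170, 739), Image.LANCZOS)

# CSS scaling happens *after* the download, so the browser still
# fetches every byte of the big file; resizing before upload is
# what actually shrinks the transfer.
print("1900 wide:", jpeg_size(full), "bytes")
print("1170 wide:", jpeg_size(resized), "bytes")
```

The CSS scaling itself is cheap for the browser; the cost is the extra bytes on the wire.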
-
I think that your images are very big, which is slowing down your page speed and affecting your rankings. Why don't you try scaling them down and reducing the quality using the "Save for Web" feature in Photoshop? It's fast, and you have the option to compare with the original file when saving. 800x600 or 640x480 px is large enough to be viewed properly; think about the different screen resolutions your visitors have. I avoid using pictures larger than 100 KB, and my average picture quality when saved for web is 60%. Hope it helps.
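Outside of Photoshop, that same "Save for Web" recipe (scale down, then re-save as a JPEG at around quality 60) can be scripted. Here is a rough sketch using Python's Pillow library; the 800px width, quality 60, and 100 KB budget simply mirror the numbers suggested above, not hard rules:

```python
from io import BytesIO
from PIL import Image

MAX_WIDTH = 800           # the 800x600 suggestion above
JPEG_QUALITY = 60         # roughly "60%" save-for-web quality
SIZE_BUDGET = 100 * 1024  # aim for pictures under ~100 KB

def save_for_web(image, max_width=MAX_WIDTH, quality=JPEG_QUALITY):
    """Scale down to max_width (if larger) and return JPEG bytes."""
    if image.width > max_width:
        h = round(image.height * max_width / image.width)
        image = image.resize((max_width, h), Image.LANCZOS)
    buf = BytesIO()
    image.convert("RGB").save(buf, format="JPEG",
                              quality=quality, optimize=True)
    return buf.getvalue()

if __name__ == "__main__":
    photo = Image.new("RGB", (1900, 1200), "navy")  # stand-in photo
    data = save_for_web(photo)
    print(len(data), "bytes; within budget:", len(data) <= SIZE_BUDGET)
```

Checking the returned byte count against the budget before uploading catches oversized pictures automatically.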