Optimizing images and more for page speed
-
Hey everyone!
I run a comparison/affiliate site for men's clothing. Alongside that, I have a Squarespace site for inspiration, articles and outfit pictures. I've tried to optimize site speed for the Squarespace site without much success. I've run all pictures through JPEGmini to decrease file size, but it doesn't seem to be enough.
Below I've attached the results I got when I ran one of the pages through Lighthouse and GTmetrix. Do you have recommendations for what I can do to improve the results? For example, is it a good idea to use next-gen formats for pictures, as Google suggests?
Kind regards,
Jonas -
Hi,
I can't see the images you attached. If you can share the web address, maybe I can help you.
Regards
-
Using next-gen formats will help, as will resizing images to their maximum display size before you upload them. That said, I'm surprised you're getting these issues with Squarespace, as they use a CDN for media storage.
Related Questions
-
Wrong page ranking on SERP, above more relevant page
Often I will see the wrong page, something less relevant to a particular search, appear higher on the SERP than a more relevant page. Why does this happen, and how can it be remedied? I found this Moz article; has anything been written on this topic more recently? Thanks! https://moz.com/blog/wrong-page-ranking-in-the-results-6-common-causes-5-solutions
On-Page Optimization | NicheSocial -
What to do about pages I have deleted?
I have been working through the dead links on my site, recreating pages with new content where it still makes sense to have them. But a few of the dead links were just changes of title, spelling corrections, or different ways of saying the same thing. For example, I created a page called "areas of the UK we cover" but decided to change it to "areas covered". However, I must have created links to the old page, and now it is a dead link with a page authority of 19. I think it would be spammy to have two pages, one called "areas covered" and the other called "areas of the UK we cover". It's not a disallow in robots.txt, because the page does not exist. Please note I do not have access to the header to add code for a 301 redirect; I'm still using webs.com, though not for new sites. I also have a page called "singing telegrams london" that I changed from "singagrams london". These are two terms for the same thing, but they are two very different keywords. Would it be OK to recreate this page and create content for "singagrams london"? Help is much appreciated.
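For reference, on hosts that do expose server configuration (it sounds like webs.com doesn't), a 301 is typically a one-line rule. In an Apache .htaccess file it might look like this; the slugs below are hypothetical guesses, not the real URLs:

```apache
# Permanently redirect the old, renamed page to its replacement (hypothetical slugs)
Redirect 301 /areas-of-the-uk-we-cover /areas-covered
```

The same effect is usually available through a "URL redirects" panel on hosted site builders that don't allow direct config edits.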
On-Page Optimization | singingtelegramsuk -
Duplicate Page Content
Hey Moz Community, Newbie here. I'm on my second week of Moz and I love it, but I have a couple of questions regarding crawl errors:

1. I have a few pages flagged with duplicate content, but they show 0 duplicate URLs. How do I know what is duplicated in this instance?

2. I'm not sure if anyone here is familiar with an IDX for a real estate website, but I have one set up on my site, and it seems as though all the links it generates for different homes for sale show up as duplicate pages. For instance, http://www.handyrealtysa.com/idx/mls...tonio_tx_78258 is listed as having duplicate page content compared with 7 duplicate URLs:

http://www.handyrealtysa.com/idx/mls...tonio_tx_78247
http://www.handyrealtysa.com/idx/mls...tonio_tx_78253
http://www.handyrealtysa.com/idx/mls...tonio_tx_78245
http://www.handyrealtysa.com/idx/mls...tonio_tx_78261
http://www.handyrealtysa.com/idx/mls...tonio_tx_78258
http://www.handyrealtysa.com/idx/mls...tonio_tx_78260
http://www.handyrealtysa.com/idx/mls...tonio_tx_78260

I've attached a screenshot that shows 2 of the pages that state duplicate page content but have 0 duplicate URLs. You can also see a bit about the IDX duplicate pages. rel="canonical" is functioning on these pages, or so it seems when I view the page source. Any help is greatly appreciated. skitch.png
On-Page Optimization | HandyRealtySA -
Page Speed
Google recommends a page load speed of 1.4 seconds. Is it recommended to hit that speed for every page on the site, or just the landing pages? Is there a tool that will check the load speed of every page on a site and report the slow ones? The free online tools only check one page at a time.
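On the second question: below is a minimal sketch of a bulk checker using only the Python standard library; the URL list is whatever you feed it (e.g. pulled from your sitemap). Note it times the raw HTML response only, not a full browser render, so the numbers won't match Lighthouse or GTmetrix exactly, but it is good enough to surface outliers:

```python
import time
import urllib.request

def time_url(url, timeout=30):
    """Fetch one page and return (seconds_elapsed, bytes_downloaded)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.perf_counter() - start, len(body)

def report_slow_pages(urls, threshold=1.4):
    """Time every URL and print them slowest first, flagging those over threshold."""
    timings = sorted((time_url(u)[0], u) for u in urls)
    for elapsed, url in reversed(timings):
        flag = "SLOW" if elapsed > threshold else "ok  "
        print(f"{flag} {elapsed:6.2f}s  {url}")
```

For render-accurate numbers across a whole site, a crawler that drives a headless browser would be needed instead; this sketch only covers server response plus download time.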
On-Page Optimization | Bryan_Loconto -
Too many page links warning... but each link has canonical back to main page? Is my page OK?
The Moz crawl warns me that many of my pages have too many links. For example, this page http://www.webjobz.com/jobs/industry/Accounting has 269 links, but many of them look like /jobs/jobtitles/Accounting?k=&w=3&hiddenLocationID=463170&depth=2 and are used to refine search criteria. When you click on those links, they all have a canonical link back to http://www.webjobz.com/jobs/industry/Accounting. Is my page being punished for this? Do I have to put "nofollow" tags on every link I do not want the bots to follow, and if I do so, will Roger (the Moz bot) stop counting them as links?
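For what it's worth, the two mechanisms in question look like this in markup (a sketch based on the URLs in the post, not the site's actual templates). Note that nofollow is only a hint to crawlers, and a canonical on the filtered page is generally the stronger signal for faceted URLs like these:

```html
<!-- On the refinement link itself: ask bots not to pass equity through it -->
<a href="/jobs/jobtitles/Accounting?k=&amp;w=3&amp;hiddenLocationID=463170&amp;depth=2"
   rel="nofollow">Refine by job title</a>

<!-- In the <head> of the filtered page: point back to the main category page -->
<link rel="canonical" href="http://www.webjobz.com/jobs/industry/Accounting">
```

Whether Moz's crawler still counts nofollowed links toward the per-page total is a separate question from how Google treats them, so it's worth checking that in Moz's own crawl documentation.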
On-Page Optimization | Webjobz -
Optimal Copy Length
We have a separate page for every product of every bank, e.g. http://www.imoney.my/home-loan/citibank. The idea is to make each page rank for the product keyword + the name of the bank. Is the copy long enough for that to happen? We've written one paragraph of unique copy about the bank's product information. Shall we write more, or is it enough?
On-Page Optimization | imoney -
To Reduce (pages)... or not to Reduce?
Our site has a large Business Directory with millions of pages. For example's sake, let's say it's a directory of Restaurants. Each Restaurant has 4 pages on the site, tied together through a row of tabs across the top of the page:

Tab 1 - Basic "super 7" info: name, location, contact info
Tab 2 - Restaurant menu
Tab 3 - Restaurant reviews
Tab 4 - Photos of food

The Tab 1 page generates 95% of our traffic and 90% of conversions. The conversion rate on the Tab 2-4 pages is 6-10x greater than on Tab 1. Total conversions from search queries on menus, reviews and food are 20% higher than conversions resulting from searches on restaurant name and info alone.

We're working with a consultant on a redesign who wants to consolidate the 4 pages into one. Their advice is to focus on making a better page featuring all of the content, sacrifice a little organic traffic, and make up any losses by improving conversion. My counterpoint is that we shouldn't scrap the Tab 2-4 pages just because they have lower traffic; we should make those pages BETTER. The content we display is thin, and we have plenty of data we could expose to make the pages more robust. By consolidating, it will also be hard to optimize one page for people searching for name/location AND menu AND reviews AND photos. We're asking that one page to do too much, and it's likely we will see diminished search volume for queries on menu, reviews and food. I think the decline will be much more significant than the consultant estimates.

The consultant says there will be little change to organic traffic, since Tab 1 already generates 95% of it. Through basic math, they're saying the risk is a 5% decline in organic traffic. Further, they see little chance of queries for menu, reviews, and food declining, because most of those queries tend to send people to the home page or Tab 1 page anyway. Finally, the designer of the new wireframes admitted that potential organic traffic risks were not taken into consideration when they recommended consolidating the pages.

I sincerely appreciate your thoughts and consideration! Trisha
On-Page Optimization | lzhao -
On Page Optimization Reports
How is it determined which terms and associated URLs are chosen when SEOmoz tracks your On-Page Report Card? I'm receiving a lot of F grades for terms I'm not really interested in, and a lot of terms I'd like tracked aren't included. Is there a way I can manually choose which terms and pages I'd like to be shown?
On-Page Optimization | ClaytonKendall