My Website Page Speed is not increasing
-
Hey experts,
My website's page speed is not increasing. I used the WP Rocket plugin, but I am still seeing the errors "Reduce unused CSS", "Properly size images", and "Avoid serving legacy JavaScript to modern browsers"; you can see them in the attached image.
I have used many plugins for speed optimization but am still facing these errors. I also optimized the images manually in Photoshop, but the image-size issue persists.
Since the Google Core Web Vitals update, my website's keyword positions have dropped because of the slow speed. Please guide me on how I can increase the page speed of my website: https://karmanwalayfabrics.pk
Thanks
-
A variety of factors can contribute to a website's slow page speed, and addressing them requires a systematic approach. Here are some common reasons why your website's page speed might not be increasing:
- Large and Unoptimized Images:
High-resolution images and graphics can significantly slow down page load times. Make sure your images are properly resized, compressed, and served in the appropriate format (JPEG, PNG, WebP).
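As a sketch of the format advice above, WebP can be served with a JPEG fallback via the picture element (file names and dimensions here are placeholders):

```html
<!-- Serve WebP where supported, with a JPEG fallback for older browsers -->
<picture>
  <source srcset="product-photo.webp" type="image/webp">
  <img src="product-photo.jpg" alt="Product photo" width="800" height="600">
</picture>
```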
- Too Many HTTP Requests:
Each element on a web page, such as images, scripts, and stylesheets, requires a separate HTTP request. Limit the number of elements and use techniques like image sprites, CSS and JavaScript minification, and combining files where possible.
- Unoptimized Code:
Bloated or inefficient HTML, CSS, and JavaScript code can increase load times. Optimize your code by removing unnecessary characters, white spaces, and comments, and consider using asynchronous loading for JavaScript.
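For example, scripts that aren't needed for the first paint can be loaded with the defer or async attributes (the file names here are illustrative):

```html
<!-- defer: download in parallel, execute in order after the HTML is parsed -->
<script defer src="main.js"></script>
<!-- async: download in parallel, execute as soon as it arrives (order not guaranteed) -->
<script async src="analytics.js"></script>
```

defer is usually the safer default, since scripts still run in document order.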
- Server Performance:
Slow server response times can significantly impact page speed. Choose a reliable web hosting provider with good server performance, and consider using Content Delivery Networks (CDNs) to distribute content across multiple servers.
- Lack of Browser Caching:
Enable browser caching to allow returning visitors to load your site faster by storing certain elements locally on their devices.
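If your server runs Apache, browser caching can be enabled with a small .htaccess fragment like this sketch (it assumes the mod_expires module is available; the lifetimes are examples you should adjust to how often your assets change):

```apache
# Cache static assets in the visitor's browser (example lifetimes)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/webp "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```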
- Render-Blocking Resources:
JavaScript and CSS files that block the rendering of the page can lead to slower load times. Minimize the use of render-blocking resources and use techniques like asynchronous and deferred loading.
- Redirects and Broken Links:
Excessive redirects and broken links can increase load times and frustrate users. Minimize redirects and regularly check for broken links.
- External Embedded Media:
Embedded media from external sources (videos, social media widgets, etc.) can slow down your site if not optimized properly. Use lazy loading for media and ensure external sources are not causing delays.
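As an illustration, modern browsers support native lazy loading for both images and iframes (the sources below are placeholders):

```html
<!-- Native lazy loading: the browser defers fetching until the element nears the viewport -->
<img src="gallery-item.jpg" alt="Gallery item" loading="lazy" width="600" height="400">
<iframe src="https://www.youtube.com/embed/VIDEO_ID" loading="lazy" title="Product video"></iframe>
```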
- Database and Plugin Overload:
Excessive database queries and numerous plugins can slow down your website. Optimize your database, use efficient plugins, and eliminate those that are not essential.
- Mobile Responsiveness:
A lack of mobile responsiveness can lead to slow loading times on mobile devices. Ensure your website is fully responsive and optimized for various screen sizes.
- Unoptimized Third-Party Scripts:
Third-party scripts, such as analytics trackers and social media plugins, can impact performance. Evaluate the necessity of these scripts and their impact on load times.
- Too Many Ads:
Excessive ads or poorly optimized ad code can slow down your website. Ensure that ads are properly managed and optimized for performance.
To address these issues, you may need to conduct a thorough website audit, use tools like Google PageSpeed Insights, GTmetrix, or Pingdom to identify specific problems, and then implement the necessary optimizations. Remember that improving page speed is an ongoing process, and regularly monitoring and maintaining your website's performance is crucial for a fast and user-friendly experience.
-
Improving your website's speed, especially in the context of Google's Core Web Vitals, can sometimes require a more detailed approach beyond just using plugins. Here's a guide to address the specific issues you mentioned:
Reduce Unused CSS:
- Manual Cleaning: Sometimes plugins or themes may add unnecessary CSS. You can manually review your CSS files to remove any unused styles.
- Use Tools: There are tools like PurgeCSS that can help to remove unused CSS.
- Minify CSS: If not already done, ensure that your CSS is minified. WP Rocket should handle this, but you can double-check.
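As a rough sketch of running PurgeCSS from the command line (the paths and globs are placeholders for a typical WordPress theme; this assumes Node.js is available so the package can be installed first):

```shell
# Install PurgeCSS, then scan your templates for selectors actually in use
npm install --save-dev purgecss
npx purgecss --css wp-content/themes/your-theme/style.css \
  --content "wp-content/themes/your-theme/**/*.php" \
  --output build/css/
```

Always diff the output against the original stylesheet before deploying, since styles added dynamically by JavaScript can be stripped by mistake.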
Properly Size Images:
- Responsive Images: Make sure you're using the srcset attribute on img tags. WordPress generally does this automatically for content added via the block editor.
- Serve Next-gen Formats: Convert images to WebP format. There are plugins like ShortPixel or Imagify that can do this for you.
- Adaptive Images: Use a solution to serve different image sizes based on the visitor's device.
- Critical Images: Only load above-the-fold images initially. Lazy load the rest as the user scrolls.
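To illustrate the responsive-image advice above, a minimal srcset/sizes sketch (file names and widths are made up for the example):

```html
<!-- The browser picks the smallest file that satisfies the displayed size -->
<img src="fabric-800.jpg"
     srcset="fabric-400.jpg 400w, fabric-800.jpg 800w, fabric-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Fabric sample" width="800" height="600">
```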
Avoid Serving Legacy JavaScript to Modern Browsers:
- Use Babel: If you're developing custom themes or plugins, use a tool like Babel to transpile your JavaScript for older browsers, and use the type="module"/nomodule pattern so modern browsers receive the modern build while legacy browsers get the transpiled one.
- Check Plugins and Themes: It's possible one of your plugins or your theme is including legacy JS. It may be worth reaching out to the developers for an update.
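The module/nomodule pattern mentioned above looks roughly like this (file names are placeholders; a build tool such as Babel would generate the two bundles):

```html
<!-- Modern browsers load the untranspiled ES module and skip the nomodule script -->
<script type="module" src="app.modern.js"></script>
<!-- Legacy browsers ignore type="module" and load the transpiled bundle instead -->
<script nomodule src="app.legacy.js"></script>
```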
You can find more tips at this link: https://onilab.com/blog/magento-2-performance-speed-optimization-guide
-
Hello,
If your website's page speed is not improving, you can use a tool such as WebsiteSpeedy to help identify performance issues and optimize your site. Here are some steps you can take:
Run a speed test: Visit websitespeedy.com and enter your website's URL. The tool will analyze your website's speed and provide a report with recommendations for improvement.
Optimize images: Use an image compression tool like TinyPNG or Smush to compress your images and reduce their file size without compromising their quality. Additionally, ensure that the images are in the correct format (JPG, PNG, GIF) and are optimized for the web.
Preload CSS and JavaScript: Preloading critical CSS and JavaScript files tells the browser to fetch them early, which reduces the time it takes for your pages to become usable.
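For example, critical resources can be hinted to the browser with preload tags in the head (the file names are illustrative):

```html
<!-- Fetch critical assets early, before the parser discovers them -->
<link rel="preload" href="critical.css" as="style">
<link rel="preload" href="main.js" as="script">
<link rel="preload" href="heading-font.woff2" as="font" type="font/woff2" crossorigin>
```

Preload sparingly; hinting too many files competes with the resources the page actually needs first.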
Minimize render-blocking resources: Optimize your website's CSS and JavaScript files to reduce the time it takes to render the page.
Use lazy loading: Lazy loading reduces the bandwidth needed to load a page by fetching content only when it is needed, which can improve your PageSpeed score by shortening the time it takes for visible content to reach users.
By implementing these optimizations, you should be able to improve the speed and performance of your website.
-
@frazashfaq11 it also seems that you are suffering from slow server initial response time. I would suggest looking at that as a priority too, this can often counteract any work you are doing to optimise the speed of your site.
What hosting are you on? Is it shared hosting or a VPS?
-
Hi! I am an SEO specialist at MjSeo. To solve your problem, try the following:
- Minimise HTTP requests. Reduce and merge files. Once you know how many requests your site makes, you can start reducing that number; the best place to start is with your HTML, CSS, and JavaScript files.
- Use asynchronous loading for CSS and JavaScript files.
- Defer the loading of JavaScript files. Deferring a file prevents it from loading until other elements have loaded, so deferring large files such as JavaScript ensures that the rest of your content can load without delay.
- Minimise the time to first byte. In addition to how long your page takes to fully load, look at how long it takes to start loading.
- Reduce server response time. One of the most important factors affecting your page's loading speed is the time a DNS lookup takes. DNS, or Domain Name System, is a server with a database of IP addresses and their associated hostnames; when a user enters a URL into their browser, a DNS server translates that URL into an IP address pointing to its location on the network. A DNS lookup is thus the process of finding a particular DNS record, much like your computer looking up a number in a phone book.
You also can read here to find more useful information.
-
@frazashfaq11 Hi! I think the Lighthouse output is telling you that, while you may have resized the images correctly in Photoshop, the width and height attributes aren't set on your image tags in the HTML. As a result, the browser can't reserve space for each image upfront and has to wait for the image to load.
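In practice that means giving each img tag explicit dimensions so the browser can reserve the layout space before the file arrives (the values here are illustrative):

```html
<!-- width/height let the browser compute the aspect ratio and reserve space,
     which also prevents layout shift (CLS) as images load -->
<img src="hero-banner.jpg" alt="Hero banner" width="1200" height="500">
```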