Leverage browser caching
-
Does anyone know a good tutorial on how to implement leverage browser caching? Do I need something like Cloudflare, or can I add meta tags to do this?
-
Spot on, that worked a dream... went from 82 to 92 on the Google PageSpeed test.
-
That's great, I'll check this out today.
-
You would actually add it to your .htaccess file if you are using a Linux (Apache) server. This should help: http://www.feedthebot.com/pagespeed/leverage-browser-caching.html
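For reference, a minimal sketch of what such .htaccess rules might look like, assuming an Apache server with mod_expires enabled (the file types and max-age values below are illustrative, not a recommendation):

# Leverage browser caching via far-future Expires headers (requires mod_expires).
<IfModule mod_expires.c>
  ExpiresActive On
  # Sensible default for anything not listed below.
  ExpiresDefault "access plus 1 week"
  # Images rarely change, so they can be cached longer.
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/gif "access plus 1 month"
  # Stylesheets and scripts.
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
  # Keep HTML fresh so content updates show up immediately.
  ExpiresByType text/html "access plus 0 seconds"
</IfModule>

Cloudflare is not required for this; the PageSpeed "leverage browser caching" check looks at the Expires/Cache-Control headers your server sends, which rules like the above control for assets served from your own domain.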
Related Questions
-
How to handle images (lazy loading, compressing, caching...) to impact page load and thus SEO?
Hi all, I am looking for a conclusive answer on how to handle images on WordPress websites. Most of the time we encounter the same problems regarding images. There are several options to make sure that images don't increase page load too much:
- Page caching and compressing: standard.
- Lazy loading: helps decrease page load time, but Google might not crawl the images, so not good for SEO. See this article on Googlebot scrolling.
- Correct image format (for example WebP): tried it several times and it doesn't help much to decrease page load time.
What is best practice? Are there standards or preferred options for the image dimensions and quality (max height, width, number of pixels, rectangular or square) before you upload, also regarding responsiveness? Is it better to use .jpg, .png or WebP? To sum up, what should you do by default to handle images on websites so you can still have good page speed even with loads of images? Thanks for your answers!
Intermediate & Advanced SEO | Mat_C
-
UK version of site showing US Cache and meta description
Hi fellow Moz'ers, We seem to have an issue where some of our UK site is showing meta descriptions from our US site in the SERPs, and when you check the cache: of the site it's bringing up the .com instead of the .co.uk site. Example: cache:https://www.tinyme.co.uk/name-labels shows the US site. We've checked the hreflang tags and they look OK to me (but I'm not an expert):
<link rel="alternate" href="https://www.tinyme.co.uk/name-labels" hreflang="en-gb"/>
<link rel="alternate" href="https://www.tinyme.com/name-labels" hreflang="en-us"/>
<link rel="alternate" href="https://www.tinyme.com.au/name-labels" hreflang="x-default"/>
<link rel="alternate" href="https://www.tinyme.com.au/name-labels" hreflang="en-au"/>
We've had a search around and seen people have similar issues, but can't seem to find a definitive solution.
Intermediate & Advanced SEO | tinyme
-
Why do different browsers return different search results?
Hi everyone, I don't understand why, if I delete cookies and history and browse in private/incognito mode in Chrome and Safari, I get different results on Google. I tried it from the same PC and at the same time. Searching Google for the query "vangogh", the site "www.vangogh-creative.it" is shown on the first page in Chrome but not in Safari. I asked in the Google Webmaster forum, but nobody seems to know the reason for this behavior. Can anyone help me? Thanks in advance. Massimiliano
Intermediate & Advanced SEO | vanGoGh-creative
-
Should you increase the caching level in Cloudflare to speed up load times?
From the Cloudflare settings page: "Caching Level – Determine how much of your website's static content you want CloudFlare to cache. Increased caching can speed up page load time." The option in question is "Ignore the query string of static content". Site: http://www.southernwhitewater.com
Intermediate & Advanced SEO | VelocityWebsites
-
Why are some pages indexed but not cached by Google?
The question is simple, but I don't understand the answer. I found a webpage that was linking to my personal site. The page was indexed in Google. However, there was no cache option, and I received a 404 from Google when I tried using cache:www.thewebpage.com/link/. What exactly does this mean? Also, does it have any negative implications for the SEO value of the link that points to my personal website?
Intermediate & Advanced SEO | mRELEVANCE
-
How to leverage browser caching for a specific file
Hello all,
I am trying to figure out how to add leverage browser caching to these items:
http://maps.googleapis.com/maps/api/js?v=3.exp&sensor=false&language=en
http://ajax.googleapis.com/ajax/libs/webfont/1/webfont.js
http://www.google-analytics.com/analytics.js
What's hard is I understand the purpose, but unlike a CSS file, how do you specify an expiration on an actual direct path file? Any help or a link to get help is appreciated. Chris
Intermediate & Advanced SEO | asbchris
-
Leveraging interest on a popular blog post, with a new, expanded page on the subject...
On a WordPress site, I have one blog post that performs extremely well for AdSense revenue. But the post is getting older and older, and requires me to place some updates into the article from time to time. It's a blog post, but really feels more like a reference-type page (it's about stocks in a particular industry). Now that I see so many people landing on this page through search (#1 for the term), I'm thinking I should really develop this information further, make a reference page out of it, keep it updated, and link to it from the nav menu. However, I don't know if it will be bad to have both the reference page and the old post trying to rank for the same keyword term. (They won't be duplicate content; the new page will just be the same topic rewritten and expanded.) Is that something I can get penalized for? I'm getting very good income off of this existing blog post and don't want to mess it up, but I also know that keeping this info only on a post that's getting older and older is not a good long-term plan, and I need to pounce on the interest in the subject matter. So, I see these options:
1. Create the new expanded page, and let Google sort it out in the SERPs.
2. Create the new page and redirect the old blog post to the new page. That just doesn't seem right to remove access to my old blog post, though.
Which of these is the right thing to do, or is there some way I'm not thinking of?
Intermediate & Advanced SEO | bizzer
-
Have you ever seen this 404 error: 'www.mysite.com/Cached' in GWT?
Google Webmaster Tools just started showing some strange pages under "not found" crawl errors:
www.mysite.com/Cached
www.mysite.com/item-na... <--- with the three dots, INSTEAD of www.mysite.com/item-name/
I have just 301'd them for now, but is this a sign of a technical issue? The site is PHP/SQL and I'm doing the URL rewrites/301s etc. in .htaccess. Thanks! -Dan
EDIT: Also, wanted to add, there is no 'linked to' page.
Intermediate & Advanced SEO | evolvingSEO
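For illustration only, a minimal sketch of what such 301s might look like in .htaccess, assuming mod_alias is available (the paths are the placeholder examples from the post above, with www.mysite.com itself a stand-in domain):

# Illustrative 301 rules for the stray URLs reported in GWT (requires mod_alias).
# Send the bogus /Cached URL somewhere sensible, e.g. the homepage.
Redirect 301 /Cached /
# Redirect the truncated URL (reported with literal trailing dots) to the intended page.
RedirectMatch 301 ^/item-na\.\.\.$ /item-name/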