How to leverage browser caching for a specific file
-
Hello all,
I am trying to figure out how to apply "leverage browser caching" to these items:
- http://maps.googleapis.com/maps/api/js?v=3.exp&sensor=false&language=en
- http://ajax.googleapis.com/ajax/libs/webfont/1/webfont.js
- http://www.google-analytics.com/analytics.js
What's hard is that I understand the purpose, but unlike with a CSS file on my own server, how do you specify an expiration on a file referenced by a direct external path?
Any help or link to get help is appreciated.
Chris
-
Well, I guess it is what it is.
Thank you so much for your insight.
-
Unfortunately, you can't specify browser caching for 3rd-party content, Chris. Those cache specs would have to be implemented by the site the content is actually hosted on.
The only way around this would be to store the 3rd party scripts on your own site and reference them locally. Then they would fall under the caching directives your site had set. This is usually a bad idea though, as it means you'll have no way of knowing when those scripts might have been updated.
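If you do go the self-hosted route, the expiration itself is set in your own server config. A minimal sketch for Apache's mod_expires (assuming the module is enabled and the scripts are served as application/javascript):
<IfModule mod_expires.c>
  ExpiresActive On
  # Cache locally hosted JavaScript for a week; tune this window to how
  # often you plan to re-sync your copies of the 3rd-party scripts
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>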
This is one of those areas where site speed best practices simply can't be applied to 3rd-party content. And if it's just a few such resources, it's not going to have much effect on page speed anyway. Where possible, try to load these scripts asynchronously (like analytics.js) or at least place them in the footer of the site if you can.
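For analytics.js in particular, the async attribute does the heavy lifting. A sketch of the commonly documented pattern (UA-XXXXX-Y is a placeholder tracking ID):
<script>
  // Queue commands so they run once analytics.js has loaded
  window.ga = window.ga || function () { (ga.q = ga.q || []).push(arguments); };
  ga('create', 'UA-XXXXX-Y', 'auto');
  ga('send', 'pageview');
</script>
<script async src="https://www.google-analytics.com/analytics.js"></script>
The async attribute lets the browser keep parsing the page while the script downloads, so the tracker stops blocking render.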
Hope that helps?
Paul
Related Questions
-
Large robots.txt file
We're looking at potentially creating a robots.txt with 1,450 lines in it. This will remove 100k+ pages from the crawl that are all old pages (I know, the ideal would be to delete/noindex them, but that's unfortunately not viable). Now the issue I'm thinking of is that a large robots.txt will either stop the robots.txt from being followed or will slow our crawl rate down. Does anybody have any experience with a robots.txt of that size?
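For scale, Google's documented robots.txt size limit (500 KiB) is far beyond 1,450 lines, so length alone is unlikely to be the problem. Wildcard patterns can also collapse many per-URL lines into a few rules; a hypothetical sketch (the paths are invented):
User-agent: *
# One pattern per retired section instead of one line per page
Disallow: /old-catalog/
Disallow: /archive/*/print/
Disallow: /*?sessionid=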
Intermediate & Advanced SEO | | ThomasHarvey0 -
Recovering old disallow file?
Hi guys, We had an SEO agency do a disallow request on one of our sites a while back. They have no trace of the disallow .txt file and all the links they disallowed. Does anyone know if there is a way to recover this file in Google Webmaster Tools, or any way to find which links were disallowed? Cheers.
Intermediate & Advanced SEO | | jayoliverwright0 -
Htaccess rewrite rule (very specific)
Hello, A while back my company changed from http: to https: sitewide (before I started working here). We use a very standard rewrite rule that looks like this:
RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://opiates.com/$1 [R,L]
However, with this rule in place, some http: URLs are being redirected with a 302 status code. My question is, can I safely change the above code to look like this:
RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://opiates.com/$1 [R=301,L]
to ensure that every redirect is returned with a 301 status code? The only change is in the [R,L] section. Thanks to whomever can help with this. I'm pretty sure it's safe, but I don't want the site to go down, even for a second, so I figured I would ask first.
Intermediate & Advanced SEO | | Waismann0 -
Is the TTFB for different locations and browsers irrelevant if you are self-hosting?
Please forgive my ignorance on this subject. I have little to no experience with the technical aspects of setting up and running a server. Here is the scenario: We are self-hosted on an Apache server. I have been on the warpath to improve page load speed since the beginning of the year. I have been on this warpath not so much for SEO, but for conversion rate optimization. I recently read the Moz post "How Website Speed Actually Impacts Search Rankings" and was fascinated by the research regarding TTFB. I forwarded the post to my CEO, who promptly sent me back a contradictory post from Cloudflare on the same topic. Ilya Grigorik published a post on Google+ that called Cloudflare's experiment "silly" and said that "TTFB absolutely does matter." I proceeded to begin gathering information on our site's TTFB using data provided by http://webpagetest.org. I documented TTFB for every location and browser in an effort to show that we needed to improve. When I presented this info to my CEO (I am in-house) and IT Director, they both shook their heads, completely dismissed the data, and said it was irrelevant because it was measuring something we couldn't control. Ignorant as I am, it seems that Ilya Grigorik, Google's own web dev advocate, says it absolutely is something that can be controlled, or at least optimized if you know what you are doing. Can any of you super smart Mozzers help me put the words together to express that TTFB from different locations and for different browsers is something worth paying attention to? Or, perhaps they are right, and it's information I should ignore? Thanks in advance for any and all suggestions! Dana
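One way to make the metric concrete: TTFB can be spot-checked from any machine with curl (https://www.example.com/ is a placeholder URL):
# time_starttransfer = seconds from the start of the request until the
# first byte of the response arrives, i.e. a rough TTFB
curl -s -o /dev/null -w "TTFB: %{time_starttransfer}s\n" https://www.example.com/
Running the same command from servers in different regions is essentially what webpagetest.org automates, which is why the per-location numbers differ.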
Intermediate & Advanced SEO | | danatanseo0 -
Implementation of AJAX Crawling Specifications
My URL is: http://www.redfin.com/TX/Austin/8413-Navidad-Dr-78735/home/31224372 We're using Google's AJAX crawling system, per the documentation here: https://developers.google.com/webmasters/ajax-crawling/ The example page above requires JavaScript to display content; it includes a meta fragment tag in the source. We have a lot of pages like this on our site. We expect Google to query us at this URL: http://www.redfin.com/TX/Austin/8413-Navidad-Dr-78735/home/31224372?_escaped_fragment_= This page renders correctly with JavaScript disabled. Are we doing this correctly? There are some small differences between the _escaped_fragment_ HTML snapshot and the JavaScript-generated content. Will this cause any problems for us? We ask because there was a period of about two months (from October 4th to Dec 29th) during which Google's crawler radically decreased the hits to our _escaped_fragment_ URLs; it's maybe recovering now, but maybe it isn't, and I wanted to be absolutely sure we're doing this correctly.
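For reference, the opt-in that the (since-deprecated) scheme describes for pages without #! URLs looks like this; the crawl URL is simply the page URL with the parameter appended:
<!-- In the head of the JavaScript-rendered page -->
<meta name="fragment" content="!">
<!-- Googlebot then requests the HTML snapshot from: -->
<!-- http://www.redfin.com/TX/Austin/8413-Navidad-Dr-78735/home/31224372?_escaped_fragment_= -->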
Intermediate & Advanced SEO | | RyanOD0 -
.htaccess files
I am working with a client's website which has multiple htaccess files (.htaccess, .htaccess.holding, and .htaccess.live, all in the same directory). My question is, how does a server process these files? All 3 files? Currently the domain has a 301 redirect showing for the home page to the mobile site (which is a problem) in one of the files (.htaccess but not the others). Has anyone come across this before with regard to SEO problems?
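For what it's worth, a stock Apache setup only processes the file whose name matches its AccessFileName directive, so the suffixed copies should be inert backups:
# Apache default; only a file literally named ".htaccess" is consulted,
# so .htaccess.holding and .htaccess.live are ignored by the server
AccessFileName .htaccess
If the unwanted 301 is live, it is most likely coming from the active .htaccess itself or from the main server config.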
Intermediate & Advanced SEO | | OnlineAssetPartners0 -
Are htm files stronger than aspx files?
Hello All, I once read that .htm files are considered stronger (SEO-wise) than .aspx files and I wondered if that is correct. Obviously, I mean for the static parts of the site; for example, making my About Us page in .htm and not .aspx. Among the advantages of .aspx is the usage of a master page (a template) for the design, etc. Any thoughts? Thanks
Intermediate & Advanced SEO | | BeytzNet0 -
What is better for SEO - a local video file or a YouTube video?
Should I use a video player and upload the videos to my website, or should I put my videos on YouTube and use the YouTube player?
Intermediate & Advanced SEO | | Naghirniac0