How to leverage browser caching for a specific file
-
Hello all,
I am trying to figure out how to add leverage browser caching for these items:
- http://maps.googleapis.com/maps/api/js?v=3.exp&sensor=false&language=en
- http://ajax.googleapis.com/ajax/libs/webfont/1/webfont.js
- http://www.google-analytics.com/analytics.js
What's hard is that I understand the purpose, but unlike a CSS file I host myself, how do you specify an expiration on a file served from a direct external path?
Any help or link to get help is appreciated.
Chris
-
Well, I guess it is what it is.
Thank you so much for your insight.
-
Unfortunately, you can't specify browser caching for 3rd party content, Chris. Those cache specs would have to be implemented by the site the content is actually hosted on.
The only way around this would be to store the 3rd party scripts on your own site and reference them locally. Then they would fall under the caching directives your site had set. This is usually a bad idea though, as it means you'll have no way of knowing when those scripts might have been updated.
This is one of those areas where site speed best practices simply can't be applied to 3rd party content. And if it's just a few such resources, it's not going to have much effect on page speed anyway. Where possible, try to load these scripts asynchronously (like analytics.js) or at least place them in the footer of the site if you can.
Hope that helps?
Paul
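For anyone who does decide to self-host copies of these scripts, here is a minimal sketch of what the caching directives Paul mentions might look like in an Apache .htaccess file, assuming mod_expires and mod_headers are enabled (the one-week lifetime is an illustrative choice, not a recommendation from the thread):

```apache
# Cache self-hosted JavaScript for one week (example lifetime).
# Kept deliberately short because third-party scripts can change
# upstream without notice.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>

# Equivalent explicit Cache-Control header for .js files
<IfModule mod_headers.c>
  <FilesMatch "\.js$">
    Header set Cache-Control "max-age=604800, public"
  </FilesMatch>
</IfModule>
```

The shorter the lifetime, the smaller the risk of serving a stale copy of a script that was updated upstream, which is the trade-off Paul describes.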
Related Questions
-
Subdomain vs Subdirectory - Specific Case: A big blog in a subdomain
Hi. First of all, I love Moz and have learned a lot about SEO by reading articles here. Thanks for all the knowledge I have received here. I have read all the articles about "Subdomain vs Subdirectory" in the Moz community and I have no doubt that subdirectories are the best option for a blog. But the company I work for now has a blog with more than 17,000 articles and 1,000 categories and tags, hosted on a subdomain. The website has a Domain Authority of 78 (I am working to improve these numbers) and the blog subdomain has the same (78). We have 2.7 million hits per month on the blog and 4.5 million hits per month on the site. I am advising the company to move the blog to subfolders inside the domain, but I'm finding resistance to the idea, because the amount of work involved in this change is enormous and there is still the fear of losing traffic. My questions are:
- Is there any risk of losing traffic with the number of articles we have?
- What do we probably gain if we change the blog structure to subfolders? Could we see increased authority for the domain? More traffic?
- How can I explain to my superiors that we would probably have increased traffic for our keywords?
- Is there any way to prove or test the gains from this change before we run it?
Thanks in advance.
Intermediate & Advanced SEO | Marcus.Coelho -
Website cache has been removed
Hi Team, I am facing an issue with the cache of the website; despite various research I haven't been able to find the solution, as the code seems OK to me. Can any one of you check and let me know why the home page and some of the product pages have been removed from the cache? See here: https://bit.ly/2Kna3PD Appreciate a quick response! Thanks
Intermediate & Advanced SEO | Devtechexpert -
Home page showing some other website in cache
My website (www.kamagrauk.com) is showing www.likeyoursaytoday.com in the Google cache, and that domain further redirects to http://kamagrauknow.com/
Problem:
1) info:kamagrauk.com shows www.likeyoursaytoday.com
2) cache:kamagrauk.com shows www.likeyoursaytoday.com
www.likeyoursaytoday.com copied content from kamagraoraljelly.me
Already checked/done:
1) Changed website hosting (new virtual private server)
2) Uploaded a fresh backup of the website
3) Checked header response (DNS perfect)
4) Checked language meta tag (no error)
5) Fetch function worked fine
6) Tried to remove the URL and re-added it
7) No errors in the sitemap
8) SSL all OK
9) No crawl errors
Nothing worked... I am trying to contact www.likeyoursaytoday.com but they are not responding. Today (23rd Feb) www.likeyoursaytoday.com went down, but our cache has been replaced by http://www.bagnak.com/ so it seems Google is not able to read our page. I am attaching screenshots showing that Google sees everything OK: blocked%20resources.png cache.png crawlerror.png robots%20test.png
Intermediate & Advanced SEO | Gauravbb -
Will disallowing URLs in the robots.txt file stop those URLs being indexed by Google?
I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs that these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index. There is no benefit to the end user in these image pages being indexed in Google. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help to remove the pages from the index? We don't want these pages to be found.
Intermediate & Advanced SEO | andyheath -
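One point worth clarifying about the question above: a robots.txt Disallow stops crawling but does not reliably remove URLs that are already in the index. The usual removal route is to serve a noindex signal while the URLs remain crawlable, then disallow them afterwards if desired. A sketch of how that might look in Apache config, assuming mod_headers is enabled (the gallery path is taken from the question; the pattern is illustrative):

```apache
# Serve an X-Robots-Tag noindex header for gallery pages so Google can
# crawl them, see the noindex, and drop them from the index. Only add
# the robots.txt Disallow after the pages have been deindexed, since a
# disallowed URL can no longer be crawled to discover the noindex.
<IfModule mod_headers.c>
  <If "%{REQUEST_URI} =~ m#^/catalog/product/gallery/#">
    Header set X-Robots-Tag "noindex"
  </If>
</IfModule>
```

This is a sketch of the general technique, not a confirmed answer from the thread.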
PDF Cached by Google, but not showing as link
The following pdf is cached by google: http://www.sba.gov/sites/default/files/files/REFERRAL%20LIST%20OF%20BOND%20AGENCIES_Florida.pdf However, OpenSiteExplorer is not listing any of the links as found in it. With such an authoritative site, I would think Google would value this, right? None of the sites listed rank well though and OpenSiteExplorer's inability to see the links makes me wonder if Google provides these sites any value at all. Is there any link juice or brand mention value here for Google?
Intermediate & Advanced SEO | TheDude -
Template Files .tpl versus .html files
We sell a large selection of insulation products and use template files (.tpl) to collect up-to-date information from a server-side database file that contains some 2,500 line items. When an HTML (.html) file is requested on the Internet, the 'example.tpl' file is accessed, the latest product and pricing information is retrieved, and the result is presented to the viewer as 'example.html'. My question: can the use of .tpl files negatively impact search engine acceptance?
Intermediate & Advanced SEO | Collie -
301 redirect changed Google's cached title tags?
Hi, This is a new one to me! I recently added some 301 redirects from pages that I've removed from my site. Most of them just redirect to my home page, whilst a few redirect to appropriate replacement pages. The odd thing is that when I now search my keywords, Google's SERP shows my website with a title that was on some of the old (now removed and redirected) pages. Is this normal? If so, how should I prevent this from happening? What is going on? The only reasons I set up the redirects were to collect any link juice from the old pages and prevent 404s. Should I remove the 301s? I fetched as Google and submitted, to see if that updates the tags (not been indexed yet). Any help would be appreciated. Kind Regards Tony
Intermediate & Advanced SEO | thephoenix25 -
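The redirects described in the question above are typically set up in .htaccess. A minimal sketch of both patterns mentioned, assuming mod_alias is available (paths are hypothetical examples, not from the thread):

```apache
# Redirect a removed page to its closest replacement (preferred
# where a replacement exists, as it preserves relevance)
Redirect 301 /old-page.html /replacement-page.html

# Catch-all 301 of a retired section to the home page
RedirectMatch 301 ^/retired-section/ /
```

One caveat worth noting: redirecting many unrelated removed pages to the home page can be treated by Google much like a soft 404, which may contribute to the kind of stale-title oddities described here, so page-to-page redirects are generally the safer pattern.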
Have you ever seen this 404 error: 'www.mysite.com/Cached' in GWT?
Google webmaster tools just started showing some strange pages under "not found" crawl errors. www.mysite.com/Cached www.mysite.com/item-na... <--- with the three dots, INSTEAD of www.mysite.com/item-name/ I have just 301'd them for now, but is this a sign of a technical issue? The site is php/sql and I'm doing the URL rewrites/301s etc in .htaccess. Thanks! -Dan EDIT: Also, wanted to add, there is no 'linked to' page.
Intermediate & Advanced SEO | evolvingSEO