Leverage browser caching on Nginx
-
Hi there,
Does anyone have experience with leveraging browser caching on Nginx?
Every time I run Google's PageSpeed test, my site comes up with this as a high-priority issue that needs to be addressed.
I can see that it is mostly images that don't have an expiry date, so I tried putting the following into my conf file:
location ~* \.(jpg|jpeg|gif|css|png|js|ico)$ {
    expires max;
}
But this broke my page: all the elements were out of place and images were missing.
I also tried excluding css and js, but then all the images were still missing.
My site runs on Drupal, and I use APC for PHP to improve the site's load time.
Hope somebody might be able to help me out.
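For what it's worth, a regex `location` block like this takes over handling of every matching request, so static files stop going through whatever your main location does. A minimal sketch of a safer version, assuming a Drupal front controller behind a named location (the "@drupal" name is hypothetical; adjust to your own conf):

```nginx
# Escape the dot: without the backslash, "." matches any character.
location ~* \.(jpg|jpeg|gif|css|png|js|ico)$ {
    expires max;
    add_header Cache-Control "public";
    # Drupal modules such as ImageCache generate images on first request,
    # so let missing files fall through to the front controller instead
    # of returning 404 (assumes a named location that hands off to PHP):
    try_files $uri @drupal;
}
```

If images that Drupal generates on the fly were being 404'd by the static location, that would explain the missing images described above.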
-
Man, you have a difficult case... I'm not sure what is happening, but on my page I used this config:
<FilesMatch "\.(ico|jpg|jpeg|png|gif|swf|css|js)$">
    Header set Expires "Sun, 30 Apr 2090 20:00:00 GMT"
    Header set Last-Modified "Wed, 20 Feb 2012 09:00:00 GMT"
</FilesMatch>
You should try this, and also try clearing your browser cache.
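Note that the `FilesMatch`/`Header` directives above are Apache syntax (mod_headers) and will not work in an Nginx conf. A rough Nginx equivalent, setting a far-future expiry on the same extensions (the 30-day value is an example; pick what suits your assets):

```nginx
# "expires" makes nginx emit matching Expires and Cache-Control headers.
location ~* \.(ico|jpg|jpeg|png|gif|swf|css|js)$ {
    expires 30d;
}
```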
Related Questions
-
Leveraging "Powered by" and link spam
Hi all, For reference: The SaaS guide to leveraging the "Powered By" tactic. My product is an embeddable widget that customers place on their websites (see the example referenced in the link above). A lot of my customers have great domain authority (big brands, .gov's, etc.). I would like to use a "Powered By" link on my widgets to create high-quality backlinks. My question is: if I have identical link text on potentially hundreds of widgets, will this look like link spam to Google? If so, would randomly setting the link text on each widget to one of a few different phrases (to create some variation) avoid this? Hope this makes sense; thanks in advance.
Technical SEO | NoorHammad0 -
Page disappeared from Google index. Google cache shows page is being redirected.
My URL is: http://shop.nordstrom.com/c/converse Hi. The week before last, my top Converse page went missing from the Google index. When I "fetch as Googlebot" I am able to get the page and "submit" it to the index. I have done this several times and still cannot get the page to show up. When I look at the Google cache of the page, it comes up with a different page. http://webcache.googleusercontent.com/search?q=cache:http://shop.nordstrom.com/c/converse shows: http://shop.nordstrom.com/c/pop-in-olivia-kim Back story: As far as I know we have never redirected the Converse page to the Pop-In page. However the reverse may be true. We ran a Converse based Pop-In campaign but that used the Converse page and not the regular Pop-In page. Though the page comes back with a 200 status, it looks like Google thinks the page is being redirected. We were ranking #4 for "converse" - monthly searches = 550,000. My SEO traffic for the page has tanked since it has gone missing. Any help would be much appreciated. Stephan
Technical SEO | shop.nordstrom0 -
Pages appear fine in browser but 404 error when crawled?
I am working on an eCommerce website that has been written in WordPress, with the shop pages in E commerce Plus PHP v6.2.7. All the shop product pages appear to work fine in a browser, but 404 errors are returned when the pages are crawled. WMT also returns a 404 error when "Fetch as Google" is used. Here is a typical page: http://www.flyingjacket.com/proddetail.php?prod=Hepburn-Jacket Why is this page returning a 404 error when crawled? Please help!
Technical SEO | Web-Incite0 -
Cached pages still showing on Google
We noticed our QA site showing up on Google, so we blocked it in our robots.txt file. We still had an issue with Google crawling it, so we blocked the site from the public. Now Google is still showing a cached version from the first week of March. Do we just have to wait until Google tries to re-crawl the site to clear this out, or is there a better way to get these pages removed from the results?
Technical SEO | aspenchicago0 -
Can Google move up my ranking without caching it?
Is it possible that my site was last cached by Google on 9 Oct (when it was ranking #23), and today, on 29 Oct, it is ranking at position 3 for a keyword while still showing a cache date of 9 Oct? It hasn't been cached since 9 Oct; does that mean it hasn't been crawled since then? How can Google move up my ranking without re-caching it?
Technical SEO | Personnel_Concept0 -
301 redirect to 1 of 3 locations based on browser language? Is this ok?
Hi all, I'm taking over a site that has some redirect issues that need to be addressed, and I want to make sure this is done right the first time.

The problem: Our current setup allows both non-www and www pages. I'll address this with a proper rewrite so all pages will have www. Server info: IIS, and it runs PHP.

The real concern is that we currently run browser language detection at the root and then do a 302 redirect to /en, /ge or /fr. There is no page at www.matchware.com; it's an immediate redirect to a language folder. I'd like to make these 301 (permanent) redirects, but I'm not sure if a URL can have a 301 redirect that goes to 3 different locations. The site is huge and a site overhaul is not an option anytime soon. Our home page uses this:

<%
lang = Request.ServerVariables("HTTP_ACCEPT_LANGUAGE")
real_lang = Left(lang, 2)
'Response.Write real_lang
Select Case real_lang
    Case "en"
        Response.Redirect "/en"
    Case "fr"
        Response.Redirect "/fr"
    Case "de"
        Response.Redirect "/ge"
    Case Else
        Response.Redirect "/en"
End Select
%>

Here is a header response test:

HTTP Request Header
GET / HTTP/1.1
Host: www.matchware.com
Connection: close
User-Agent: Web-sniffer/1.0.37 (+http://web-sniffer.net/)
Accept-Charset: ISO-8859-1,UTF-8;q=0.7,*;q=0.7
Cache-Control: no-cache
Accept-Language: de,en;q=0.7,en-us;q=0.3
Referer: http://web-sniffer.net/

HTTP Response Header
Status: HTTP/1.1 302 Object moved
Connection: close
Date: Fri, 13 May 2011 14:28:30 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
Location: /ge
Content-Length: 124
Content-Type: text/html
Set-Cookie: ASPSESSIONIDQSRBQACT=HABMIHACEMGHEHLLNJPMNGFJ; path=/
Cache-control: private

To sum it up, I know a 302 is a bad option, but I don't know if a 301 is a real option for us, since the redirect can go to 1 of 3 pages. Any suggestions?
Technical SEO | vheilman -
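On the question of whether one URL can 301 to three different targets: nothing in HTTP forbids it; the status code and the Location header are independent, so the target can vary per request while the redirect stays permanent. The site above runs IIS/classic ASP, but purely as an illustration on a different stack, here is an nginx sketch of a language-dependent 301 (folder names mirror the setup above; the server block is hypothetical):

```nginx
# Map the start of Accept-Language to a language folder; default to /en.
map $http_accept_language $lang_folder {
    default  /en;
    ~*^fr    /fr;
    ~*^de    /ge;
}

server {
    listen 80;
    server_name www.matchware.com;

    # Permanent (301) redirect whose target varies per request.
    location = / {
        return 301 $lang_folder;
    }
}
```

Since the response depends on the Accept-Language request header, it is also worth sending `Vary: Accept-Language` so caches keep the variants apart.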
Bing Cache
How can you see which pages are cached by Bing? I'm basically looking for the Bing equivalents of these Google approaches: cache:domain.com and site:domain.com. Thanks, Tyler
Technical SEO | tylerfraser1