How do I "leverage browser caching"?
-
Google is telling me to "leverage browser caching" and set a freshness factor of one week on images used on my site, http://www.1stclassdriving.co.uk.
Try as I may, I cannot find out how to do this. I'm running two sites - http://www.1stclassdriving.co.uk on a shared hosting package with Easyspace, and http://www.croydondrivingschool.co.uk with Fasthosts, both on Windows hosting with ASP scripting.
Can anyone point me to any tutorials, or guide me as to how to do this, please?
-
Hi
Leveraging browser caching speeds up your site, because common resources only have to be loaded once rather than every time someone loads a page on your site or revisits it. To leverage browser caching, I would suggest editing your web.config file if you are on an IIS server, or your .htaccess file if you are on an Apache server. Either way it just means adding an entry to one of these files, and shouldn't require your hosting company. I regularly work with both, in particular Apache servers, and I often drop in the standard code to leverage this and speed the site up. One other noticeable speed enhancement I would recommend is gzip compression, which can be set up in a similar manner. If you have any more questions about any of this, don't hesitate to give me a shout on here.
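On the Apache side, the standard code is usually just mod_expires and mod_deflate rules along these lines - a rough sketch rather than anything tailored to your site, and it assumes those modules are enabled on the host:

    # Browser caching: tell browsers to keep images and other static files for 1 week
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/jpeg "access plus 1 week"
      ExpiresByType image/png "access plus 1 week"
      ExpiresByType image/gif "access plus 1 week"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>

    # gzip compression for text-based responses
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

The IfModule wrappers just stop the site throwing an error if one of the modules isn't available on shared hosting.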
-
Dear Matt,
I don't get it...
Is the answer in the programming of the site (for example, through the web.config), or is it something to be done in IIS itself (which would mean the hosting company has to do it)?
Also, caching only helps the second time the same user loads the page, doesn't it?
Thanks
-
Glad I could help, Brian - and welcome to the community, by the way.
-
Brilliant - works fine - many thanks, Matt.
-
Hi Brian
I think you might find this helpful - http://stackoverflow.com/questions/6634302/how-to-leverage-browser-caching-at-asp-net-iis-7-5
and...
http://stackoverflow.com/questions/642954/iis7-cache-control
You should be able to do this by making a web.config file in your root folder (or editing it if you already have one), which is explained in the articles above.
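As a rough illustration only (not taken from your actual site), a minimal web.config that sets a one-week freshness window on static files - and switches on the gzip compression I mentioned earlier - would look something like this; the cacheControlMaxAge value is in days.hours:minutes:seconds, so 7.00:00:00 gives the one-week window Google is asking for:

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <system.webServer>
        <staticContent>
          <!-- Send Cache-Control: max-age=604800 (7 days) with static files such as images, CSS and JS -->
          <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
        </staticContent>
        <!-- gzip compression (needs the IIS compression modules installed on the server) -->
        <urlCompression doStaticCompression="true" doDynamicCompression="true" />
      </system.webServer>
    </configuration>

Bear in mind this relies on IIS 7 or later; if either host is still on IIS 6, the system.webServer section is ignored and the cache headers would have to be set through the hosting control panel (or by the host) instead.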