Implications of extending browser caching for Google?
-
I have been asked to leverage browser caching on a few scripts in our code.
- http://www.googletagmanager.com/gtm.js?id=GTM-KBQ7B5 (16 minutes 22 seconds)
- http://www.google.com/jsapi (1 hour)
- https://www.google-analytics.com/plugins/ua/linkid.js (1 hour)
- https://www.google-analytics.com/analytics.js (2 hours)
- https://www.youtube.com/iframe_api (expiration not specified)
- https://ssl.google-analytics.com/ga.js (2 hours)
The time beside each link is the cache expiration currently set by the owners of each script. I'm being asked to extend that time to 24 hours, and part of this task is to make sure doing so is a good idea. It would not be in our best interest to do something that disrupts the collection of data.
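(For reference, these expirations come from the Cache-Control / Expires response headers each file is served with; they can be checked with something like the following, shown purely as an illustration:)

```
# Sketch: inspect the caching headers served with one of the scripts above
curl -sI https://www.google-analytics.com/analytics.js | grep -iE "cache-control|expires"
```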
Some of what I'm seeing recommends keeping a local copy of these scripts, which would mean either missing updates from GA/GTM or creating a cron job to download any updates on a daily basis.
Another concern: would caching these scripts cause a delay or disruption in collecting data? That's an unknown to me, though it may not be to you.
There is also the concern that Google recommends not caching these files beyond the settings they provide.
Any help on this is much appreciated.
Do you see any issues/risks/benefits/etc. to doing this from your perspective?
-
Thanks, this is super helpful
-
You wouldn't disrupt the collection of data, but you would need to run a cron job to keep the local copy up to date. It is not recommended that you store Google Analytics locally, and honestly it would make little difference to your speed; it's more trouble than it's worth. Caching these files yourself is not recommended by Google for a reason.
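For what it's worth, a rough sketch of what self-hosting would involve, assuming an Apache server with mod_headers enabled (the paths below are placeholders):

```
# Hypothetical crontab entry: refresh the self-hosted copy of analytics.js every night at 03:00
0 3 * * * curl -sSf https://www.google-analytics.com/analytics.js -o /var/www/example.com/js/analytics.js
```

```
# Hypothetical .htaccess rule (needs mod_headers): serve the local copy with the requested 24-hour lifetime
<FilesMatch "^analytics\.js$">
  Header set Cache-Control "public, max-age=86400"
</FilesMatch>
```

Every page would also have to reference the local /js/analytics.js instead of the Google-hosted URL, which is exactly the maintenance overhead described above.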
Although, if your page speed is healthy, you really have nothing to worry about. If your concern is just trying to get 100/100 on the page speed tests, I have heard that this does the trick:
https://developers.google.com/speed/pagespeed/module/filter-make-google-analytics-async#description
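(That link describes the make_google_analytics_async filter in Google's PageSpeed Module. As a sketch, enabling it on an Apache server that already has mod_pagespeed installed looks roughly like this:)

```
# Apache config sketch, assuming mod_pagespeed is installed
ModPagespeed on
ModPagespeedEnableFilters make_google_analytics_async
```

The filter rewrites pages that load ga.js synchronously so that Analytics loads asynchronously instead; it doesn't host or cache anything locally.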
Danny
Related Questions
-
Does Google Delay or Graduate SERP Changes?
You can request re-indexing of a single page via Google Search Console. It would seem you could use this feature to experiment with on-page changes and watch the resulting rank change to determine which changes have the most effect. For the sake of this thread, let's temporarily forget that the relative importance of various on-page factors has already been reverse engineered to a degree, so we already have a general idea to some extent. It would seem to me that if I were Google, I would introduce either a random delay period or temper the rank change after re-indexing. What I mean by the latter is: say a re-index takes a page from position 20 to 10. If the change is 'tempered', on day 2 after re-indexing the page might be at 18, on day 5 at 16, and on day 7 still at 16, until it reaches the actual "real" rank. Both the delay and the tempering of rank changes would make it more difficult to reverse engineer the relative importance of on-page factors. Or does Google realize there are large SEO firms, doing SEO over several years for many sites, that can examine aggregate data to determine these factors, so Google doesn't delay (aka sandbox) or temper rank changes after manual re-indexing?
-
New Website SEO Implications
Hi Moz Community, a client of mine has launched a new website. The new website is well designed, mobile friendly, fast loading and offers a far better UX than the old site. It has similar content but is 'less wordy'. The old website was tired, slow and not mobile responsive, but it still ranked well, and the domain has market-leading authority and link metrics. Since the launch, the rankings for virtually every keyword have plummeted; even terms that previously ranked #1 have disappeared to page 3 or 4. The new pages have different URLs (301s from the old URLs are working fine) and still score the same 98% (using the Moz page optimiser tool). Is it usual to experience some short-term pain, or is this rankings drop an indication that something else is missing? My theory is that the new URLs are being treated like new pages, and that those new pages don't have the engagement data which is used for ranking. Thus, despite having the same authority as the old pages, as far as user data is concerned they are new pages and therefore not ranking well - yet. That theory would make logical sense, but I'm hoping some experts here can help. Any suggestions welcome. Here's a quick checklist of things I have already done:
- Complete 301 redirect list
- New sitemap
- Submitted to Search Console
- Created internal links from within their large blog
- Optimised all the new pages (img alts, H1s etc.)
Extra info:
- Platform changed from WordPress to ExpressionEngine
- Target pages now on level 3, not level 2 (extra subfolder used)
- Fewer words used (average word count per page down from 400+ to 250)
Thanks in advance 🙂
-
Interest in optimising Google crawl
Hello, I have an ecommerce site with all pages crawled and indexed by Google, but some pages are reachable at multiple URLs, for example www.sitename.com/product-name.html and www.sitename.com/category/product-name.html. There is a canonical tag on all of these pages pointing to the simplest URL, so Google indexes only one version. The duplicate pages are therefore not indexed, but Google still crawls them. My question is: is there any benefit in stopping Google from crawling these duplicate pages, or not? My point is that Google crawls around 1,500 pages a day on my site, but there are only 800 real pages, and they are all indexed. There is no particular issue at the moment, so is it worth changing anything? Thanks
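(As a sketch of the setup described, using the hypothetical URLs from the question, the duplicate category URL would carry a canonical tag pointing at the simple URL:)

```
<!-- On www.sitename.com/category/product-name.html (the duplicate URL) -->
<link rel="canonical" href="http://www.sitename.com/product-name.html">
```

Worth remembering that if Google were blocked from crawling the duplicate URLs, it could no longer see that canonical hint on them.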
-
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people who set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. The site has lots of junk, but all of that junk was in the initial backup, i.e. before 1st June 2012. So, by removing all the mixed content prior to that date, we can have pure articles starting from 1st June 2012. Therefore:
- My dynamic sitemap now contains only articles with a release date between 1st June 2012 and now.
- Any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag, instead of the actual content of the article.
The question is how I can remove all this junk from the Google index as fast as possible, given that it is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal via https://www.google.com/webmasters/tools/removals. The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove.
- Should I put the articles back in the sitemap so that search engines crawl the sitemap and see all the 404s? I believe this is very wrong; as far as I know it will cause problems, because search engines will try to access non-existent content that the sitemap declares as existing, and will report errors in Webmaster Tools.
- Should I submit a "deleted items" sitemap using the <expires> tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think this is for custom search engines only, not for the generic Google search engine.
The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but rather ugly GET parameters, so a folder-based pattern is impossible, since all articles (removed junk and real articles alike) are of the form http://www.example.com/docid=123456. So, how can I bulk remove all the junk from the Google index, relatively fast?
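(As a sketch of the mechanism described above: the custom 404 template served for pre-June-2012 articles would carry the noindex hint, and it matters that the response is a genuine 404 status rather than a "soft 404" returned with status 200:)

```
<!-- Hypothetical head of the custom 404 template served for articles released before 1st June 2012 -->
<!-- The HTTP response itself should carry a real 404 (or 410) status code, not 200 -->
<meta name="robots" content="noindex">
```
-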
Sitemap Migration - Google Guidelines
Hi all. I saw the following text on support.google.com: "Create and save the Sitemap and lists of links: a Sitemap file containing the new URL mapping; a Sitemap file containing the old URLs to map; a list of sites with links to your current content." I would like to better understand the part about "a list of sites with links to your current content".
Question 1: Do I need three sitemaps simultaneously?
Question 2: If yes, should I put these sitemaps in the Search Console of the new website?
Question 3: Or is Google just giving context about how to handle the migration, and do I really only need sitemaps for the new site? What exactly is Google talking about? Thanks for any advice.
-
Google Penalty or Not?
One of the sites I work with got this message:
"http://www.mysite: Unnatural inbound links (June 27, 2013). Google has detected a pattern of artificial or unnatural links pointing to your site. Buying links or participating in link schemes in order to manipulate PageRank are violations of Google's Webmaster Guidelines. As a result, Google has applied a manual spam action to mysite.com/. There may be other actions on your site or parts of your site."
But when I go to Manual Actions it says: "No manual webspam actions found." So which is it? I have been doing link removal, but now I am confused about whether I need to file a reconsideration request or not.
-
301 redirect changed Google's cached title tags?
Hi, this is a new one to me! I recently added some 301 redirects from pages that I've removed from my site. Most of them just redirect to my home page, whilst a few redirect to appropriate replacement pages. The odd thing is that when I now search my keywords, Google's SERP shows my website with a title that was on some of the old (now removed and redirected) pages. Is this normal? If so, how should I prevent it from happening? What is going on? The only reason I set up the redirects was to collect any link juice from the old pages and prevent 404s. Should I remove the 301s? I did a Fetch as Google and submitted, to see if that updates the tags (it hasn't been re-indexed yet). Any help would be appreciated. Kind regards, Tony
-
Is Google Webmaster Tools Accurate?
Is Google Webmaster Tools data completely inaccurate, or am I just missing something? I noticed a recent surge in 404 errors, detected three days ago (3/6/11), from pages that have not existed since November 2011. They are links to tag and author archives from pages initially indexed in August 2011. We switched to a new site in December 2011 and created 301 redirects from categories that no longer exist to new categories. I am a little perplexed, since the Google sitemap test shows no 404 errors and neither does the SEOmoz crawl test, yet under GWT site diagnostics these errors, all 125 of them, just showed up. Any thoughts or insights? We've worked hard to ensure a smooth site migration and now we are concerned. -Jason